Bluetooth device that simulates an Xbox joystick from wheelchair interfaces
Chameleon
Independent Project
ROS
C++
Computer Vision
Mechatronics
Programs a TurtleBot to camouflage itself like a chameleon
Connect-4
Group Project
ROS
Python
Tkinter
Computer Vision
Minimax
Programs Sawyer to play Connect 4 against a human player
youBot planning
Independent Project
Python
CoppeliaSim
Motion Planning
Control System
Plans a trajectory for the youBot to pick and place a block at a specified location
Jack in the box
Independent Project
Python
NumPy
Dynamics
Simulates a jack in a box
Gear-train design
Group Project
CAD
Siemens NX
Design elements
Designs a compound gear-train system for power transmission
Walk distance tracking
Group Project
MATLAB
Mobile device
Android
Tracks the walking distance of a mobile device's carrier through sensor data analysis
JD.com crawler
Independent Project
Python
Django
MySQL
Haystack
HTML
Crawls mobile phone information from JD.com, builds a database, and provides a search engine
Laser Guitar
Group Project
Arduino
Mechatronics
Builds a simple laser guitar from laser modules, light-to-frequency converters, and a buzzer
Robotic Arm
Group Project
CAD
Arduino
Mechatronics
Builds a robotic arm from scratch
PāPēi
Independent Project
C#
Python
Viewer for CIE past papers, crawled and sorted
Tim Multi-Interfacing Device
Apr 2021 - Dec 2021, Independent project
Northwestern Master Final Project
Introduction
Tim Multi-Interfacing Device was inspired by the Freedom Wing adapter designed by AbleGamers, which enables disabled video game players to use their electric wheelchair interfaces as an Xbox controller. The Freedom Wing adapter serves as the connection between the wheelchair and Microsoft's Xbox Adaptive Controller, which takes any personal input device and converts it into a standard Xbox controller. Freedom Wing is open source and reproducible at a cost of around 35 dollars, which is amazing.
Based on the Freedom Wing adapter, the idea of my project is to replace the wired connection with Bluetooth, so that once my device is attached to the wheelchair, the user only needs to turn its power on and off instead of plugging and unplugging a bunch of wires. My device is also part of a bigger plan designed by Argallab which involves collecting interface data from users. The device will collect data to be used by an Android app for future studies.
Version 1
In the first version of the design, I utilized the Raspberry Pi's built-in Bluetooth. I was able to pass the USB mouse signals through the Raspberry Pi's built-in Bluetooth. Then, by adding a udev rule to the Linux computer that invokes a script calling Xboxdrv, I managed to map the mouse buttons and axes to a standard Xbox gamepad, which I could use to play Xbox-supported games as shown in the demo video.
Wrote a udev rule that triggers a bash script when a specific Bluetooth mouse connection is detected.
Coded a bash script that calls Xboxdrv to map the mouse buttons and axes to a standard Xbox controller (a sketch of this setup follows below).
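As a rough sketch of this setup (the device name, file paths, event number, and the exact Xboxdrv mapping flags below are illustrative assumptions, not the project's actual files): the udev rule matches the mouse by name and launches the mapping script, which hands the event device to Xboxdrv.

# /etc/udev/rules.d/99-ble-mouse.rules -- illustrative path and device name
SUBSYSTEM=="input", ATTRS{name}=="BLE Mouse", ACTION=="add", RUN+="/home/pi/map_mouse.sh"

#!/bin/bash
# map_mouse.sh -- illustrative sketch; the event number and the mapping flags
# depend on the device (check your xboxdrv version's documentation)
xboxdrv --evdev /dev/input/event3 \
        --evdev-relmap REL_X=x1,REL_Y=y1 \
        --evdev-keymap BTN_LEFT=a,BTN_RIGHT=b \
        --mimic-xpad --silent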
Version 2
In the second version of design, I added an additional Bluetooth Low Energy (BLE) module to the Raspberry Pi (Model 3B+).
The BLE module I used is the Adafruit Feather nRF52840 Express.
With the help of the existing nRF52 Arduino package, I was able to capture the USB mouse signals, pass them on to the nRF52840 through the serial port using a Python script, and have the nRF52840 run an Arduino sketch that sends the mouse signals to a Linux laptop over BLE.
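A minimal sketch of the forwarding script's shape (the device path, serial port, baud rate, and the line-based frame format are my assumptions, not the project's exact code):

# Forward USB mouse events to the nRF52840 over serial.
import serial                                  # pyserial
from evdev import InputDevice, ecodes

mouse = InputDevice('/dev/input/event0')       # USB mouse (path varies by system)
link = serial.Serial('/dev/ttyACM0', 115200)   # Feather nRF52840 over USB serial

for event in mouse.read_loop():                # blocks until the next input event
    if event.type in (ecodes.EV_REL, ecodes.EV_KEY):
        # pack type/code/value into a simple newline-delimited frame
        link.write(f'{event.type},{event.code},{event.value}\n'.encode())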
With further understanding of the nRF52 Arduino library, I tried to write a custom class to send a Bluetooth gamepad report, but failed. The developers recently published an update that includes a Bluetooth gamepad example, but it ends up failing as well (it might have something to do with the Bluetooth boot protocol not supporting gamepads at all).
Fell back to the working Bluetooth mouse code, and wrote a udev rule that triggers a bash script when a specific Bluetooth mouse connection is detected.
Coded a bash script that calls Xboxdrv to map the mouse buttons and axes to a standard Xbox controller.
Problems
Linux support for BLE (and Bluetooth in general) is unstable and still under improvement. Different updates and versions of the Bluetooth stack on different devices can easily mess up the connection.
Accumulating delay occurs due to the multiple layers of communication (USB to serial to BLE), especially when using the nRF52840 module.
Future Works
While some of the personal wheelchairs at Argallab use the USB mouse interface, the others use different ones (R-Net, Omni, etc.). With reference to the project can2RNET by Stephen Chavez, I should be able to read the button and axis status information from the R-Net connection and send it out as generic Bluetooth mouse signals.
Get rid of the Xboxdrv layer on Linux, which involves a significant amount of work to construct a generally recognizable Bluetooth gamepad descriptor.
Add Android app joystick support.
Additional data collection (most should be done by the Android app) if needed.
Pack everything up as an open-source, easily reproducible project.
Chameleon
Jan 2021 - Mar 2021, Independent project
Northwestern ME499 MSR Winter Project
Introduction
Inspired by the chameleon, I had the idea of adding a camouflage feature to a TurtleBot, named CHAMELEON, so that it can move in front of an image without the audience noticing it. The robot is based on the TurtleBot3 Burger, with an LED screen (large enough to cover the body) on the front and a wide-angle camera on the back. The idea is that when the robot moves in front of the image, it processes the part of the image captured by the camera and puts it on the front LED screen, so that when someone looks from the front, the robot is 'invisible'.
The camouflage feature is built on top of the TurtleBot Burger's solid driving functionality. Since the TurtleBot only needs to move in the forward/backward direction, a ROS node is set up to constantly publish commanded velocities to the TurtleBot to make it move forward or backward at a desired speed (0.001 m/s). The speed is chosen to allow sufficient time for the computer vision algorithm to run between matrix updates without the audience seeing the display lag. A sketch of the publisher node follows below.
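A minimal sketch of such a publisher, assuming the standard /cmd_vel topic (the node name and the 10 Hz rate are illustrative):

# Publish a constant forward velocity to the TurtleBot.
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('chameleon_driver')
pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
rate = rospy.Rate(10)                # publish at 10 Hz

cmd = Twist()
cmd.linear.x = 0.001                 # forward at 0.001 m/s; negate for backward

while not rospy.is_shutdown():
    pub.publish(cmd)
    rate.sleep()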
Mechatronics
The TurtleBot does not have a built-in camera module, so an Arducam 5MP camera module is added to the Raspberry Pi 3B+ used by the TurtleBot to capture images at the back.
Since the TurtleBot Burger has a height of 192 mm, I chose to use the 64x64 LED matrix (160 mm x 160 mm) along with the RGB Matrix Bonnet as the front display module for the CHAMELEON. The LED matrix can cover most of the TurtleBot, leaving only part of the LiDAR at the top and part of the wheels at the bottom visible to the audience.
LED Matrix
The LED Matrix is initialized with the following parameters:
Hardware GPIO mapping: adafruit-hat (compatible with the Bonnet I used)
Refresh rate: >= 400 Hz
Brightness level: 5% for the grassland & desert demos; 80% for the chameleon demo
The offscreen canvas is enabled to allow drawing pixels on the offscreen canvas and swapping canvases between refreshes, which provides a more stable display.
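The display code in this project is C++, but the same initialization can be sketched with the matrix library's Python bindings (a sketch assuming the rpi-rgb-led-matrix library commonly used to drive this Bonnet, whose option names appear below):

# Initialize a 64x64 matrix on the Adafruit Bonnet and draw via an offscreen canvas.
from rgbmatrix import RGBMatrix, RGBMatrixOptions

options = RGBMatrixOptions()
options.rows = 64
options.cols = 64
options.hardware_mapping = 'adafruit-hat'   # matches the RGB Matrix Bonnet
options.brightness = 5                      # 5% for the landscape demos

matrix = RGBMatrix(options=options)
canvas = matrix.CreateFrameCanvas()         # offscreen canvas

canvas.SetPixel(10, 10, 255, 0, 0)          # draw offscreen first...
canvas = matrix.SwapOnVSync(canvas)         # ...then swap on vertical sync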
Computer Vision
The computer vision part of the project relies on the OpenCV (cv2) C++ module. Once the camera takes the picture at the back, the picture is cropped based on a linear perspective index
$$\text{perspective index} = \frac{\text{pose}_{end} - \text{pose}_{start}}{\text{duration}},$$
which compensates for the change of the audience's perspective as the robot moves. By applying the Hough Circle Transform to the cropped image, the colors of the pixels corresponding to the recognized circles are set and drawn on the offscreen canvas, which will be swapped onto the matrix at the next vertical synchronization.
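A hedged Python sketch of the circle-detection step (the project uses the C++ API; the file name and the Hough parameters here are illustrative assumptions):

# Detect circles in the cropped back-camera image and sample their colors.
import cv2

frame = cv2.imread('back_camera.jpg')             # captured image (illustrative path)
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
gray = cv2.medianBlur(gray, 5)                    # suppress noise before Hough

circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1, minDist=8,
                           param1=100, param2=20, minRadius=2, maxRadius=10)
if circles is not None:
    for x, y, r in circles[0]:
        b, g, rc = frame[int(y), int(x)]          # sample the circle's center color
        # canvas.SetPixel(...) would place this color on the offscreen canvas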
Demonstration
The CHAMELEON drives in two different scenarios:
One pixelated chameleon poster at the back
Two pixelated landscape posters (grassland & desert) at the back
Future Works
The screen flushing performance shown in the demo might be due to the computer vision algorithm competing with the Matrix Bonnet for the memory bus. The issue is expected to be solved by improving the algorithm.
The current implementation shows the concept of basic camouflage, but is limited by the audience's perspective. If the viewer sees the CHAMELEON from the left/right, it is no longer camouflaged. Some potential solutions to the problem are:
Put more screens around the TurtleBot to cover every perspective.
Add a sensor to detect the location of the audience, then rotate and update the screen accordingly.
The 64x64 LED matrix used is pixelated, so the camouflage effect will look different when applied to real-world scenes instead of a pixelated background image. Using a screen with a higher resolution could potentially solve the problem.
Light sensors could be added to the project to better accommodate the light coming from the surrounding environment, which affects how the audience sees the poster at the back.
Github Page
Connect-4
Oct 2020 - Dec 2020, Group project with Peter Sauer, Bill Bi, Juntao He
Northwestern ME495 Embedded Systems Final Project
Introduction
This project is about utilizing a Sawyer robot arm to play the famous board game Connect 4 against a human player. As the two players take turns placing pieces, the first who vertically, horizontally, or diagonally connects four of their color wins the game.
We chose to build our project using the Sawyer robot arm available in the Center for Robotics and Biosystems at the McCormick School of Engineering. Since Connect 4 only requires the player to pick and place pieces into the slots on top of the game board, Sawyer is a good choice with its powerful arm.
Computer Vision
Sawyer is equipped with cameras in its head and arm, which we used for our computer vision tasks. To increase the accuracy of the results, we used a combination of Hough Line Transforms and AprilTags.
The first task is to read the board and the pieces inside using the head camera. The resulting state of the board is stored in a 2D array for the game-playing algorithm. The second task is to locate the slot in which the AI wants to put its piece on its turn, using the arm camera. A sketch of the board-reading step follows below.
(Contributed by Peter Sauer & Bill Bi)
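A hedged Python sketch of the board-reading step (the 6x7 grid is Connect 4's real size; the thresholds and the color-classification step are illustrative assumptions):

# Find the board grid with a Hough Line Transform, then classify each cell.
import cv2
import numpy as np

img = cv2.imread('head_camera.jpg')               # illustrative path
edges = cv2.Canny(cv2.cvtColor(img, cv2.COLOR_BGR2GRAY), 50, 150)

# probabilistic Hough Line Transform to locate the board's grid lines
lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                        minLineLength=100, maxLineGap=10)

# with the grid located, sample each of the 6x7 cells and classify its color
board = np.zeros((6, 7), dtype=int)               # 0 = empty, 1 = red, 2 = yellow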
Artificial Intelligence
We implemented the Minimax algorithm to build a perfect game strategy for the AI; a minimal sketch follows below.
(Contributed by Juntao He & me)
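A self-contained sketch of plain minimax over Connect 4 columns (the project's actual evaluation and win detection are not reproduced; score() below is a placeholder):

# Plain minimax; player 1 maximizes, player 2 minimizes.
ROWS, COLS = 6, 7

def legal_moves(board):
    # columns whose top cell is still empty
    return [c for c in range(COLS) if board[0][c] == 0]

def apply_move(board, col, player):
    new = [row[:] for row in board]
    for r in range(ROWS - 1, -1, -1):      # pieces fall to the lowest empty row
        if new[r][col] == 0:
            new[r][col] = player
            break
    return new

def score(board):
    # placeholder static evaluation; a full version also detects four-in-a-row
    return 0

def minimax(board, depth, maximizing):
    moves = legal_moves(board)
    if depth == 0 or not moves:
        return score(board), None
    best_move = None
    if maximizing:
        best = float('-inf')
        for move in moves:
            value, _ = minimax(apply_move(board, move, 1), depth - 1, False)
            if value > best:
                best, best_move = value, move
    else:
        best = float('inf')
        for move in moves:
            value, _ = minimax(apply_move(board, move, 2), depth - 1, True)
            if value < best:
                best, best_move = value, move
    return best, best_move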
Simulation
Since our access to the lab was heavily limited by the COVID pandemic, we used the tk (Tkinter) module in Python to build a simulation of the full gameplay to allow easier testing.
(Contributed by me)
Github Page
youBot Planning
Nov 2020 - Dec 2020, Independent Capstone project
Northwestern ME449 Robotics Manipulation
Introduction
This project is a piece of software, written in Python, that plans a trajectory for the end-effector of the youBot mobile manipulator (a mobile base with four mecanum wheels and a 5R robot arm), performs odometry as the chassis moves, and performs feedback control to drive the youBot to pick up a block at a specified location, carry it to a desired location, and put it down.
A well-tuned feedforward-plus-P controller running different tasks.
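For reference, the feedforward-plus-PI control law in the Modern Robotics formulation this course follows (shown as a sketch; a pure P controller takes $K_i = 0$):

$$\mathcal{V}(t) = [\mathrm{Ad}_{X^{-1}X_d}]\,\mathcal{V}_d(t) + K_p\,X_{err}(t) + K_i \int_0^t X_{err}\,dt, \qquad [X_{err}] = \log\left(X^{-1}X_d\right),$$

where $X$ is the current end-effector configuration, $X_d$ the desired one, and $\mathcal{V}_d$ the feedforward twist.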
Github Page
Jack in the Box
Nov 2020 - Dec 2020, Independent Final project
Northwestern ME314 Machine Dynamics
Introduction
This project is a physics simulation of a jack in a box, written in Python. The box is shaken by an external force. The jack falls due to gravity, hits the box, and bounces back.
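As a sketch of the underlying formulation (the standard constrained Euler-Lagrange setup used in this course; the project's exact Lagrangian and impact constraints are not reproduced here), the free motion satisfies

$$\frac{d}{dt}\frac{\partial L}{\partial \dot{q}} - \frac{\partial L}{\partial q} = F_{ext}, \qquad L = T - V,$$

and each jack-wall impact is resolved with the impact update

$$\left.\frac{\partial L}{\partial \dot{q}}\right|^{+} - \left.\frac{\partial L}{\partial \dot{q}}\right|^{-} = \lambda\,\frac{\partial \phi}{\partial q}, \qquad H^{+} = H^{-},$$

where $\phi(q) = 0$ is the impact constraint and $H$ is the Hamiltonian.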
Gear-train Design
Oct 2020 - Dec 2020, Group project with Malav Patel, Chengjian Tao
Northwestern ME315 Design Element
Introduction
This project is to design a compound three-shaft gear-train system for transmitting power from shaft A to shaft C with speed reduction. Each member is responsible for one shaft system. Our team considered the best part arrangements for the most effective design in terms of compact structure, stiffness, power density, and cost. The designs and selections of gears, shafts, and bearings are supported by detailed analyses, while other components (spacers, seals, end caps, etc.) simply come from catalog selection and geometric design without calculation.
The shaft systems are modeled using Siemens NX.
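For context, the overall speed reduction of a compound train is the product of its stage ratios (a standard relation, not a figure from our report):

$$\frac{\omega_{A}}{\omega_{C}} = \prod_{\text{stages}} \frac{N_{driven}}{N_{driving}},$$

where $N$ is the tooth count of each gear in a meshing pair.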
Walk Distance Tracking
Apr 2018 - May 2018, Group project with Junnun Safoan
UIUC ECE498 Mobile Computing and Applications
Introduction
This project is about utilizing accelerometer, gyroscope, and magnetic field data collected from a mobile device carried by someone walking randomly, and tracking the walking distance of the device carrier.
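The project's analysis was done in MATLAB; as a hedged Python illustration of one common approach (step detection on the accelerometer magnitude, with distance estimated as steps times an assumed stride length; the sampling rate, threshold, and stride length are all assumptions):

# Estimate walking distance by counting peaks in the accelerometer magnitude.
import numpy as np
from scipy.signal import find_peaks

fs = 50.0                            # samples per second (assumed)
acc = np.load('accel_xyz.npy')       # shape (n, 3), hypothetical recording
mag = np.linalg.norm(acc, axis=1)    # magnitude removes orientation dependence
mag -= mag.mean()                    # remove the gravity offset

# one step per peak; enforce a plausible maximum step rate (~3 steps/s)
peaks, _ = find_peaks(mag, height=1.0, distance=int(fs / 3))

stride = 0.7                         # meters per step, assumed average
print(f'distance ~ {len(peaks) * stride:.1f} m')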