Miguel Simão presents the results of his JVSRP project: an inertial sensing system to assist EMG-based gesture recognition devices. He built a body-centered system using three IMUs and a kinematic model to determine arm orientation. After a one-click calibration, artificial neural networks operating on the rotation matrices recognized arm postures with over 99.5% accuracy. Future work includes improving gyro drift correction in zero gravity and fully integrating the system with the EMG-based BioSleeve.
1. Assisting EMG-based Gesture
Recognition Devices with Inertial Sensing
MIGUEL ÂNGELO SIMÃO
7/31/2015 JVSRP FINAL PRESENTATION
Portugal
JVSRP
February to July 2015
2. Presentation Outline
Introduction
- Background
- The project
Development
- Stage 1: using sensor raw data
- Stage 2: using a kinematic model
Conclusion
- Future work
- Final remarks
- Demo
3. Background
• Currently an EU PhD student at the UC and ENSAM, France
• Graduated in July ‘14: MSc in Mechanical Engineering, University of Coimbra, Portugal
• Human-machine interfaces targeted at smart manufacturing
4. University of Coimbra
5. Previous Work
6. JPL’s BioSleeve
• Based on electromyography (EMG) signals
• Hand gesture recognition
• Supervised control
• But it doesn’t account for arm posture -> inertial sensing (IMUs)
7. Motion Capture Hardware
Arduino UNO R3
• Processing unit
• I2C interface (digital)
• Serial communication with a computer
SparkFun Multiplexer
• Allows multiple sensors with the same I2C address to be used
SparkFun 9 DOF IMU (x2)
• Gyroscope
• Magnetometer
• Accelerometer
• I2C interface only
8. Gesture Recognition with Sensor Data
Motion-Based Segmentation
[Figure: per-DOF sensor readings over frames; a sliding window w yields a binary motion signal (0/1).]
Motion segments can be either movement epenthesis or a dynamic gesture (DG).
Pauses can be either a static gesture or a DG ending.
The two segment types are handled by parallel classification models; a sketch of the motion detector follows.
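The detector itself is a sliding-window threshold method (see the Editor's Notes). Below is a minimal Python sketch of that idea; the original work used Matlab, so the function name, array layout, and parameter values are illustrative assumptions only.

```python
# Hypothetical sketch of the sliding-window threshold segmentation.
import numpy as np

def detect_motion(readings, thresholds, window):
    """readings: (n_frames, n_dofs) sensor data; returns a 0/1 motion signal."""
    # Feature: absolute first derivative of every variable.
    deriv = np.abs(np.diff(readings, axis=0, prepend=readings[:1]))
    # A frame is "moving" if any feature exceeds its threshold.
    moving = (deriv > thresholds).any(axis=1)
    # A window counts as motion if any frame inside it is moving,
    # which ensures a quick response to movement.
    motion = np.zeros(len(moving), dtype=int)
    for t in range(len(moving)):
        lo = max(0, t - window + 1)
        motion[t] = int(moving[lo:t + 1].any())
    return motion
```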
9. Gesture Recognition with Sensor Data
Motion Detection
Features: first derivatives of all the variables.
Open questions: which thresholds, and which window size?
Treated as an optimization problem: run the sensor data through the motion detector, compare its output against the ground truth, and minimize the difference, as in the sketch below.
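A brute-force version of this optimization, reusing the `detect_motion` helper from the previous sketch. The notes say the actual selection was done in a spreadsheet, so the grid search here is only a stand-in; the grids themselves are assumptions.

```python
# Minimal sketch: pick thresholds and window size by minimizing the
# disagreement between detector output and ground truth.
import itertools
import numpy as np

def fit_detector(readings, ground_truth, threshold_grid, window_grid):
    best, best_err = None, np.inf
    for thr, win in itertools.product(threshold_grid, window_grid):
        out = detect_motion(readings, np.full(readings.shape[1], thr), win)
        err = np.mean(out != ground_truth)  # the difference to minimize
        if err < best_err:
            best, best_err = (thr, win), err
    return best, best_err
```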
10. Gesture Recognition with Sensor Data
Motion Detection
What it actually looks like:
[Figure: ground truth vs. the system output of the motion detector.]
11. Gesture Recognition with Sensor Data
Gesture Classification: Library
12. Gesture Recognition with Sensor Data
Gesture Classification: Supervised Training
Classification model: artificial neural networks (ANN)
Features: acceleration and magnetic-field differences between consecutive sensors, discretized into 5 classes
Training samples: 20 per gesture
K-fold cross-validation: 4 folds
[Chart: average error over 10 runs vs. number of hidden-layer neurons (0-20); see the training sketch below.]
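A hedged sketch of this training setup: discretized features, an ANN whose hidden-layer size is swept, and 4-fold cross-validation averaged over 10 runs to account for random weight initialization. scikit-learn stands in for the original (unspecified) neural network toolbox.

```python
# Sketch: average misclassification rate for a given hidden-layer size.
import numpy as np
from sklearn.model_selection import KFold
from sklearn.neural_network import MLPClassifier

def cv_error(features, labels, n_hidden, n_folds=4, n_runs=10):
    """Average error over folds and random restarts, in percent."""
    errors = []
    for run in range(n_runs):  # average out random initialization
        folds = KFold(n_folds, shuffle=True, random_state=run)
        for train, test in folds.split(features):
            net = MLPClassifier(hidden_layer_sizes=(n_hidden,),
                                max_iter=1000, random_state=run)
            net.fit(features[train], labels[train])
            errors.append(np.mean(net.predict(features[test]) != labels[test]))
    return 100.0 * np.mean(errors)  # percent, as in the chart above
```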
13. Body-Centered System
Problem:
- Arm’s orientation in the trunk reference frame?
Challenges:
- Adding a third IMU
- No precise placement of the IMUs
- Creating a kinematic model
14. Hardware Layout
[Photos: previous iteration vs. new iteration of the hardware, showing the 3D-printed clothing clipping mechanism.]
15. Software
Microcontroller: Teensy
Orientation: SparkFun’s AHRS firmware (DCM algorithm @ 50 Hz), integrating the gyroscope’s angular velocities
Drift correction:
- Magnetometer for yaw
- Accelerometer for pitch and roll
Output: Euler angles (a simplified sketch of the idea follows)
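A greatly simplified complementary-filter sketch of this drift-correction idea. The real firmware implements the full DCM algorithm with feedback terms, so the blend gain `ALPHA` and the direct Euler-rate approximation below are illustrative assumptions; only the 50 Hz rate comes from the slide.

```python
# Sketch: gyro integration with accelerometer/magnetometer drift correction.
import numpy as np

DT, ALPHA = 1.0 / 50.0, 0.02  # 50 Hz sample period; illustrative blend gain

def update_euler(euler, gyro, accel, mag_yaw):
    """euler = (roll, pitch, yaw) in rad; gyro in rad/s; accel in g units."""
    # 1) Propagate orientation by integrating the gyroscope rates.
    roll, pitch, yaw = euler + gyro * DT
    # 2) The accelerometer (gravity vector) gives absolute pitch and roll.
    acc_roll = np.arctan2(accel[1], accel[2])
    acc_pitch = np.arctan2(-accel[0], np.hypot(accel[1], accel[2]))
    roll += ALPHA * (acc_roll - roll)
    pitch += ALPHA * (acc_pitch - pitch)
    # 3) The magnetometer acts as a compass to correct yaw drift.
    yaw += ALPHA * (mag_yaw - yaw)
    return np.array([roll, pitch, yaw])
```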
16. Software
Euler angles are streamed from the Teensy to PC/Matlab over serial (USB).
Sampling module:
• Runs whenever the serial terminator (‘\n’) is reached (@ 50 Hz)
• Decodes the serial sentence
• Calculates rotation matrices
• Calculates calibrated transformation matrices
GUI refresh module:
• Triggered by a Matlab timer (period set by the user)
• Redraws the scene (takes most of the processing time)
A sketch of the sampling step is shown below.
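A sketch of what the sampling module does for each serial sentence: parse the Euler angles and build one rotation matrix per IMU. The comma-separated sentence layout and the Z-Y-X rotation convention are assumptions; only the ‘\n’ terminator, the 50 Hz rate, and the three sensors come from the slides (the original module was Matlab).

```python
# Sketch: decode a serial sentence into rotation matrices.
import numpy as np

def euler_to_R(roll, pitch, yaw):
    """Z-Y-X (yaw-pitch-roll) rotation matrix; convention assumed."""
    cr, sr, cp, sp, cy, sy = (np.cos(roll), np.sin(roll), np.cos(pitch),
                              np.sin(pitch), np.cos(yaw), np.sin(yaw))
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def decode_sentence(line):
    """'r1,p1,y1,r2,p2,y2,r3,p3,y3\n' -> one rotation matrix per IMU."""
    vals = np.radians([float(v) for v in line.strip().split(',')])
    return [euler_to_R(*vals[i:i + 3]) for i in range(0, 9, 3)]
```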
17. Kinematic Model
Calibration
1-click calibration converts the sensors’ orientations with respect to the world reference frame into orientations with respect to the new (body-centered) reference frame; see the sketch below.
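A sketch of the 1-click calibration idea under stated assumptions: capture each sensor's world-frame rotation while the user holds the known home pose, store it, and re-express later readings relative to it. The class and variable names are hypothetical.

```python
# Sketch: store home-pose orientations, then express readings in body frame.
import numpy as np

class Calibration:
    def __init__(self):
        self.R_home = {}

    def click(self, R_world_now):
        """Store the home-pose orientation of every sensor (world frame)."""
        self.R_home = {name: R.copy() for name, R in R_world_now.items()}

    def to_body(self, name, R_world):
        """Orientation relative to the stored home pose (body frame)."""
        # R_body = R_home^T @ R_world: rotation since the calibration pose.
        return self.R_home[name].T @ R_world
```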
18. Kinematic Model
Calibration Issues
Problem: the IMU roll axis (X) is not parallel to the forearm’s roll axis
Solution: apply a linear correction to pitch
- but this introduces a problem at the Euler angles singularity…
[Charts: pitch (P) and roll (R) recorded during a forearm roll movement; the pitch offset grows with roll and is fit by the line y = 0.1026x.]
Conclusion
Future Work
Remarks
Demo
Introduction
Outline
Background
Hometown
Previous work
JPL Biosleeve
Proj. Stg 2:
Body Centered
Hardware
Software
Calibration
Recog.
Proj. Stg 1:
Hardware
Motion Detection
Gestures
Training/Recognition
Same forearm pitch, different IMU pitch; a sketch of the correction follows.
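A sketch of the linear correction, assuming the fitted line y = 0.1026x maps roll (x) to pitch offset (y) in degrees, as the chart suggests.

```python
# Sketch: remove the roll-dependent pitch offset caused by the tilted IMU axis.
SLOPE = 0.1026  # pitch offset per degree of roll, from the linear fit above

def correct_pitch(pitch_deg, roll_deg):
    """Linear pitch correction; sign and units assumed from the chart."""
    return pitch_deg - SLOPE * roll_deg
```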
19. Graphical User Interface
20. Graphical User Interface
21. Gesture Recognition
Using the Kinematic Model
Model: neural networks, validated with the k-fold method
Features: rotation matrices for the forearm and arm, with values rounded to one decimal place (see the sketch below)
Average recognition rate: >99.5%, but many false positives on untrained classes (a known NN problem)
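A sketch of the feature extraction described here: flatten the two rotation matrices and round to one decimal place before classification. The function name is an assumption; the training loop from the earlier sketch applies unchanged.

```python
# Sketch: rotation-matrix features, rounded to suppress sensor noise.
import numpy as np

def posture_features(R_body_arm, R_arm_forearm):
    feats = np.concatenate([R_body_arm.ravel(), R_arm_forearm.ravel()])
    return np.round(feats, 1)  # one decimal place, as on the slide
```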
22. Future Work
Avoid the use of the accelerometer for gyro drift correction:
- For the Canadarm2 demo: arm telemetry + magnetometer may work
- For the SPHERES demo (0-g flight): drift may be acceptable for less than 30 seconds (approx. 0-g time), with recalibration before every testing period
Create a new algorithm for the IMUs based on quaternions to avoid singularities, which are bound to happen in 0 g; see the sketch after this list.
The system is still not fully integrated with the BioSleeve.
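For the quaternion idea above, here is a standard first-order quaternion integration sketch: propagating orientation as a unit quaternion has no Euler-angle singularity. This is textbook kinematics, not code from the project.

```python
# Sketch: singularity-free orientation propagation with quaternions.
import numpy as np

def quat_integrate(q, gyro, dt):
    """q = (w, x, y, z) unit quaternion; gyro = body angular rate in rad/s."""
    w, x, y, z = q
    gx, gy, gz = gyro
    # dq/dt = 0.5 * q ⊗ (0, gx, gy, gz), expanded component-wise.
    dq = 0.5 * np.array([-x * gx - y * gy - z * gz,
                          w * gx + y * gz - z * gy,
                          w * gy - x * gz + z * gx,
                          w * gz + x * gy - y * gx])
    q = q + dq * dt
    return q / np.linalg.norm(q)  # renormalize to keep a unit quaternion
```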
23. Concluding
◦ Learned a lot about electronics: using sensors and programming microcontrollers for embedded applications
◦ Built a reliable way of knowing the relative orientation of the arm with respect to the body and the world
◦ Successfully tested ANNs for accurate arm posture recognition
◦ Thank you all!
Editor's Notes
Greet, thank
Present myself, where I am from
What program am I in
Showing a brief recap of what I’ve been working on
Mention that I’ve been working on inertial sensing for gesture recognition
In this presentation…
Start with an introduction:
Share my background
Explain what the project I am working on is
Then about the development I’ve done:
First stage: learning how to use the sensors
How to set up the electronic circuit
Using the sensor’s raw data to do gesture recognition
Second stage: adding a third IMU
Obtaining actual orientations in a body reference frame
Concluding with:
What the next developments should be
Adding some final remarks
And finishing with a live demo
Starting with my background:
Finished my MSc degree last year in Mechanical Engineering at the University of Coimbra, Portugal
I am currently starting a PhD at the same university, working in close collaboration with ENSAM, France
Interested in human-machine interfaces, based on machine learning to read and classify sensor data
Focus on smart manufacturing, enabling better interaction between workers and robots
Quickly talk about where I come from. On the top right.
Coimbra is the third biggest city in the country, after Lisbon and Oporto
Population: half a million in the county.
Known as the City of the Students; the city itself has 140k inhabitants and over 25k students
On the top of the hill, the main campus of the UC
The UC is the oldest university of the country, founded in 1290
Strong student community
Bottom left, the senior’s serenade: the goodbye to the city and the student life
Most of the students wear a traditional student dark attire with a black cape
Moving on, showing my previous work
A demo for my master’s thesis
Using a data glove and a 6 DOF magnetic tracker
This is a demonstration of a virtual joystick
With gestures to change types of commands sent
The project I’ve been working on at the JPL: Biosleeve
It’s a hand gesture recognition device
Worn on the forearm
Based on EMG
Hands-free
Allows for supervised control of several types of robots, from mobile robots to prosthetics
Its goal: to be worn by an astronaut on an EVA mission, under the suit, to control robots, or even for human interplanetary exploration
Problem: it can’t detect the arm’s orientation with respect to the user
That’s where I come in
The setup I started with:
Arduino board
Two low cost Sparkfun 9 DOF IMU sensor sticks
Gyros, Magnetometers and Accelerometers
I2C interfaces, sensor sticks have the same I2C addresses
So we have to use a multiplexer to read each sensor individually
At this point, the Arduino is used just to get calibrated sensor data
My first idea was designing a recognition system similar to what I did before
Using one IMU on the arm and the other on the wrist
The system was based on a motion segmentation algorithm
Dynamic and static parts are classified separately
A dynamic segment can either be movement epenthesis (ME) or a dynamic gesture (DG)
A static segment: an actual static gesture or a DG ending
These segments are classified in parallel and the outcome is selected by classification score
The motion segmentation algorithm is a sliding window threshold method
It assumes that there is movement if any of the features is above a certain threshold, inside a window of frames. This ensures a quick response to movement.
This creates a problem: how to select the thresholds and sliding window size?
The features chosen are usually the first derivatives of the variables and/or the second as well, if necessary
I treated the threshold selection and window size as an optimization problem
The idea is to get some data to use as ground truth
And then optimize the thresholds to obtain a system response close to the ground truth
For this instance I went old-school and I created a spreadsheet
Used the absolute value of the first derivatives for the accelerometer and gyro data
The thresholds have a lower limit of zero and an upper limit equal to the maximum value experienced in motion
The colored values are the correlations between the individual thresholds for the variables and the ground truth
They give an idea of what the most important variables are for the motion detection
This is the library of gestures I used for the static gesture recognition system
Most of these are commands recommended by some army people to make it more similar to the actual commands sent on the field
Some of them are dynamic but I treated them as static for simplicity, using the initial posture
I added a few to increase gesture count and complexity
Classification model: neural networks
The features were the differences between the measurements of the two sensors, accel and magn, gyros are irrelevant for static gestures
The values were rescaled and discretized to reduce input noise
Trained with 20 samples/ gesture
Used k-fold cross validation to ensure good NN performance with untrained samples
This way the training data is split into k subsets; the NN is trained with all but one of them and tested with the remaining one. Prevents overfitting
Each fold was tested 10 times to account for random initialization of the bias and weights, the average error percentage was taken
Tested with different architectures, in regards to the number of hidden nodes
Drew a graph of error rate (ER) vs. number of nodes
ER drops as nodes are added but quickly stabilizes at 2.5%
In the meantime, we got a new IMU, and the challenge is adding a third IMU to know the arm’s pose in the user’s POV
For this we need the full orientation for each sensor, with Euler angles or otherwise
We shouldn’t need to place the IMUs with precision
This is what the hardware looked like before and afterwards
I changed the processing board to a Teensy, with a 32-bit 72MHz ARM microprocessor
It’s smaller and more capable
The Arduino wasn’t able to calculate in time the orientation for the three sensors
You can see the sensor clips that I 3D printed to attach the sensors to clothing
The filter I used to estimate the orientation was provided by Sparkfun and it’s what is called an AHRS firmware
Based on DCM
Had to be customized to work with several IMUs through the multiplexer
The code was quickly adapted to work with the Teensy
It still uses the Arduino IDE, with different pin assignments and some minor changes in the code
The DCM algorithm uses the integrated gyroscope’s angular velocities and corrects their drift
Using the magnetometer as a compass to correct yaw
And accelerometer and gravity vector to correct pitch and roll
We get the Euler angles of the sensor’s orientation in respect to the world reference frame
On the computer side: running Matlab
The software has two modules: the GUI refresh module and the sampling module
The GUI module is triggered by a timer with a period set by the user; it’s responsible for redrawing the model and updating some of the interface’s values
It works well until around 24 fps
The sampling module actually reads the data from the USB connection at the rate of the Teensy algorithm
It runs whenever a terminator (newline) is reached, which is sent after the orientations are calculated for all the IMUs, at 50 Hz
It decodes the serial message, calculates and calibrates the rotation matrices for the model’s links
This is what the model looks like:
It has 3 links, the trunk, the upper arm and the forearm
Because we allow for inaccurate sensor placement, a quick calibration is performed every time the program is run
There’s a calibration home position, on the right
On the left, you can see what the model looks like when the sensors are placed randomly
During the calibration, we calculate the rotation matrices to rotate the links to the reference frames
This calibration method is not without issues.
The problem is that there is a pitch offset depending on the arm’s roll
This happens because the roll axis (X) of the IMU is not parallel to the arm’s actual X axis
The IMU has a conical motion
I got some data during a roll movement of the forearm
Analyzed the variation of pitch and roll
The pitch offset is minimum at the calibration orientation and varies to a maximum of around 10 degrees
I assumed a linear correction
I designed a new recognition system
Again using neural networks
Same methodology
This time using the rotation matrices linking the body to the arm, and the arm to the forearm
They were rounded to one decimal place
Trained with 20 samples/ gesture
The recognition rate was above 99.5% this time
Recognizes the gestures very well, but has problems with untrained classes, which happens a lot with NNs