Assisting EMG-based Gesture Recognition Devices with Inertial Sensing
MIGUEL ÂNGELO SIMÃO (Portugal)
JVSRP, February to July 2015
Final Presentation, 7/31/2015
Presentation Outline
Introduction
- Background
- The project
Development
- Stage 1: using sensor raw data
- Stage 2: using a kinematic model
Conclusion
- Future work
- Final remarks
- Demo
Background
• Graduated in July 2014: MSc in Mechanical Engineering, University of Coimbra (UC), Portugal
• Currently an EU PhD student at the UC and ENSAM, France
• Human-machine interfaces targeted at smart manufacturing
University of Coimbra
Previous Work
JPL's BioSleeve
• Based on electromyography (EMG) signals
• Hand gesture recognition
• Supervised control
• But doesn't account for arm posture -> inertial sensing (IMUs)
Motion Capture Hardware
Arduino UNO R3
• Processing unit
• I2C interface (digital)
• Serial communications with a computer

Sparkfun Multiplexer
• Allows multiple sensors with the same I2C address to be used

Sparkfun 9 DOF IMU (x2)
• Gyroscope
• Magnetometer
• Accelerometer
• I2C interface only
Gesture Recognition with Sensor Data
Motion Based Segmentation
[Figure: per-DOF sensor readings over frames, with a sliding window w and a binary motion signal (0/1)]
Motion segments can be either:
- Movement epenthesis
- A dynamic gesture
Pause segments can be either:
- A static gesture
- A dynamic gesture (DG) ending
The two segment types are handled by parallel classification models, driven by a motion detection step (a minimal sketch follows).
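The motion detector itself can be summarized in a few lines. This is only a minimal MATLAB sketch of a sliding-window threshold detector, not the project's code; `data`, `thr`, and `w` are hypothetical inputs (sensor frames, per-channel thresholds, window size).

```matlab
% Minimal sketch of a sliding-window threshold motion detector.
% Assumptions: data is an N-by-D matrix of sensor readings (one row per frame),
% thr is a 1-by-D vector of per-channel thresholds, w is the window size in frames.
function motion = detectMotion(data, thr, w)
    d = abs(diff(data, 1, 1));               % first derivatives of all variables
    d = [d(1, :); d];                        % pad so there is one value per frame
    over = any(bsxfun(@gt, d, thr), 2);      % any channel above its threshold?
    motion = false(size(over));
    for k = 1:numel(over)
        win = over(max(1, k - w + 1):k);     % window of the last w frames
        motion(k) = any(win);                % motion if anything moved in the window
    end
end
```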
Gesture Recognition with Sensor Data
Motion Detection
Features: first derivatives of all the variables
Open questions: which thresholds? which window size?
These are treated as an optimization problem: run the sensor data through the motion detector, compare its output against a ground-truth labelling, and minimize the difference (a sketch of this search follows).
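A hedged sketch of what that optimization could look like in MATLAB, reusing the `detectMotion` sketch above. The grid values, the single shared threshold, and the variable names (`data`, `groundTruth`) are assumptions for illustration, not the values used in the project.

```matlab
% Sketch: pick the threshold and window size that make the detector output
% match a hand-labelled ground truth as closely as possible (grid search).
thrGrid = linspace(0.01, 0.5, 25);     % candidate thresholds (illustrative)
winGrid = 2:2:20;                      % candidate window sizes in frames
bestErr = inf;
for t = thrGrid
    for w = winGrid
        out = detectMotion(data, t * ones(1, size(data, 2)), w);
        err = mean(out ~= groundTruth);          % fraction of mislabelled frames
        if err < bestErr
            bestErr = err;  bestThr = t;  bestWin = w;
        end
    end
end
fprintf('Best threshold %.3f, window %d, frame error %.1f%%\n', ...
        bestThr, bestWin, 100 * bestErr);
```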
Gesture Recognition with Sensor Data
What it actually looks like:
[Figure: motion-detection system output compared against the ground truth]
Gesture Recognition with Sensor Data
Gesture Classification: Library
[Figure: the library of gestures used for static gesture recognition]
Gesture Recognition with Sensor Data
Gesture Classification: Supervised Training
Classification model: artificial neural networks (ANN)
Features: acceleration and magnetic-field differences between consecutive sensors, discretized into 5 classes
Training samples: 20 per gesture
K-fold cross-validation: 4 folds (a training sketch follows the figure below)
[Figure: average classification error over 10 runs vs. number of hidden-layer neurons; the error decreases with the number of neurons and quickly levels off]
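The supervised training just described can be sketched with MATLAB's Neural Network Toolbox (`patternnet`). This is a hedged illustration rather than the project's code: `X` (features by samples) and `T` (one-hot gesture labels) are assumed inputs, the hidden-layer size of 10 is arbitrary, and the fold assignment is hand-rolled.

```matlab
% Sketch: 4-fold cross-validation of a one-hidden-layer ANN gesture classifier.
k = 4;                                        % folds, as on the slide
nHidden = 10;                                 % hidden-layer size under test
folds = mod(randperm(size(X, 2)), k) + 1;     % random fold label per sample
errRate = zeros(k, 1);
for f = 1:k
    testIdx  = (folds == f);
    trainIdx = ~testIdx;
    net = patternnet(nHidden);                % Neural Network Toolbox classifier
    net.trainParam.showWindow = false;
    net = train(net, X(:, trainIdx), T(:, trainIdx));
    [~, pred]  = max(net(X(:, testIdx)), [], 1);
    [~, truth] = max(T(:, testIdx), [], 1);
    errRate(f) = mean(pred ~= truth);
end
fprintf('Average error over %d folds: %.1f%%\n', k, 100 * mean(errRate));
```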
Body-Centered System
Problem:
- Arm’s orientation in the trunk reference frame?
Challenges:
- Adding a third IMU
- No precise placement of the IMUs
- Creating a kinematic model
Hardware Layout
Previous iteration vs. new iteration
[Photos of the hardware, including the 3D-printed clothing clipping mechanism]
Software
Microcontroller: Teensy
Orientation: Sparkfun's AHRS firmware, DCM algorithm at 50 Hz
- Orientation from the gyroscope's integrated angular velocities
Drift correction:
- Magnetometer for yaw
- Accelerometer for pitch and roll
Output: Euler angles
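As an illustration of the drift-correction references only (not the full Sparkfun DCM/AHRS code), pitch and roll can be derived from the accelerometer and a tilt-compensated yaw from the magnetometer. `a` and `m` are assumed calibrated 3-element readings; the exact signs depend on how the sensor is mounted.

```matlab
% Sketch: reference angles used to bound gyro drift in a DCM-style filter.
% a = calibrated accelerometer reading [ax ay az], m = magnetometer [mx my mz].
roll  = atan2(a(2), a(3));                              % gravity gives roll...
pitch = atan2(-a(1), sqrt(a(2)^2 + a(3)^2));            % ...and pitch
% Tilt-compensate the magnetometer, then take yaw (heading) in the horizontal plane.
mx = m(1)*cos(pitch) + m(2)*sin(roll)*sin(pitch) + m(3)*cos(roll)*sin(pitch);
my = m(2)*cos(roll) - m(3)*sin(roll);
yaw = atan2(-my, mx);
refEuler = [roll, pitch, yaw];    % fed back to the filter as the drift reference
```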
Software
Data flow: Teensy -> Euler angles -> serial over USB -> PC/Matlab

Sampling Module
• Run whenever the serial terminator ('\n') is reached (@50 Hz)
• Decodes the serial sentence
• Calculates rotation matrices
• Calculates calibrated transformation matrices

GUI Refresh Module
• Triggered by a Matlab timer (period set by user)
• Redraws the scene (takes most of the processing time)
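A hedged sketch of the sampling module in MATLAB. The port name, baud rate, and sentence layout (nine comma-separated angles, three per IMU) are assumptions, and the modern `serialport` API is used for brevity; the original project may well have used the older `serial` interface.

```matlab
% Sketch: serial sampling callback that fires on every newline-terminated
% sentence from the Teensy (~50 Hz), decodes it and builds rotation matrices.
sp = serialport("COM3", 115200);               % port and baud rate are assumptions
configureTerminator(sp, "LF");
configureCallback(sp, "terminator", @sampleCallback);

function sampleCallback(sp, ~)
    sentence = char(readline(sp));                   % one frame of Euler angles
    ang = reshape(sscanf(sentence, '%f,'), 3, []);   % [roll; pitch; yaw] per IMU
    for i = 1:size(ang, 2)
        Rworld = eul2rot(ang(1,i), ang(2,i), ang(3,i));
        % ... apply the stored calibration matrix and update the arm model here
    end
end

function R = eul2rot(roll, pitch, yaw)         % Z-Y-X Euler angles to a matrix
    Rx = [1 0 0; 0 cos(roll) -sin(roll); 0 sin(roll) cos(roll)];
    Ry = [cos(pitch) 0 sin(pitch); 0 1 0; -sin(pitch) 0 cos(pitch)];
    Rz = [cos(yaw) -sin(yaw) 0; sin(yaw) cos(yaw) 0; 0 0 1];
    R  = Rz * Ry * Rx;
end
```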
Kinematic Model
Calibration
1-click calibration: transforms the sensors' orientations with respect to the world reference frame into orientations with respect to the new (body-centered) reference frame. (A minimal worked example follows.)
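The calibration reduces to storing one corrective rotation per sensor. A minimal sketch, assuming `Rhome` is a sensor's 3x3 world-frame rotation matrix captured at the click (user in the home posture) and `Rref` is that link's desired orientation in the body-centered frame (often the identity); these names are illustrative, not the project's.

```matlab
% 1-click calibration sketch: capture the home pose once, then correct every
% subsequent sample with the stored rotation.
Rcal  = Rref * Rhome';      % corrective rotation for this sensor (Rhome' = inverse)
% ... later, for every incoming world-frame sample Rmeas:
Rbody = Rcal * Rmeas;       % sensor orientation in the calibrated body frame
```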
Kinematic Model
Calibration Issues
Problem: the IMU roll axis (X) is not parallel to the forearm's roll axis
Solution: apply a linear correction to pitch
- but this introduces a problem at the Euler-angle singularity…
[Figure: roll (R) and pitch (P) of the IMU recorded during a forearm roll movement; the pitch offset grows with roll, with a linear fit of y = 0.1026x]
Same forearm pitch, different IMU pitch…
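A tiny sketch of the linear pitch correction; the slope 0.1026 is the fitted value from the plot above, while everything else (degree units, sign, roll measured from the calibration pose) is an assumed implementation detail.

```matlab
% Sketch: remove the roll-induced pitch offset with the fitted line y = 0.1026x.
slope = 0.1026;                              % pitch offset (deg) per degree of roll
pitchCorrected = pitch - slope * roll;       % roll measured from the calibration pose
% Caveat (as noted above): near the Euler-angle singularity (pitch ~ +/-90 deg)
% the roll angle is poorly defined, so this correction misbehaves there.
```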
Graphical User Interface
Gesture Recognition
Using the Kinematic Model
Model: neural networks, validated with the k-fold method
Features: rotation matrices for the forearm and the arm (values rounded to one decimal place)
Average recognition rate (RR): >99.5%, but many false positives on untrained classes (a common neural-network problem)
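The feature vector for this classifier can be written in one line. A sketch, assuming `Rforearm` and `Rarm` are the calibrated 3x3 rotation matrices for the two links (names are illustrative).

```matlab
% Sketch: flatten the two rotation matrices and round to one decimal place,
% giving an 18-element feature vector (one column of the ANN training matrix).
feat = round([Rforearm(:); Rarm(:)], 1);
```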
Future Work
Avoid using the accelerometer for gyro drift correction:
- For the Canadarm2 demo: arm telemetry + magnetometer may work
- For the SPHERES demo (0G flight): drift may be acceptable for less than 30 seconds (approx. 0G time), with recalibration before every testing period
Create a new algorithm for the IMUs based on quaternions, to avoid singularities (bound to happen in 0G)
Still not fully integrated with the BioSleeve
Concluding
◦ Learned a lot about electronics: using sensors and programming microcontrollers for embedded applications
◦ Built a reliable way of knowing the relative orientation of the arm with respect to the body and the world
◦ Successfully tested ANNs for accurate arm posture recognition
◦ Thank you all!
Editor's Notes
  1. Greet, thank. Present myself: where I am from and what program I am in. Show a brief recap of what I have been working on. Mention that I have been working on inertial sensing for gesture recognition.
  2. In this presentation: start with an introduction, sharing my background and explaining what the project I am working on is. Then the development I have done. First stage: learning how to use the sensors, how to set up the electronic circuit, and using the sensors' raw data to do gesture recognition. Second stage: adding a third IMU and obtaining actual orientations in a body reference frame. Concluding with what the next developments should be, adding some final remarks, and finishing with a live demo.
  3. Starting with my background: finished my MSc degree last year in Mechanical Engineering, given by the University of Coimbra, Portugal. Currently a starting PhD student at the same university, working in close proximity with ENSAM, France. Interested in human-machine interfaces, based on machine learning to read and classify sensor data, with a focus on smart manufacturing, enabling a better interaction between workers and robots.
  4. Quickly talk about where I come from (top right). Coimbra is the third biggest city in the country, after Lisbon and Oporto, with about half a million people in the county. Known as the City of the Students: the city itself has around 140k inhabitants and over 25k students. On top of the hill is the main campus of the UC, the oldest university in the country, founded in 1290, with a strong student community. Bottom left, the seniors' serenade: the goodbye to the city and to student life. Most of the students wear the traditional dark student attire with a black cape.
  5. Moving on, showing my previous work: a demo for my master's thesis, using a data glove and a 6 DOF magnetic tracker. This is a demonstration of a virtual joystick, with gestures to change the types of commands sent.
  6. The project I have been working on at JPL: the BioSleeve. It is a hand gesture recognition device, worn on the forearm, based on EMG, and hands-free. It allows supervised control of several types of robots, from mobile robots to prosthetics. Its goal: to be worn by an astronaut on an EVA mission, under the suit, to control robots, or even for human interplanetary exploration. Problem: it cannot detect the arm's orientation with respect to the user. That is where I come in.
  7. The setup I started with: an Arduino board and two low-cost Sparkfun 9 DOF IMU sensor sticks (gyros, magnetometers and accelerometers). They have I2C interfaces, and the sensor sticks share the same I2C addresses, so we have to use a multiplexer to read each sensor individually. At this point, the Arduino is used just to get calibrated sensor data.
  8. My first idea was designing a recognition system similar to what I did before, using one IMU on the arm and the other on the wrist. The system was based on a motion segmentation algorithm: dynamic and static parts are classified separately. A dynamic segment can be either movement epenthesis (ME) or a dynamic gesture (DG); a static segment can be an actual gesture or a DG ending. These segments are classified in parallel and the outcome is selected by classification score. The motion segmentation algorithm is a sliding-window threshold method: it assumes that there is movement if any of the features is above a certain threshold inside a window of frames. This ensures a quick response to movement.
  9. This creates a problem: how to select the thresholds and the sliding-window size? The features chosen are usually the first derivatives of the variables, and/or the second derivatives as well if necessary. I treated the threshold selection and window size as an optimization problem: the idea is to get some data to use as ground truth and then optimize the thresholds to obtain a system response close to the ground truth.
  10. For this instance I went old-school and created a spreadsheet. I used the absolute value of the first derivatives of the accelerometer and gyro data. The thresholds have a lower limit of zero and an upper limit equal to the maximum value experienced in motion. The colored values are the correlations between the individual thresholds for the variables and the ground truth; they give an idea of which variables are the most important for motion detection.
  11. This is the library of gestures I used for the static gesture recognition system. Most of these are commands recommended by some army people, to make them more similar to the actual commands sent in the field. Some of them are dynamic, but I treated them as static for simplicity, using the initial posture. I added a few to increase gesture count and complexity.
  12. Classification model: neural networks. The features were the differences between the measurements of the two sensors, accelerometer and magnetometer; the gyros are irrelevant for static gestures. The values were rescaled and discretized to reduce input noise. Trained with 20 samples per gesture. Used k-fold cross-validation to ensure good NN performance with untrained samples: the training data is split into k subsets, the NN is trained with one of them and tested with the remaining, which prevents overfitting. Each fold was tested 10 times to account for the random initialization of the biases and weights, and the average error percentage was taken. Tested different architectures with respect to the number of hidden nodes and drew the following graph, error rate vs. number of nodes: the error rate lowers with the number of nodes but quickly stabilizes at 2.5%.
  13. In the meantime we got a new IMU, and the challenge is adding a third IMU to know the arm's pose from the user's point of view. For this we need the full orientation of each sensor, with Euler angles or otherwise. We should not need to place the IMUs with precision.
  14. This is what the hardware looked like before and afterwards. I changed the processing board to a Teensy, with a 32-bit 72 MHz ARM microprocessor: it is smaller and more capable, and the Arduino was not able to calculate the orientation for the three sensors in time. You can see the sensor clips that I 3D-printed to attach the sensors to clothing.
  15. The filter I used to estimate the orientation was provided by Sparkfun and is what is called an AHRS firmware, based on DCM. It had to be customized to work with several IMUs through the multiplexer. The code was quickly adapted to work with the Teensy: it still uses the Arduino IDE, with a different pin assignment and some minor changes in the code. The DCM algorithm integrates the gyroscope's angular velocities and corrects their drift, using the magnetometer as a compass to correct yaw, and the accelerometer and gravity vector to correct pitch and roll. We get the Euler angles of each sensor's orientation with respect to the world reference frame.
  16. On the computer side: running Matlab. The software has two modules, the GUI refresh module and the sampling module. The GUI module is triggered by a timer with a period set by the user; it is responsible for redrawing the model and updating some of the interface's values, and it works well until around 24 fps. The sampling module actually reads the data from the USB connection at the rate of the Teensy algorithm. It is run whenever a terminator (newline) is reached, which is sent after the orientations are calculated for all the IMUs (@50 Hz). It decodes the serial message and calculates and calibrates the rotation matrices for the model's links.
  17. This is what the model looks like: it has 3 links, the trunk, the upper arm and the forearm. Because we allow for inaccurate sensor placement, a quick calibration is performed every time the program is run. There is a calibration home position, on the right. On the left, you can see what the model looks like when the sensors are placed randomly. During the calibration, we calculate the rotation matrices that rotate the links to the reference frames.
  18. This calibration method is not without issues. The problem is that there is a pitch offset depending on the arm's roll. This happens because the roll axis (X) of the IMU is not parallel to the arm's actual X axis: the IMU describes a conical motion. I got some data during a roll movement of the forearm and analyzed the variation of pitch and roll. The pitch offset is minimum at the calibration orientation and varies up to a maximum of around 10 degrees. I assumed a linear correction.
  19. I designed a new recognition system, again using neural networks with the same methodology, this time using the rotation matrices linking the body to the arm, and the arm to the forearm. They were rounded to one decimal place. Trained with 20 samples per gesture. The recognition rate was above 99.5% this time. It recognizes the gestures very well, but has problems with untrained classes, which happens a lot with NNs.