3. Context of the problem
3D mid-air gestures are emerging as an alternative method of interaction
Many sensors can be used, but each type has some limitations
7. Radar sensors:
• Less privacy concerns
• Not sensitive to lighting conditions
• Less intrusive
rfbeam.ch
ti.com
projects-raspberry.com
8. However…
• Less mature than other solutions
• Sensitive to noise (e.g., antenna effects)
• Sensitive to clutter
• Data interpretation can be complex
10-14. [Figure: a radar among walls and furniture; internal reflections and transmissions act as clutter sources]
Signal from multiple receivers: identify hand motion from raw radar data
16. Some hypotheses
1. Walabot device
2. Hand gestures
3. Users can add/remove gestures nearly instantaneously
Template matching recognizers:
• Fast execution time
• Very short/no training time
• Few training templates
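A template matching recognizer fits these hypotheses because adding or removing a gesture only means adding or removing stored templates, with no retraining step. A minimal sketch (hypothetical illustration, not the actual framework's code) using dynamic time warping as the distance measure:

```python
import math

def dtw_distance(a, b):
    """Dynamic time warping distance between two sequences of 2D points."""
    n, m = len(a), len(b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(a[i - 1], b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

class TemplateRecognizer:
    """Nearest-neighbour recognizer over stored templates: adding or removing
    a gesture is just a dictionary update, so there is no training step."""

    def __init__(self):
        self.templates = {}  # gesture name -> list of example sequences

    def add_gesture(self, name, sequence):
        self.templates.setdefault(name, []).append(sequence)

    def remove_gesture(self, name):
        self.templates.pop(name, None)

    def recognize(self, sequence):
        best_name, best_dist = None, float("inf")
        for name, examples in self.templates.items():
            for template in examples:
                dist = dtw_distance(sequence, template)
                if dist < best_dist:
                    best_name, best_dist = name, dist
        return best_name
```

Because adding a gesture is a single `add_gesture` call, user customization is near-instantaneous, which is exactly what the third hypothesis requires.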
19. Plan of the presentation
1. Research methodology
2. Current status & next steps
3. Conclusion
34. Gesture elicitation studies
• Multiple studies have been conducted:
• Send commands to IoT devices
• Gestures in emergency situations
• Interact with an office door
• Results not yet analyzed
45. Software environment for gesture recognition
Framework for engineering gesture-based applications, originally built for the LMC (Leap Motion Controller)
46. Software environment for gesture recognition (challenges)
• Integration of the radar pre-processing pipeline
• Gesture segmentation from a continuous stream of radar data
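Gesture segmentation from a continuous stream could, for instance, rely on a simple energy threshold over successive radar frames. A hypothetical sketch (the threshold and minimum segment length are illustrative values, not the project's actual parameters):

```python
def frame_energy(frame):
    """Total energy of one radar frame (sum of squared sample amplitudes)."""
    return sum(v * v for v in frame)

def segment_stream(frames, threshold, min_frames=3):
    """Return (start, end) frame-index pairs where the energy stays above
    the threshold for at least min_frames consecutive frames."""
    segments, start = [], None
    for i, frame in enumerate(frames):
        if frame_energy(frame) > threshold:
            if start is None:
                start = i  # a candidate gesture begins
        else:
            if start is not None and i - start >= min_frames:
                segments.append((start, i))  # long enough to keep
            start = None
    if start is not None and len(frames) - start >= min_frames:
        segments.append((start, len(frames)))  # stream ended mid-gesture
    return segments
```

Each returned segment would then be handed to the recognizer, instead of assuming pre-cut recordings.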
51. Expected contributions
• Systematic literature review (SLR)
• Gesture Elicitation Studies
• Datasets of radar gestures
• Pipeline for pre-processing radar gestures
• Framework for (radar) gesture recognition
• Benchmarks
• Radar gesture-based application for IoT or multimedia
52. Many challenges remain…
• Multi-user interaction
• Improve performance across different users
• Achieve real-time radar gesture recognition
• Jump from the lab to real life
• Interference?
• Gesture segmentation
• Building real apps
Many sensors can be used, but each type has some limitations that may be problematic in some situations
Perform a systematic literature review (SLR) to identify radar systems, algorithms, gesture sets,…
Conduct gesture elicitation studies to explore user-defined gestures in various environments
Then, using the information from the SLR and gesture elicitation studies, we will acquire some new gesture sets that will be made public
Then, we’ll design an environment for gesture recognition, in two parts:
The first part will process radar signal to remove noise and clutter, and extract relevant data from the signal
The second part will handle the process of gesture recognition and send the results to applications
Finally, we will evaluate the system by testing the performance of gesture recognition and creating radar-based applications.
First implementation, the subject of one (soon two) papers
Modular architecture for gesture recognition
API for associating actions to gestures
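Such an API could, for example, let applications register callbacks on gesture names. A hypothetical sketch (the class and method names are invented for illustration, not the framework's real API):

```python
class GestureActionMap:
    """Registry that binds application callbacks to recognized gesture
    names; dispatch() would be called by the recognizer when a gesture
    is detected."""

    def __init__(self):
        self._actions = {}  # gesture name -> list of callbacks

    def on(self, gesture_name):
        """Decorator registering a callback for a gesture name."""
        def decorator(fn):
            self._actions.setdefault(gesture_name, []).append(fn)
            return fn
        return decorator

    def dispatch(self, gesture_name, **event):
        """Invoke every callback bound to the recognized gesture."""
        for fn in self._actions.get(gesture_name, []):
            fn(**event)

# Example: bind a multimedia action to a swipe gesture.
actions = GestureActionMap()

@actions.on("swipe_left")
def previous_track(**event):
    pass  # an application would trigger its media player here
```

Keeping this mapping separate from recognition is what makes the architecture modular: the same recognizer can drive IoT or multimedia applications by swapping the action map.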