Hand Posture Recognition with Standard Webcam for Natural Interaction
1. Applied Computing Group (ACG)
May 21, 2015
Title Slide
Juan Jesús Ojeda Castelo
joc028@inlumine.ual.es
11 April 2017, Universidad de Almería
2. Introduction
Application areas:
• Sign Language Recognition
• Education
• Medicine
3. Hand Recognition Process
The recognition process is organized into three stages:
• Background Subtraction
• Calibration
• Hand Pose Recognition
4. Background Subtraction
Segmentation is complemented with skin-color detection to separate the user from the background.
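The combination of the two masks can be sketched as follows. This is a minimal illustration, not the deck's actual implementation: the function names are mine, the background model is a single reference frame, and the skin rule is the classic fixed RGB heuristic (the slides do not specify which color model is used).

```python
import numpy as np

def subtract_background(frame, background, threshold=30):
    """Mark pixels whose color differs enough from a reference background frame."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff.max(axis=2) > threshold  # boolean foreground mask

def skin_mask(frame):
    """Fixed RGB skin-color rule; frame is an H x W x 3 uint8 RGB image."""
    r = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    b = frame[..., 2].astype(int)
    return (r > 95) & (g > 40) & (b > 20) & (r - g > 15) & (r > b)

def segment_user(frame, background):
    """Keep only foreground pixels that also look like skin."""
    return subtract_background(frame, background) & skin_mask(frame)
```

Intersecting the two masks is what makes the segmentation fast: the skin rule prunes background motion, and the background mask prunes skin-colored static objects.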
5. Calibration Process
1. Detect the user's face with a Cascade Classifier.
2. Extract the skin-color pixels from the face region.
3. Detect the hands.
4. Save a reference image of the hands.
5. Extract geometric feature points from the hand contour.
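Step 5 can be illustrated with a small sketch. The slides do not say which geometric features are extracted, so as an assumption this computes the convex hull of the hand mask (a common choice for hand-contour features); the function names are hypothetical.

```python
import numpy as np

def convex_hull(points):
    """Andrew's monotone chain convex hull; points is a list of (x, y) tuples."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # > 0 for a counter-clockwise turn o -> a -> b
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def hand_feature_points(mask):
    """Extract hull vertices from a binary hand mask (H x W boolean array)."""
    ys, xs = np.nonzero(mask)
    return convex_hull(list(zip(xs.tolist(), ys.tolist())))
```

The hull vertices are a compact point set that can later be checked against the calibrated reference (fingertips and the wrist tend to appear as hull vertices).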
6. Hand Pose Recognition
The pose recognition consists of:
1. Checking the points obtained from geometric feature extraction.
2. Comparing the current image with the image saved during calibration.
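The conclusions mention SSIM for step 2; as an illustrative sketch (not the authors' code), here is a simplified single-window SSIM between two grayscale images. Standard SSIM averages this over local windows, so this global variant is an assumption made to keep the example short.

```python
import numpy as np

def ssim_global(x, y, L=255):
    """Simplified single-window SSIM between two same-shape grayscale images."""
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    C1 = (0.01 * L) ** 2  # stabilizing constants from the SSIM definition
    C2 = (0.03 * L) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cxy = ((x - mx) * (y - my)).mean()  # covariance
    return ((2 * mx * my + C1) * (2 * cxy + C2)) / \
           ((mx ** 2 + my ** 2 + C1) * (vx + vy + C2))
```

SSIM is 1.0 for identical images and drops toward 0 as structure diverges, so comparing the current frame against the calibration image gives a similarity score that can be thresholded for pose matching.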
7. Results
The test setup:
• 3 participants
• Different backgrounds
• 180 videos
• 1800 movements
Accuracy by motion and distance:

Motions                                    60 cm            1 m              Two distances
Opening and closing hand motions           576/600 (96%)    561/600 (93.5%)  1137/1200 (94.75%)
Horizontal motions with the closed hand    549/600 (91.5%)  531/600 (88.5%)  1080/1200 (90%)
Vertical motions with the closed hand      543/600 (90.5%)  519/600 (86.5%)  1062/1200 (88.5%)

Overall accuracy: 91%
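As a quick arithmetic cross-check of the table (plain Python; the row labels are mine), the per-motion totals and the overall 91% figure are consistent with the per-distance counts:

```python
# Correct counts out of 600 attempts at 60 cm and 1 m, per motion type.
rows = {
    "open/close": (576, 561),
    "horizontal": (549, 531),
    "vertical":   (543, 519),
}
totals = {name: a + b for name, (a, b) in rows.items()}
overall = sum(totals.values()) / (3 * 1200)  # 3600 attempts in total, ~0.9108
```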
8. Conclusions
• Background subtraction and skin-color detection are combined to detect the hands faster.
• Geometric feature extraction is combined with SSIM frame comparison to achieve better accuracy.
• The overall accuracy of the process is 91%.
9. Future Work
• Add face recognition.
• Recognize more gestures and integrate them into the ENIA Project.
• For ambient intelligence, integrate this natural interaction system into intelligent buildings and smart homes.