MEMS (MICRO-ELECTRO-MECHANICAL SYSTEM)
ACCELEROMETER BASED NONSPECIFIC-USER
HAND GESTURE RECOGNITION
APARNA C BHADRAN
1. INTRODUCTION
2. GESTURE MOTION ANALYSIS
3. SENSING SYSTEM OVERVIEW
4. GESTURE SEGMENTATION
5. GESTURE RECOGNITION BASED ON SIGN SEQUENCE AND HOPFIELD NETWORK
6. GESTURE RECOGNITION BASED ON VELOCITY INCREMENT
7. GESTURE RECOGNITION BASED ON SIGN SEQUENCE AND TEMPLATE MATCHING
• The increase in human-machine interactions has made user interface technology increasingly important.
• MEMS: Micro-Electro-Mechanical System.
• Physical gestures will
  1. greatly ease the interaction process
  2. enable humans to command computers more naturally
• Example applications:
  – controlling a television set remotely
  – using a hand as a 3-D mouse
• Many existing devices can capture gestures, e.g., the touch tablet.
• The technology employed for capturing gestures can be relatively expensive.
• Therefore, a Micro Inertial Measurement Unit (micro-IMU) is used instead.
• There are two types of gesture recognition methods: vision based and accelerometer based.
• Vision based recognition has limitations such as
  – unexpected optical noise
  – slower dynamic response
• So the recognition system is implemented based on MEMS acceleration sensors.
• The acceleration patterns are not transformed into the frequency domain; they are recognized directly in the time domain.
• There are three different gesture recognition models:
  1) sign sequence and Hopfield network based gesture recognition
  2) velocity increment based gesture recognition
  3) sign sequence and template matching based gesture recognition
GESTURE MOTION ANALYSIS
• Gesture motions are in the vertical plane, i.e., the x-z plane.
• The alternate sign changes of acceleration on the two axes are required to differentiate any one of the 7 gestures.
• For example, the gesture "up" has
  – acceleration on the z-axis in the order: negative → positive → negative
  – no acceleration on the x-axis
• 1 → 3: acceleration on the z-axis is negative; velocity changes from zero to a maximum value at point 3, where the acceleration is zero.
• 3 → 4: acceleration on the z-axis is positive; velocity changes from negative to positive and is maximum at point 4, where the acceleration is again zero.
• 4 → 1: acceleration on the z-axis is negative; velocity changes from positive back to zero.
• Acceleration and velocity both become zero at point 1.
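The sign-change analysis above can be sketched in code. `extract_sign_sequence` is a hypothetical helper (not from the original system) that collapses an acceleration trace into its ordered sequence of nonzero signs, with a small threshold standing in for "no acceleration":

```python
def extract_sign_sequence(accel, threshold=0.05):
    """Collapse an acceleration trace into its ordered sign sequence.

    Consecutive samples with the same sign are merged; samples whose
    magnitude is below `threshold` are treated as zero and skipped.
    """
    signs = []
    for a in accel:
        if abs(a) < threshold:
            continue  # near-zero sample: treat as no acceleration
        s = 1 if a > 0 else -1
        if not signs or signs[-1] != s:
            signs.append(s)  # record only sign changes
    return signs

# The "up" gesture produces negative -> positive -> negative on the z-axis:
z_trace = [0.0, -0.4, -0.6, -0.2, 0.3, 0.7, 0.2, -0.3, -0.5, -0.1, 0.0]
print(extract_sign_sequence(z_trace))  # [-1, 1, -1]
```

The threshold value is an assumption; in practice it would be tuned to the sensor's noise floor.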
SENSING SYSTEM OVERVIEW
1. Sensor Description
• The sensing system utilized for hand motion data collection is based on MEMS sensors that detect accelerations along three perpendicular axes.
2. System Work Flow
1. The system is switched on.
2. The accelerations in three perpendicular directions are detected by the MEMS sensors.
3. The data are transmitted to a PC via the Bluetooth protocol.
4. The data are passed through a segmentation process.
5. The processed data are recognized by a comparison program to determine the presented gesture.
A. Data Acquisition
• The sensing devices should be held horizontally during the whole data collection process.
• The time interval between two gestures should be no less than 0.2 seconds.
B. Gesture Segmentation
1. Data Preprocessing:
• Raw data received from the sensors are preprocessed by two processes:
  a) the vertical-axis offsets are removed from the data
  b) a filter is applied to the data sets to eliminate high-frequency noise
• The purpose of segmentation is to find the terminal points of each gesture in a data set.
• The conditions for determining the gesture terminal points are:
  a) amplitude of the points
  b) point separation
  c) mean value
  d) distance from the nearest intersection
  e) sign variation between two successive points
• All five conditions are checked separately on the x- and z-axis acceleration data.
• Two matrices are generated for each gesture sequence data set.
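A minimal sketch of the two preprocessing steps, assuming the offset is estimated as the mean of the trace and a simple moving average stands in for the (unspecified) high-frequency noise filter:

```python
def remove_offset(trace):
    """Remove the vertical-axis offset by subtracting the mean value
    (mean-subtraction is an assumption about how the offset is estimated)."""
    mean = sum(trace) / len(trace)
    return [a - mean for a in trace]

def moving_average(trace, window=3):
    """Suppress high-frequency noise with a simple moving average,
    a stand-in for the filter used in the original system."""
    half = window // 2
    out = []
    for i in range(len(trace)):
        lo, hi = max(0, i - half), min(len(trace), i + half + 1)
        out.append(sum(trace[lo:hi]) / (hi - lo))
    return out

raw = [1.2, 1.1, 1.4, 0.9, 1.0, 1.3]
print(moving_average(remove_offset(raw)))
```

Terminal-point detection would then scan the smoothed trace against the five conditions listed above.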
GESTURE RECOGNITION BASED ON SIGN
SEQUENCE AND HOPFIELD NETWORK
1) Feature Extraction:
• The gestures whose motions are on one axis are separated from those which involve 2-D motions.
• The sign sequence on each axis forms the gesture code, e.g., 1, -1, 1, -1.
2) Gesture Encoding:
• Before recognition, the obtained gesture code is encoded so that it can be restored later by the Hopfield network.
• The maximum number of signs for one gesture on one axis is four, so if the x- and z-axis sign sequences are combined, there are eight numbers in total in one gesture code.
• The input for a Hopfield network can only be "1" or "-1", so the positive sign, negative sign and zero are encoded using the following rules:
  – "1 1" represents a positive sign
  – "-1 -1" represents a negative sign
  – "1 -1" represents zero
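The encoding rules can be written directly as a lookup table; the per-axis sequence is padded to the maximum of four signs (padding with zero is an assumption), so each axis yields eight bipolar values:

```python
# Bipolar encoding of sign values for the Hopfield network:
#   +1 -> (1, 1),   -1 -> (-1, -1),   0 -> (1, -1)
ENCODING = {1: (1, 1), -1: (-1, -1), 0: (1, -1)}

def encode_axis(signs, max_signs=4):
    """Pad a per-axis sign sequence to `max_signs` entries and map each
    entry to its bipolar pair (zero-padding is an assumption)."""
    padded = list(signs) + [0] * (max_signs - len(signs))
    code = []
    for s in padded:
        code.extend(ENCODING[s])
    return code

# z-axis sign sequence of the "up" gesture:
print(encode_axis([-1, 1, -1]))  # [-1, -1, 1, 1, -1, -1, 1, -1]
```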
3) Hopfield Network as Associative Memory:
• The involvement of the Hopfield network makes the recognition more fault tolerant.
• The network can retrieve patterns that have been stored previously.
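A minimal discrete Hopfield network with Hebbian learning illustrates the pattern-restoration idea; this is a generic textbook sketch, not the exact network from the original work:

```python
def train_hopfield(patterns):
    """Build the Hebbian weight matrix for bipolar patterns (zero diagonal)."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def recall(w, state, steps=5):
    """Synchronously update the state so it settles on a stored pattern."""
    n = len(state)
    for _ in range(steps):
        state = [1 if sum(w[i][j] * state[j] for j in range(n)) >= 0 else -1
                 for i in range(n)]
    return state

stored = [[1, 1, -1, -1, 1, 1, -1, -1],   # illustrative encoded gesture codes
          [1, -1, 1, -1, 1, -1, 1, -1]]
w = train_hopfield(stored)
noisy = [1, 1, -1, -1, 1, -1, -1, -1]     # first pattern with one bit flipped
print(recall(w, noisy))  # restores [1, 1, -1, -1, 1, 1, -1, -1]
```

This is the fault-tolerance property: a corrupted gesture code is pulled back to the nearest stored code.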
4) Gesture Comparison:
• After gesture code restoration, each gesture code is compared with the standard gesture codes.
• The comparison is made by calculating the difference between the two codes.
• The smallest difference indicates the most likely gesture.
GESTURE RECOGNITION BASED
ON VELOCITY INCREMENT
• The acceleration of a gesture on one axis is first partitioned according to its signs.
• Due to the intensity variance of each gesture, the area sequence should be normalized.
• After normalization, the area sequences are not compared immediately; they are processed by an algorithm analogous to "center of mass".
• The final step is to compare the velocity increment sequences.
• The gesture with the minimum difference can be recognized.
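The partition-and-integrate step can be sketched as follows; the sampling interval and the normalization by total absolute area are assumptions, and the "center of mass" post-processing is omitted:

```python
def velocity_increments(accel, dt=0.01):
    """Partition an acceleration trace by sign and integrate each
    same-sign segment into a velocity increment (area under the curve)."""
    areas = []
    current, sign = 0.0, 0
    for a in accel:
        s = 1 if a > 0 else (-1 if a < 0 else 0)
        if s != 0 and s != sign and sign != 0:
            areas.append(current)  # sign flipped: close the segment
            current = 0.0
        if s != 0:
            sign = s
        current += a * dt
    if current:
        areas.append(current)
    return areas

def normalize(areas):
    """Cancel intensity differences between users by dividing by the
    total absolute area (this normalization choice is an assumption)."""
    total = sum(abs(a) for a in areas)
    return [a / total for a in areas] if total else areas

print(normalize(velocity_increments([-1.0, -1.0, 2.0, 2.0, -1.0, -1.0])))
```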
GESTURE RECOGNITION BASED ON SIGN
SEQUENCE AND TEMPLATE MATCHING
• The recognition algorithm of this model is very similar to that of Model I, except that no Hopfield network is used.
• All the sign sequences are represented by -1, 1 and 0.
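The template-matching comparison can be sketched as a nearest-template search over sign sequences. The templates below are illustrative placeholders, not the actual seven gesture templates; each code is padded to eight entries (four signs per axis, x then z):

```python
def sequence_distance(a, b):
    """Element-wise difference between two equal-length sign codes."""
    return sum(abs(x - y) for x, y in zip(a, b))

def match_gesture(code, templates):
    """Return the template name with the smallest difference from `code`."""
    return min(templates, key=lambda name: sequence_distance(code, templates[name]))

templates = {
    "up":    [0, 0, 0, 0, -1, 1, -1, 0],   # no x motion; z: -, +, -
    "right": [-1, 1, -1, 0, 0, 0, 0, 0],   # hypothetical x-axis pattern
}
print(match_gesture([0, 0, 0, 0, -1, 1, -1, 0], templates))  # "up"
```

The smallest-difference rule here is the same comparison used in Model I, applied directly to the raw sign codes instead of Hopfield-restored ones.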
• Model III has the highest accuracy among the
three models, while the performance of Model
II is the worst of the three.
• The test results shown in Table III are based on
72 test samples.
• To enhance the performance, the segmentation algorithm can be improved.
• Moreover, other features of the motion data may be utilized for pattern classification in the future.