Feature Processing and Modelling for the 6D Motion Gesture Database


This presentation covers gesture recognition and the details of motion tracking.

  2. ABSTRACT • A 6D motion gesture is represented by a 3D spatial trajectory augmented by another three dimensions of orientation. Depending on the tracking technology, the motion can be tracked explicitly, with the position and orientation, or implicitly, with the acceleration and angular speed. This work examines the relative effectiveness of various feature dimensions for motion gesture recognition in both user-dependent and user-independent cases. It compares a statistical feature-based classifier with an HMM-based recognizer; the feature-based classifier offers more flexibility in feature selection and achieves better recognition accuracy than other systems. The motion gesture database contains both explicit and implicit motion information, which allows the recognition performance of different tracking signals to be compared on common ground. This gives insight into the attainable recognition rate with different tracking devices, which is valuable for a system designer choosing the proper tracking technology.
  3. INTRODUCTION • "Dimension" is a property of space (1D, 2D, 3D, ...). • A straight line has 1 dimension (1 coordinate). • A parallelogram has 2 dimensions (2 coordinates). • A parallelepiped has 3 dimensions (3 coordinates). • In the same way, a point in 6D space has 6 coordinates. • The implementation of a gesture-control interface contains two key components:  GESTURE RECOGNITION &  MOTION TRACKING • Recognition here operates on 6D motion gestures. • The 6D motion gesture is represented by:  a 3D SPATIAL TRAJECTORY &  ANOTHER 3 DIMENSIONS OF ORIENTATION
  4. What is a gesture? • A GESTURE is a meaningful body movement expressed by a subject. • Purposes of human gestures: conversational, controlling, commanding, manipulative, and communicative.  Two types:  Natural gestures  Sign language gestures  Natural gestures • Free form. • Can occur in any order, dimension, shape, etc. • The same gesture performed by different individuals can vary dynamically.  Sign language gestures • Have a defined grammar. • Dynamic motions, i.e., motion gestures, are commonly used for 'command & control' tasks.
  5. Why gestures? • Quite natural. • Highly efficient. • High accuracy. • Gestures can be culture-specific.
  6. Gesture recognition • Two extreme cases: o User-dependent gesture recognition. o User-independent gesture recognition. • Major approaches for analyzing or recognizing a gesture include: • Dynamic Time Warping (DTW) • Neural Networks (NNs) • Hidden Markov Models (HMMs) • Data-driven template matching • Statistical feature-based classifiers • A statistical feature-based classifier is also called a Rubine classifier.
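The statistical feature-based (Rubine-style) approach summarizes a whole trajectory with a fixed-length feature vector before classification. As a hedged sketch, the snippet below computes a small illustrative subset of such features from a 2D trajectory; a full recognizer would use many more features and a trained linear classifier, and the function names here are not from the original work.

```python
# Illustrative subset of Rubine-style statistical features computed
# from a 2-D trajectory: total path length, net displacement, and
# bounding-box diagonal. A full recognizer would use many more
# features and a trained linear classifier.
import math

def path_length(points):
    """Sum of distances between consecutive trajectory points."""
    return sum(math.dist(points[i], points[i + 1]) for i in range(len(points) - 1))

def net_displacement(points):
    """Straight-line distance from the first point to the last."""
    return math.dist(points[0], points[-1])

def bbox_diagonal(points):
    """Diagonal length of the trajectory's axis-aligned bounding box."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return math.hypot(max(xs) - min(xs), max(ys) - min(ys))

def features(points):
    """Fixed-length feature vector summarizing the whole trajectory."""
    return (path_length(points), net_displacement(points), bbox_diagonal(points))
```

Because the feature vector has fixed length regardless of gesture duration, any standard classifier can consume it, which is where the flexibility in feature selection comes from.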
  7. DTW (Dynamic Time Warping) • Dynamic time warping (DTW) is an algorithm for measuring similarity between two sequences which may vary in time or speed, and it can be useful for personalized gesture recognition. • DTW recognizers are simple to implement, computationally inexpensive, and require only a few training samples to function properly, although a significant number of templates is needed to cover the full range of variations. • DTW is therefore a good choice when a large set of training samples is hard to collect.
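The DTW idea above can be sketched in a few lines. This is a minimal illustration on 1D sequences; a real 6D recognizer would compare multidimensional sample vectors with a vector distance, but the recurrence is identical. All names below are illustrative, not from the original system.

```python
# Minimal sketch of dynamic time warping (DTW): align two sequences
# that may vary in time or speed and return the alignment cost.
def dtw_distance(a, b):
    """DTW alignment cost between sequences a and b."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = best cost of aligning a[:i] with b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])              # local distance
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

# A template matcher classifies a gesture by its nearest stored template.
def classify(sample, templates):
    """Return the name of the template with the smallest DTW distance."""
    return min(templates, key=lambda name: dtw_distance(sample, templates[name]))
```

Note how `dtw_distance([1, 2, 3], [1, 2, 2, 3])` is zero: the warping absorbs the repeated sample, which is exactly why DTW tolerates gestures performed at different speeds.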
  8. A child being sensed by a simple gesture recognition algorithm detecting hand location and movement
  9. HMM-BASED CLASSIFIER • The most commonly used classifier for 6D gesture recognition. • The HMM is efficient at modelling a time series with temporal variations and has been successfully applied to gesture recognition. • The features (observations) for the HMMs vary, including the position, moving direction, acceleration, etc. • A normalization procedure is applied separately to the explicit and implicit motion data.
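As a hedged sketch of how an HMM-based recognizer scores a gesture: one HMM is trained per gesture class (for example on quantized direction features), a candidate observation sequence is scored against each model with the forward algorithm, and the highest-likelihood model wins. The toy parameters below are made up for illustration and are not the trained models from the original work.

```python
# Scoring a discrete observation sequence with the HMM forward algorithm.
def forward_likelihood(obs, pi, A, B):
    """P(obs | HMM), where
    pi[i]   : initial probability of state i
    A[i][j] : transition probability from state i to state j
    B[i][o] : probability of emitting symbol o in state i
    """
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    for o in obs[1:]:
        alpha = [B[j][o] * sum(alpha[i] * A[i][j] for i in range(n))
                 for j in range(n)]
    return sum(alpha)

# Two toy 2-state models over a binary observation alphabet {0, 1}:
# model_a tends to emit 0s first and then 1s; model_b the reverse.
model_a = ([0.9, 0.1], [[0.8, 0.2], [0.1, 0.9]], [[0.9, 0.1], [0.2, 0.8]])
model_b = ([0.1, 0.9], [[0.8, 0.2], [0.1, 0.9]], [[0.1, 0.9], [0.8, 0.2]])

seq = [0, 0, 1, 1]                 # fits model_a's 0 -> 1 pattern
score_a = forward_likelihood(seq, *model_a)
score_b = forward_likelihood(seq, *model_b)
```

A classifier built this way simply returns the gesture whose model yields the larger score, and in practice log-probabilities are used to avoid underflow on long sequences.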
  10. 6DMG: 6D MOTION GESTURE DATABASE • A total of 20 gestures are defined, including swiping motions in eight directions; poke gestures that swipe rapidly forth and back in four directions; v-shape and x-shape gestures; clockwise and counterclockwise circles in both vertical and horizontal planes; and wrist twisting (roll). • There are no mirrored gestures: the direction and rotation are the same for both right- and left-handed users. • A Wii Remote Plus (Wiimote) provides the inertial measurements of acceleration and angular speed, and the Wiimote is also used as a position tracker. • The tracking setup provides both explicit and implicit 6D spatio-temporal information sampled at 60 Hz, including the position, orientation, acceleration, and angular speed.
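One 60 Hz sample in a recording of this kind carries both the explicit pose and the implicit inertial readings. The structure below is an illustrative sketch of such a record, not the exact on-disk format of the 6DMG database; the field names are assumptions.

```python
# Sketch of one tracked sample in a 6DMG-style recording: explicit
# pose (position + orientation) alongside implicit inertial readings.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class MotionSample:
    t: float                                         # timestamp in seconds
    position: Tuple[float, float, float]             # explicit: x, y, z
    orientation: Tuple[float, float, float, float]   # explicit: quaternion w, x, y, z
    acceleration: Tuple[float, float, float]         # implicit: accelerometer
    angular_speed: Tuple[float, float, float]        # implicit: gyroscope

SAMPLE_RATE_HZ = 60.0

def timestamps(n):
    """Timestamps for n consecutive samples at 60 Hz."""
    return [i / SAMPLE_RATE_HZ for i in range(n)]
```

Keeping both signal families in one record is what lets the recognition performance of explicit and implicit tracking be compared on the same gesture instances.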
  12. Selected gestures from the 6DMG database
  13. What is a Wiimote? • It is the remote controller for the Nintendo Wii. • The body of the Wii Remote measures 148 mm (5.8 in) long, 36.2 mm (1.43 in) wide, and 30.8 mm (1.21 in) thick. • Its key feature is its motion sensing capability. • The Wiimote allows the user to interact with and manipulate items on screen via gesture recognition and pointing, through the use of accelerometer and optical sensor technology.
  14. Wiimote
  15. Block diagram for gesture recognition • These are the techniques used for recognizing a gesture: 6DMG → gesture recognition (DTW / HMM) → recognized gesture
  16. Motion tracking • Motion tracking is used to capture the motion before performing gesture recognition.  Tracking an object in space requires six dimensions:  three for translation and  three for rotation. • Two methods are used for tracking the motion:  the explicit method and  the implicit method. • The motion can be tracked explicitly with the position and orientation, or implicitly with the acceleration and angular speed.
  17. Technologies for motion tracking • There are several technologies for motion tracking:  vision-based techniques and  tracker-based techniques. • Tracker-based techniques (used for sensing) can be divided into  optical sensing and  inertial sensing.
  18. VISION-BASED TECHNIQUE • Provides more natural and unencumbered (free-form) interaction.  MONOCULAR IMAGES OR VIDEOS:  can extract the projected 2D trajectory and the orientation.  DEPTH CAMERA:  estimates the depth at a rougher scale.  STEREO OR MULTI-VIEW CAMERAS:  track a full 3D motion.  XBOX 360 KINECT:  used for tracking a human body in 3D.
  19. Xbox 360 Kinect
  20. TRACKER-BASED TECHNIQUE • Tracker-based techniques achieve more precise motion tracking at the expense of requiring the user to wear certain equipment. • A motion tracking system most often derives estimates of motion information from magnetic, acoustic, inertial, or optical sensors. • Of these, optical sensors and inertial sensors are the most commonly used.
  21. Optical sensors • An optical sensor is a device that converts light rays into electronic signals, similar to a photoresistor. • It tracks the global orientation and position: optical sensors track either active or reflective markers and provide accurate motion tracking results at a relatively high speed. • At least two tracker–sensor pairs are needed for valid triangulation to determine the position of one tracker. • Advantages: 1. High sensitivity. 2. Small size and long lifetime. 3. High intensity.
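The two-pair requirement can be made concrete with a small sketch: a single optical sensor only gives a bearing (direction) to the marker, and the position falls out of intersecting two such rays. The 2D version below is illustrative; real systems triangulate in 3D and use more sensors for robustness, and the sensor positions and angles here are made-up values.

```python
# Hedged sketch of 2-D triangulation: intersect the bearing rays from
# two sensors at known positions to recover the marker position.
import math

def triangulate_2d(p1, theta1, p2, theta2):
    """Intersect the ray from p1 at angle theta1 with the ray from p2
    at angle theta2. Angles are in radians, measured from the +x axis."""
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # Solve p1 + t1*d1 == p2 + t2*d2 for t1 via the 2-D cross product.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel: no unique intersection")
    t1 = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])
```

With one sensor the marker could lie anywhere along its ray, which is exactly why a second tracker–sensor pair is the minimum for a valid position fix.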
  22. Optical sensor
  23. Inertial sensors • Inertial sensors commonly refer to MEMS (micro-electromechanical systems) devices, including accelerometers and gyroscopes in chip form. • The accelerometer measures the accelerations in the device-wise coordinates. • The gyroscope measures the angular speed, from which the orientation can be obtained by integration. • Both have the ability to sense vibration, rotation, tilt, etc.
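The step from angular speed to orientation is just integration over time. Below is a minimal dead-reckoning sketch for a single yaw angle under simple Euler integration; real systems integrate full 3D rotations and must correct for gyroscope drift, and the sample values are illustrative.

```python
# Minimal dead-reckoning sketch: a gyroscope reports angular speed,
# and the orientation (here a single yaw angle, in radians) is
# recovered by integrating those readings over time.
def integrate_yaw(angular_speeds, dt, yaw0=0.0):
    """Accumulate yaw from per-sample angular speed (rad/s) at step dt."""
    yaw = yaw0
    trace = []
    for w in angular_speeds:
        yaw += w * dt           # simple Euler integration step
        trace.append(yaw)
    return trace

# A constant 1 rad/s reading for one second at 60 Hz ends near 1 rad.
trace = integrate_yaw([1.0] * 60, dt=1.0 / 60.0)
```

This is also why inertial tracking is called implicit: the pose is never measured directly, only reconstructed from its derivatives.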
  24. Inertial sensors can sense…
  25. BLOCK DIAGRAM FOR MOTION TRACKING • Control motion → motion tracking (optical tracking: position, orientation; inertial tracking: acceleration, angular speed) → motion data
  27. CONCLUSION • Gestures can be a natural and intuitive way to interact, and we are especially interested in motion gestures rendered by the hand or a handheld device in free space, without regard to posture, finger, or body movements. • With different tracking technologies, the available motion information varies: it can be the position, orientation, acceleration, or angular speed. Although motion gestures are usually defined by the spatial trajectory, the other kinematic properties still contain information that distinguishes the gestures. • Two techniques have been used: motion tracking and gesture recognition.
  28. REFERENCES • G. Welch and E. Foxlin, "Motion tracking: no silver bullet, but a respectable arsenal," IEEE Computer Graphics and Applications. • R. Teather, A. Pavlovych, W. Stuerzlinger, and I. MacKenzie, "Effects of tracking technology, latency, and spatial jitter on object movement," Proceedings of the IEEE Symposium on 3D User Interfaces. • M. Chen, G. AlRegib, and B.-H. Juang, "6DMG: A new 6D motion gesture database," Proceedings of the Third Annual ACM Conference on Multimedia Systems. • J. Ruiz, Y. Li, and E. Lank, "User-defined motion gestures for mobile interaction," Proceedings of the 29th International Conference on Human Factors in Computing Systems.
  29. THANK YOU