Feature processing and modelling for 6D motion gesture database
Gives an overview of gesture recognition and the details of motion tracking.

Presentation Transcript

  • FEATURE PROCESSING & MODELLING FOR 6D MOTION GESTURE RECOGNITION • Guided by: Mrs. Seena M. K., Asst. Prof., ECE • Presented by: Jeeva John, S7 ECE (035)
  • ABSTRACT • A 6D motion gesture is represented by a 3D spatial trajectory and augmented by another three dimensions of orientation. Using different tracking technologies, the motion can be tracked explicitly with the position and orientation, or implicitly with the acceleration and angular speed. This work deals with the relative effectiveness of various feature dimensions for motion gesture recognition in both the user-dependent and user-independent cases. Two recognizers are considered: a statistical feature-based classifier and an HMM-based recognizer, which offers more flexibility in feature selection and achieves better recognition accuracy than other systems. The motion gesture database contains both explicit and implicit motion information, which allows the recognition performance of different tracking signals to be compared on a common ground. This gives an insight into the attainable recognition rate with different tracking devices, which is valuable for the system designer when choosing the proper tracking technology.
  • INTRODUCTION • “Dimension” is a property of space (1D, 2D, 3D, ...). • A straight line has 1 dimension (1 coordinate). • A parallelogram has 2 dimensions (2 coordinates). • A parallelepiped has 3 dimensions (3 coordinates). • In the same way, a 6D motion is one described by 6 coordinates. • The implementation of a gesture-control interface contains two key components:  GESTURE RECOGNITION &  MOTION TRACKING • 6D motion gestures are used for gesture recognition. • A 6D motion gesture is represented by  a 3D SPATIAL TRAJECTORY &  the OTHER 3 DIMENSIONS OF ORIENTATION
  • What is a gesture? • A GESTURE is a meaningful body movement expressed by a subject. • Purposes of human gestures: conversational, controlling, commanding, manipulative, and communicative.  Two types:  Natural gestures  Sign-language gestures  Natural gestures • Free form. • Can occur in any order, dimension, shape, etc. • The gesture performed by different individuals can vary considerably. • Sign-language gestures • Have a defined grammar. • Dynamic motions, i.e., motion gestures, are commonly used for ‘command & control’ tasks.
  • Why gestures? • Quite natural. • Highly efficient. • High accuracy. • Gestures can be culture-specific.
  • Gesture recognition • Two extreme cases are: o User-dependent gesture recognition. o User-independent gesture recognition. • Major approaches for analyzing or recognizing a gesture include: • Dynamic Time Warping (DTW) • Neural Networks (NNs) • Hidden Markov Models (HMMs) • Data-driven template matching. • Statistical feature-based classifiers. • Statistical feature-based classifiers are also called Rubine classifiers; a feature-extraction sketch follows below.
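
A statistical feature-based (Rubine-style) classifier summarizes a whole trajectory with a fixed-length feature vector before classification. The minimal Python sketch below is illustrative only; the particular features chosen here (path length, start-to-end distance, bounding box, step statistics) are assumptions and not the exact feature set used in this work.

```python
import numpy as np

def statistical_features(traj):
    """Summarize a 3D trajectory of shape (T, 3) with a fixed-length feature vector.

    The chosen features are illustrative, Rubine-style examples.
    """
    deltas = np.diff(traj, axis=0)          # per-sample displacement
    step = np.linalg.norm(deltas, axis=1)   # step lengths
    bbox = traj.max(axis=0) - traj.min(axis=0)
    return np.array([
        step.sum(),                          # total path length
        np.linalg.norm(traj[-1] - traj[0]),  # start-to-end distance
        *bbox,                               # bounding-box extents (x, y, z)
        step.mean(), step.std(),             # per-sample speed statistics
    ])
```

The resulting fixed-length vectors can then be fed to any standard classifier in both the user-dependent and user-independent settings.
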
  • DTW (Dynamic Time Warping) • Dynamic time warping (DTW) is an algorithm for measuring the similarity between two sequences which may vary in time or speed. DTW can be useful for personalized gesture recognition. These recognizers are simple to implement, computationally inexpensive, and require only a few training samples to function properly. A significant number of templates is needed to cover the range of variations. When a large set of training samples is hard to collect, the DTW technique is used.
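
A minimal DTW sketch, assuming each gesture is stored as a (T, D) feature sequence and recognition is done by nearest-template matching; the function names and the Euclidean local cost are illustrative choices, not taken from the presentation.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) DTW between two feature sequences of shape (T, D)."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])   # local Euclidean cost
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def nearest_template(templates, query):
    """templates: dict {gesture_label: list of recorded sequences}; returns best label."""
    return min((dtw_distance(t, query), label)
               for label, seqs in templates.items() for t in seqs)[1]
```
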
  • A child being sensed by a simple gesture recognition algorithm detecting hand location and movement
  • HMM-BASED CLASSIFIER • The most commonly used classifier for 6D gesture recognition. • The HMM is efficient at modelling a time series with temporal variations, and has been successfully applied to gesture recognition. • The features (observations) for the HMMs vary, including the position, moving direction, acceleration, etc. • A normalization procedure is used specifically for the explicit and the implicit motion data.
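
A sketch of an HMM-based recognizer of the kind described here, using the open-source hmmlearn package (the package choice, number of states, and other parameters are assumptions; the presentation does not name an implementation). One Gaussian HMM is trained per gesture class, and an unknown sequence is assigned to the class with the highest log-likelihood.

```python
import numpy as np
from hmmlearn import hmm

def train_models(train_data, n_states=5):
    """train_data: dict {gesture_label: list of (T_i, D) feature arrays}."""
    models = {}
    for label, seqs in train_data.items():
        X = np.concatenate(seqs)          # stack all observations for this class
        lengths = [len(s) for s in seqs]  # per-sequence lengths for hmmlearn
        m = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        m.fit(X, lengths)
        models[label] = m
    return models

def classify(models, seq):
    # score() returns the log-likelihood of the observation sequence under each model
    return max(models, key=lambda label: models[label].score(seq))
```
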
  • 6DMG: 6D MOTION GESTURE DATABASE • Here a total of 20 gestures is defined, including swiping motions in eight directions; poke gestures that swipe rapidly forth and back in four directions; v-shape and x-shape gestures; clockwise and counterclockwise circles in both the vertical and horizontal planes; and wrist twisting (roll). • A Wii Remote Plus (Wiimote) is used for the inertial measurement of the acceleration and angular speed. • There are no mirror gestures, which means the direction and rotation are the same for both right- and left-handed users. • The Wiimote used here also works as a position tracker. • The tracking device provides both explicit and implicit 6D spatio-temporal information sampled at 60 Hz, including the position, orientation, acceleration, and angular speed.
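
As an illustration of what one recording in the database contains, the sketch below bundles the explicit channels (position, orientation) and the implicit channels (acceleration, angular speed) sampled at 60 Hz. The class and field names are assumptions for illustration, not the database's actual storage format.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class GestureSample:
    """One 6DMG-style recording sampled at 60 Hz (field names are illustrative)."""
    label: str                 # one of the 20 gesture classes
    position: np.ndarray       # (T, 3) explicit 3D trajectory
    orientation: np.ndarray    # (T, 4) explicit orientation (quaternions)
    acceleration: np.ndarray   # (T, 3) implicit, device-frame accelerometer
    angular_speed: np.ndarray  # (T, 3) implicit, gyroscope
    rate_hz: int = 60          # sampling rate
```
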
  • THE GESTURE LIST OF 6DMG
  • Selected gestures from the 6DMG database
  • What is a Wiimote? • It is a REMOTE. • The body of the Wii Remote measures 148 mm (5.8 in) long, 36.2 mm (1.43 in) wide, and 30.8 mm (1.21 in) thick. • The key feature of the Wii Remote is its motion-sensing capability. • The Wiimote allows the user to interact with and manipulate items on screen via gesture recognition and pointing, through the use of accelerometer and optical-sensor technology.
  • wiimote
  • Block diagram for gesture recognition • These are the techniques used for recognizing a gesture: the 6DMG data is fed into the gesture-recognition stage (DTW or HMM), which outputs the recognized gesture.
  • Motion tracking • Motion tracking is used to capture the motion before performing gesture recognition.  Tracking an object in space actually requires six dimensions:  three for translation  three for rotation • Two methods are used for tracking the motion:  the explicit method and  the implicit method. • The motion can be tracked explicitly with the position and orientation, • or implicitly with the acceleration and angular speed.
  • Technologies for motion tracking • There are several technologies for motion tracking: the vision-based technique and the tracker-based technique. The tracker-based technique (used for sensing) can be divided into optical sensing and inertial sensing.
  • VISION-BASED TECHNIQUE • This provides more natural and unencumbered (free-form) interaction. • MONOCULAR IMAGES OR VIDEOS:  can extract the projected 2D trajectory and the orientation. • DEPTH CAMERA:  estimates the depth on a coarser scale. • STEREO OR MULTI-VIEW CAMERAS:  track a full 3D motion. • XBOX 360 KINECT:  used for tracking a human body in 3D.
  • Xbox 360 Kinect:
  • TRACKER-BASED TECHNIQUE • Tracker-based techniques achieve more precise motion tracking at the expense of requiring the user to wear certain equipment. • A motion tracking system most often derives its estimate of the motion information from magnetic, acoustic, inertial, or optical sensors. • Of these, optical sensors and inertial sensors are the most commonly used.
  • Optical sensors • An optical sensor is a device that converts light rays into electronic signals; it is similar to a photoresistor. • It tracks the global orientation and position: optical sensors track either active or reflective markers and provide accurate motion-tracking results at a relatively high speed. • At least two tracker-sensor pairs are needed for a valid triangulation to determine the position of one tracker, as sketched below. • Advantages: 1. High sensitivity. 2. Small size and longer lifetime. 3. High intensity.
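
A minimal sketch of the triangulation step: each sensor contributes a ray from its known position toward the marker, and the marker position is recovered as the least-squares intersection of at least two rays. The function below is an illustrative assumption, not the tracker's actual algorithm.

```python
import numpy as np

def triangulate(origins, directions):
    """Least-squares intersection of N >= 2 rays p_i + t * d_i in 3D."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(origins, directions):
        d = np.asarray(d, dtype=float)
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projects onto the plane normal to d
        A += P
        b += P @ np.asarray(p, dtype=float)
    return np.linalg.solve(A, b)         # point closest to all rays

# Example: two sensors at known positions observing the same marker
origins = [np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])]
directions = [np.array([1.0, 1.0, 1.0]), np.array([0.0, 1.0, 1.0])]
marker = triangulate(origins, directions)   # -> approximately [1, 1, 1]
```
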
  • Optical sensor….
  • Inertial sensors • Inertial sensors commonly refer to MEMS (micro-electro-mechanical systems) devices, • which include accelerometers and gyroscopes in chip form. • The accelerometer measures the accelerations in the device-wise coordinates. • The gyroscope measures the angular speed and the orientation. • Both have the ability to sense vibration, rotation, tilt, etc.
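
For illustration, the naive dead-reckoning sketch below integrates the gyroscope readings into an orientation and double-integrates the gravity-compensated accelerometer readings into a position. Real systems must correct the resulting drift; the function names, the small-angle update, and the gravity sign convention are assumptions.

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix such that skew(w) @ v == np.cross(w, v)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def dead_reckon(acc, gyro, dt, g=np.array([0.0, 0.0, -9.81])):
    """Integrate device-frame acceleration (T, 3) and angular speed (T, 3) into positions."""
    R = np.eye(3)        # device-to-world rotation
    v = np.zeros(3)      # velocity in the world frame
    p = np.zeros(3)      # position in the world frame
    positions = []
    for a, w in zip(acc, gyro):
        R = R @ (np.eye(3) + skew(w) * dt)   # small-angle orientation update
        a_world = R @ a + g                  # remove gravity (sign convention assumed)
        v = v + a_world * dt
        p = p + v * dt
        positions.append(p)
    return np.array(positions)
```
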
  • Inertial sensors can sense….
  • BLOCK DIAGRAM FOR MOTION TRACKING • The control motion is captured by optical tracking (position, orientation) and inertial tracking (acceleration, angular speed); together these signals form the motion data.
  • FUTURE APPLICATIONS • 6D theatres & 6D movies
  • CONCLUSION • Gestures can be a natural and intuitive way to interact, and we are especially interested in motion gestures rendered by the hand or a handheld device in free space, without regard to posture, finger, or body movements. • With different tracking technologies the available motion information varies; it can be the position, orientation, acceleration, or angular speed. Although motion gestures are usually defined by the spatial trajectory, the other kinematic properties still contain information to distinguish the gestures. • Two techniques have been used: motion tracking and gesture recognition.
  • REFERENCES • G. Welch and E. Foxlin, “Motion tracking: no silver bullet, but a respectable arsenal,” IEEE Computer Graphics and Applications. • R. Teather, A. Pavlovych, W. Stuerzlinger, and I. MacKenzie, “Effects of tracking technology, latency, and spatial jitter on object movement,” Proceedings of the IEEE Symposium on 3D User Interfaces. • M. Chen, G. AlRegib, and B.-H. Juang, “6DMG: A new 6D motion gesture database,” Proceedings of the Third Annual ACM Conference on Multimedia Systems. • J. Ruiz, Y. Li, and E. Lank, “User-defined motion gestures for mobile interaction,” Proceedings of the 29th International Conference on Human Factors in Computing Systems.
  • THANK YOU…