Gesture recognition

Gesture recognition is a rapidly growing technology. This presentation describes how gesture recognition works, the sub-fields within it, its applications, and the challenges it faces.

Published in: Technology, Art & Photos

Gesture recognition

  1. Gestures are an important aspect of human interaction, both interpersonally and in the context of man-machine interfaces. A gesture is a form of non-verbal communication in which visible bodily actions communicate particular messages, either in place of speech or together and in parallel with words. Gestures include movement of the hands, face, or other parts of the body.
  2. Military air marshals use hand and body gestures to direct flight operations aboard aircraft carriers.
  3. Gesticulation: spontaneous movements of the hands and arms that accompany speech. Language-like gestures: gesticulation that is integrated into a spoken utterance, replacing a particular spoken word or phrase. Pantomimes: gestures that depict objects or actions, with or without accompanying speech. Emblems: familiar gestures such as V for victory, thumbs up, and assorted rude gestures. Sign languages: linguistic systems, such as American Sign Language, which are well defined.
  4. What is gesture recognition? Interfacing with computers using gestures of the human body, typically hand movements. Gesture recognition is an important skill for robots that work closely with humans, and it is especially valuable in applications involving human/robot interaction for several reasons.
  5. A child being sensed by a simple gesture recognition algorithm detecting hand location and movement.
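A simple algorithm of this kind can be sketched in a few lines: compare two consecutive frames, find the pixels that changed, and report the centroid of the changed region as the hand location. This is a minimal pure-Python sketch using toy grayscale grids as frames; a real system would operate on camera images.

```python
# Minimal sketch: locate a moving hand by frame differencing.
# Frames are toy 2D grayscale grids (lists of lists of ints 0-255).

def hand_location(prev_frame, curr_frame, threshold=30):
    """Return the (row, col) centroid of pixels that changed
    by more than `threshold` between two frames, or None."""
    changed = [
        (r, c)
        for r, row in enumerate(curr_frame)
        for c, value in enumerate(row)
        if abs(value - prev_frame[r][c]) > threshold
    ]
    if not changed:
        return None  # no motion detected
    row_mean = sum(r for r, _ in changed) / len(changed)
    col_mean = sum(c for _, c in changed) / len(changed)
    return row_mean, col_mean

# A 4x4 scene where a bright blob appears in the lower-right corner.
prev = [[0] * 4 for _ in range(4)]
curr = [[0] * 4 for _ in range(4)]
curr[2][2] = curr[2][3] = curr[3][2] = curr[3][3] = 200

print(hand_location(prev, curr))  # (2.5, 2.5)
```

Tracking the centroid across successive frame pairs gives the hand's movement over time, which is the raw signal later stages interpret as a gesture.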
  6. The basic working of a gesture recognition system.
  7. Hand gesture recognition is one obvious way to create a useful, highly adaptive interface between machines and their users. Hand gesture recognition technology would allow for the operation of complex machines using only a series of finger and hand movements, eliminating the need for physical contact between operator and machine.
  8. Facial gesture recognition is another way of creating an effective non-contact interface between users and their machines. The goal of facial gesture recognition is for machines to effectively understand emotions and other communication cues within humans, regardless of the countless physical differences between individuals.
  9. Sign language recognition is one of the most promising sub-fields in gesture recognition research. Effective sign language recognition would grant the deaf and hard-of-hearing expanded tools for communicating with both other people and machines.
  10. Gesture sensing technologies fall into two groups. Contact type: device gesture technologies and touch-based gestures. Non-contact: vision-based technologies and electrical field sensing.
  11. Device-based techniques use a glove, stylus, or other position tracker whose movements send signals that the system uses to identify the gesture. The glove is equipped with a variety of sensors to provide information about hand position, orientation, and flex of the fingers.
  12. There are two approaches to vision-based gesture recognition. Model-based techniques try to create a three-dimensional model of the user's hand and use this for recognition. Image-based methods detect a gesture by capturing pictures of a user's motions during the course of a gesture.
  13. The proximity of a human body or body part can be measured by sensing electric fields. These measurements can be used to determine the distance of a human hand or other body part from an object; this facilitates a vast range of applications for a wide range of industries.
  14. Wired gloves: These can provide input to the computer about the position and rotation of the hands using magnetic or inertial tracking devices. The first commercially available hand-tracking glove-type device was the DataGlove, which could detect hand position, movement, and finger bending. It uses fiber optic cables running down the back of the hand. Light pulses are created, and when the fingers are bent, light leaks through small cracks and the loss is registered, giving an approximation of the hand pose.
  15. Stereo cameras: A stereo camera has two lenses about the same distance apart as your eyes and takes two pictures at the same time. This simulates the way we actually see and therefore creates the 3D effect when viewed. Using two cameras whose relations to one another are known, a 3D representation can be approximated from the output of the cameras.
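For a calibrated stereo pair, the depth of a point follows from the disparity (the pixel shift of that point between the left and right images) via the standard pinhole relation Z = f·B/d. The sketch below illustrates this; the focal length, baseline, and disparity values are invented for illustration.

```python
# Sketch of stereo depth recovery: for a calibrated rig, a point's depth Z
# follows from its disparity d (pixel shift between the two images), the
# focal length f (in pixels), and the baseline B (distance between lenses):
#     Z = f * B / d

def depth_from_disparity(f_pixels, baseline_m, disparity_pixels):
    if disparity_pixels <= 0:
        raise ValueError("disparity must be positive (point visible in both views)")
    return f_pixels * baseline_m / disparity_pixels

# Illustrative numbers: 700 px focal length, 6 cm baseline (roughly eye
# spacing), and a feature that shifts 70 px between the two images.
print(round(depth_from_disparity(700, 0.06, 70), 3))  # 0.6 -> about 0.6 m away
```

Note the inverse relationship: nearby objects (such as a gesturing hand) produce large disparities and are therefore measured more precisely than the distant background.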
  16. Depth-aware cameras: Using specialized cameras such as structured-light or time-of-flight cameras, one can generate a depth map of what is being seen through the camera at short range, and use this data to approximate a 3D representation of what is being seen. These can be effective for the detection of hand gestures due to their short-range capabilities.
  17. Thermal cameras: An infrared camera is a device that detects infrared radiation (temperature) from the target object and converts it into an electronic signal to generate a thermal picture on a monitor or to make temperature calculations. The temperature captured by an infrared camera can be measured or quantified exactly, so that not only can the thermal behavior be observed, but the relative magnitude of temperature-related problems can also be recognized and noted.
  18. Controller-based gestures: These controllers act as an extension of the body, so that when gestures are performed, some of their motion can be conveniently captured by software. Mouse gestures are one such example, where the motion of the mouse is correlated to a symbol being drawn by a person's hand. Another is the Wii Remote, which can study changes in acceleration over time to represent gestures.
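A mouse-gesture recognizer of the kind described can be sketched by quantizing the pointer's motion into the four compass directions and matching the resulting string against a table of known gestures. This is a toy sketch; the gesture table and pointer coordinates below are invented for illustration.

```python
# Toy mouse-gesture recognizer: quantize pointer motion into
# L/R/U/D moves and look the resulting string up in a gesture table.

def encode(points):
    """Turn a list of (x, y) pointer positions into a direction string,
    collapsing consecutive repeats (e.g. 'RRRDD' -> 'RD')."""
    moves = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if abs(dx) >= abs(dy):
            move = "R" if dx > 0 else "L"
        else:
            move = "D" if dy > 0 else "U"  # screen y grows downward
        if not moves or moves[-1] != move:
            moves.append(move)
    return "".join(moves)

# Hypothetical gesture table mapping direction strings to commands.
GESTURES = {"L": "back", "R": "forward", "DR": "close-window"}

def recognize(points):
    return GESTURES.get(encode(points), "unknown")

# Pointer drawn downward, then to the right: an 'L'-shaped stroke.
stroke = [(0, 0), (0, 10), (0, 20), (10, 20), (20, 20)]
print(recognize(stroke))  # close-window
```

Collapsing repeated moves makes the encoding invariant to stroke speed and length, so slow and fast versions of the same shape map to the same string.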
  19. Single camera: A normal camera can be used for gesture recognition where the resources/environment would not be convenient for other forms of image-based recognition. It was earlier thought that a single camera might not be as effective as stereo or depth-aware cameras, but a start-up based in Palo Alto named Flutter is challenging this theory. It has released an app that can be downloaded onto any Windows/Mac computer with a built-in webcam.
  20. The main algorithmic approaches: 3D model-based algorithms, skeletal-based algorithms, and appearance-based models.
  21. 3D model-based algorithms: A real hand (left) is interpreted as a collection of vertices and lines in the 3D mesh version (right), and the software uses their relative position and interaction in order to infer the gesture. Skeletal-based algorithms: The skeletal version (right) effectively models the hand (left). This has fewer parameters than the volumetric version and is easier to compute, making it suitable for real-time gesture analysis systems. Appearance-based models: These binary silhouette (left) or contour (right) images represent typical input for appearance-based algorithms. They are compared with different hand templates, and if they match, the corresponding gesture is inferred.
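The appearance-based matching step can be sketched as comparing an input binary silhouette against stored templates with a simple overlap score (intersection-over-union) and keeping the best match above a threshold. This is a toy sketch on hand-made 3x3 grids; real systems compare full-resolution silhouettes or contours.

```python
# Toy appearance-based matcher: compare a binary silhouette against
# stored templates by intersection-over-union and keep the best match.

def iou(a, b):
    """Intersection-over-union of two equal-sized binary grids."""
    inter = union = 0
    for row_a, row_b in zip(a, b):
        for pa, pb in zip(row_a, row_b):
            inter += pa & pb
            union += pa | pb
    return inter / union if union else 0.0

def classify(silhouette, templates, threshold=0.5):
    best_name, best_score = None, 0.0
    for name, template in templates.items():
        score = iou(silhouette, template)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else "no match"

# Hypothetical 3x3 templates for an open palm and a single raised finger.
TEMPLATES = {
    "palm":   [[1, 1, 1], [1, 1, 1], [0, 1, 0]],
    "finger": [[0, 1, 0], [0, 1, 0], [0, 1, 0]],
}

observed = [[0, 1, 0], [0, 1, 0], [0, 1, 1]]  # noisy 'finger' silhouette
print(classify(observed, TEMPLATES))  # finger
```

The threshold is what lets the system say "no match" for unfamiliar shapes rather than forcing every input into the nearest known gesture.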
  22. Socially assistive robotics: By using proper sensors worn on the body of a patient and by reading the values from those sensors, robots can assist in patient rehabilitation; stroke rehabilitation is the best example. Sign language recognition: Just as speech recognition can transcribe speech to text, certain types of gesture recognition software can transcribe the symbols represented through sign language into text.
  23. Remote control: Through the use of gesture recognition, remote control of various devices with the wave of a hand is possible. Virtual controllers: For systems where the act of finding or acquiring a physical controller could require too much time, gestures can be used as an alternative control mechanism. Controlling secondary devices in a car, or controlling a television set, are examples of such usage.
  24. Immersive game technology: Gestures can be used to control interactions within video games to try to make the game player's experience more interactive or immersive. Control through facial gestures: Controlling a computer through facial gestures is a useful application of gesture recognition for users who may not physically be able to use a mouse or keyboard. Eye tracking in particular may be of use for controlling cursor motion or focusing on elements of a display.
  25. Challenges faced by gesture recognition:
      1. Latency: Image processing can be significantly slow, creating unacceptable latency for video games and other similar applications.
      2. Lack of a gesture language: Different users make gestures differently, causing difficulty in identifying motions.
      3. Robustness: Many gesture recognition systems do not read motions accurately or optimally, due to factors like insufficient background light and high background noise.
      4. Performance: The image processing involved in gesture recognition is quite resource-intensive, and the applications may prove difficult to run on resource-constrained devices.