Robochair Presentation Transcript

    • VIVEKANANDHA COLLEGE OF ENGINEERING FOR WOMEN (Department of ECE) HEAD GESTURE RECOGNITION FOR HANDS-FREE CONTROL OF AN INTELLIGENT WHEELCHAIR Presented by: Suganya D (III ECE), Suganthi Priya T (III ECE)
    • OBJECTIVE This paper presents a novel hands-free control system for intelligent wheelchairs (IWs) based on visual recognition of head gestures, for elderly and disabled people who have restricted limb movements.
    • ABSTRACT Electric-powered wheelchairs (EPWs) have been rapidly deployed over the last 20 years. These EPWs are controlled by the user's hands and are very difficult for elderly and disabled users to operate. As cheap computers and sensors are embedded into EPWs, they become known as intelligent wheelchairs (IWs).
    • INTRODUCTION Our IW, named RoboChair, is based on a novel head gesture-based interface (HGI) that integrates the Adaboost face detection algorithm and the Camshift object tracking algorithm. Head gesture recognition is conducted by means of real-time face detection and tracking.
    • SYSTEM HARDWARE STRUCTURE: The system consists of the following parts: six ultrasonic sensors mounted at a height of 50 cm; a DSP TMS320LF2407-based controller; a Logitech 4000 Pro webcam; a local joystick controller; and an Intel Pentium-M 1.6 GHz Centrino laptop.
    • CONTROL SYSTEM OF ROBOCHAIR The control system is able to achieve both real-time signal processing and high-performance driving control thanks to the following features: excellent processing capability (30 MIPS) and compact peripheral integration. RoboChair has two control modes: intelligent control mode and manual control mode.
    • ROBOCHAIR block diagram (figure).
    • MANUAL CONTROL MODE In this mode of operation, RoboChair is controlled by the joystick, which is connected to an A/D converter of the DSP motion controller.
    • INTELLIGENT CONTROL MODE – RoboChair is controlled by the proposed head gesture interface (HGI). – A Logitech web camera is used to acquire facial images of the user. – The image data is sent to the laptop, where the head gesture analysis and decision-making stages are implemented. – Finally, the laptop sends the control decision to the DSP motion controller, which actuates two DC motors.
    • HGI (HEAD GESTURE INTERFACE) It uses two algorithms: the Adaboost face detection algorithm and the Camshift object tracking algorithm. ADABOOST FACE DETECTION ALGORITHM ADVANTAGES: it extracts Haar-like features of images that contain image frequency information; Adaboost is able to detect profile faces; it achieves high accuracy and speed in face detection. CAMSHIFT OBJECT TRACKING ALGORITHM ADVANTAGES: it is a very efficient color-tracking method based on image hue and achieves real-time performance.
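
The Adaboost side of the HGI can be pictured with OpenCV's stock Adaboost-trained Haar cascades. The sketch below is only an illustration under assumptions: the cascade files are OpenCV's bundled classifiers (not the ones trained for the paper), the webcam index is assumed to be 0, and which flipped/unflipped profile detection corresponds to the user's left or right depends on camera mirroring.

```python
# Minimal sketch: Adaboost (Haar cascade) frontal/profile face detection with OpenCV.
# The cascade files ship with OpenCV and stand in for the classifiers described
# on the slide; they are not the authors' trained models.
import cv2

frontal = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
profile = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_profileface.xml")

def detect_face(frame):
    """Return (label, bounding_box) or (None, None) for one BGR frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    faces = frontal.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
    if len(faces):
        return "frontal", tuple(faces[0])

    # The stock profile cascade detects one side only; flipping the frame checks
    # the other side. Which label means the user's left vs. right depends on
    # camera mirroring, so the names below are assumptions.
    faces = profile.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
    if len(faces):
        return "left_profile", tuple(faces[0])

    flipped = cv2.flip(gray, 1)
    faces = profile.detectMultiScale(flipped, scaleFactor=1.2, minNeighbors=5)
    if len(faces):
        x, y, w, h = faces[0]
        return "right_profile", (gray.shape[1] - x - w, y, w, h)  # map box back

    return None, None

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)          # webcam index 0 assumed
    ok, frame = cap.read()
    if ok:
        print(detect_face(frame))
    cap.release()
```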
    • INTEGRATION OF BOTH ALGORITHMS Since low-cost IWs have limited onboard computing power, the Adaboost face detection algorithm cannot achieve real-time performance on its own. On the other hand, the Camshift face tracking algorithm runs very fast, but is not robust to varying illumination conditions and noisy backgrounds. So, to obtain both speed and accuracy, it is necessary to integrate both algorithms.
    • Flowchart for the integrated algorithms (figure).
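
The detect-then-track idea behind the flowchart can be sketched as follows: run the slower Adaboost cascade only to (re)initialise a search window, then let the fast hue-based Camshift tracker follow the face frame by frame, falling back to detection when the track is lost. This is a minimal sketch under assumptions, not the authors' implementation: the re-detection criterion (a vanishing track window), the hue-mask thresholds, and the frame budget are all illustrative.

```python
# Minimal sketch of the integrated pipeline: Adaboost detection to initialise,
# Camshift hue tracking per frame, re-detection when the track is lost.
import cv2
import numpy as np

cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

def face_hue_histogram(frame, box):
    """Build a hue histogram of the detected face region for back-projection."""
    x, y, w, h = box
    hsv = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    # Ignore very dark / desaturated pixels (threshold values are assumptions).
    mask = cv2.inRange(hsv, np.array((0., 60., 32.)), np.array((180., 255., 255.)))
    hist = cv2.calcHist([hsv], [0], mask, [180], [0, 180])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    return hist

cap = cv2.VideoCapture(0)                  # webcam index 0 assumed
track_window, hist = None, None

for _ in range(300):                       # process a few hundred frames for the example
    ok, frame = cap.read()
    if not ok:
        break

    if track_window is None:               # slow path: (re)detect with the cascade
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
        if len(faces):
            track_window = tuple(int(v) for v in faces[0])
            hist = face_hue_histogram(frame, track_window)
        continue

    # Fast path: track with Camshift on the hue back-projection.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
    rot_rect, track_window = cv2.CamShift(backproj, track_window, term_crit)

    x, y, w, h = track_window
    if w == 0 or h == 0:                   # track lost: fall back to detection
        track_window = None

cap.release()
```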
    • HEAD GESTURE RECOGNITION To recognize head gestures, Adaboost frontal, left-profile and right-profile classifiers are adopted. If a profile face is detected, RoboChair turns left or right. For frontal faces, the exact head gesture is detected by calculating the precise nose position using the classical template matching method.
    • NOSE TEMPLATE MATCHING There are five frontal head gestures to be recognized, namely: 1. center frontal; 2. up frontal; 3. down frontal; 4. left frontal; and 5. right frontal.
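
One way to picture these five classes is as a rule on where the nose (located by template matching) sits inside the detected face box: its offset from the face centre is thresholded into centre/up/down/left/right. The sketch below is illustrative only; the 0.15 threshold is an assumed value, not one from the paper, and the left/right labels depend on camera mirroring.

```python
# Minimal sketch: map a nose position (from template matching) inside the
# detected face box to one of the five frontal head gestures on the slide.
# The 0.15 threshold is an illustrative assumption, not the paper's value.

def frontal_gesture(face_box, nose_xy, threshold=0.15):
    x, y, w, h = face_box                 # face bounding box (pixels)
    nx, ny = nose_xy                      # nose centre from template matching

    # Nose offset from the face centre, normalised by the face size.
    dx = (nx - (x + w / 2)) / w
    dy = (ny - (y + h / 2)) / h

    if abs(dx) < threshold and abs(dy) < threshold:
        return "center frontal"
    if abs(dy) >= abs(dx):                # vertical offset dominates
        return "up frontal" if dy < 0 else "down frontal"
    # Image-coordinate left/right; whether this is the user's left or right
    # depends on whether the camera image is mirrored (assumption).
    return "left frontal" if dx < 0 else "right frontal"


# Example: nose well above the face centre -> "up frontal".
print(frontal_gesture((100, 80, 120, 150), (160, 110)))
```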
    • ROBOCHAIR ACTIONS FOR MOTION CONTROL COMMANDS Rules followed by RoboChair: speed up (if an up frontal face is recognized); slow down until stop (if a down frontal face is recognized); turn left (if a left profile/frontal face is recognized); turn right (if a right profile/frontal face is recognized); keep speed (if a central face is recognized).
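
This rule table maps directly onto a small lookup that adjusts a speed setpoint and a turn command, which the laptop would then forward to the DSP motion controller. The speed step, limits, and command names below are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of the slide's rule table: each recognised gesture adjusts a
# speed setpoint and/or a turn command. Step sizes and limits are assumptions.

MAX_SPEED = 1.0        # normalised forward speed (assumed scale)
SPEED_STEP = 0.1       # change per recognised gesture (assumed value)

def apply_gesture(speed, gesture):
    """Return (new_speed, turn_command) for one recognised head gesture."""
    if gesture == "up frontal":                       # speed up
        return min(speed + SPEED_STEP, MAX_SPEED), "straight"
    if gesture == "down frontal":                     # slow down until stop
        return max(speed - SPEED_STEP, 0.0), "straight"
    if gesture in ("left frontal", "left profile"):   # turn left
        return speed, "left"
    if gesture in ("right frontal", "right profile"): # turn right
        return speed, "right"
    return speed, "straight"                          # central face: keep speed

speed = 0.3
for g in ["up frontal", "up frontal", "left profile", "down frontal"]:
    speed, turn = apply_gesture(speed, g)
    print(f"{g:14s} -> speed={speed:.1f}, turn={turn}")
```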
    • DEMONSTRATION FOR PROFILE FACES A sequence of images under head gesture control shows: turn right; right up; turn left; and turn left with hand-color noise.
    • CONCLUSION This paper describes the design and implementation of a novel hands-free control system for IWs. A robust HGI is designed for vision-based head gesture recognition of the RoboChair user. To avoid unnecessary movements caused by the user looking around randomly, the HGI is focused on the central position of the wheelchair.
    • REFERENCES: Bradski, G. (1998), “Real-time face and object tracking as a component of a perceptual user interface”. Ding, D. and Cooper, R.A. (1995), “Electric powered wheelchairs”, IEEE Control Systems. Galindo, C., Gonzalez, J. and Fernandez-Madrigal, J.A. (2005), “An architecture for cognitive human-robot integration: application to rehabilitation robotics”, Proceedings of the IEEE International Conference on Mechatronics.
    • THANK YOU & QUERIES