Handy Fb (Gesture recognition and Facebook manipulation project)


Used gesture recognition to perform events such as chat, write on wall, like, and poke on a Facebook look-alike page.


  1. Handy Facebook (Hand gestures to manipulate a social networking website)
  2. Team Members
       • Jaskaran Uppal (0419)
       • Sandeep Mallela (9769)
       • Darpan Dhamija (0550)
       • Rahul Perhar (4562)
  3. Project Objective
       • Identify hand gestures in front of a webcam
       • Navigate the website depending on the gestures recognized
  4. Tasks to be performed
       • Make gestures in front of the camera
       • Detect gestures at a suitable frame rate
       • Capture the gestures and store them as .jpg files
       • Train the system to recognize the gestures with a low error rate
       • Execute events on the webpage upon successful gesture recognition
       • Send a notification to the user
  5. Gesture Making
       • Uses a small set of gestures (raised fingers)
       • Each raised finger triggers a predefined navigation action on the webpage
       • The system can be programmed to accommodate other human gestures as well
       • Detection errors can be reduced through training
  6. Gesture Detection
       • Gestures are detected at a suitable frame rate
       • The camera captures the hand gesture, and the Canny edge detection algorithm is applied to store the gesture in the following format
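The project applies OpenCV's Canny detector from Java. Purely as an illustration of the gradient stage that Canny builds on, here is a minimal NumPy sketch of a Sobel edge map; full Canny additionally performs Gaussian smoothing, non-maximum suppression, and hysteresis thresholding:

```python
import numpy as np

def sobel_edges(gray, threshold=0.25):
    """Binary edge map from the Sobel gradient magnitude.

    gray: 2-D float array. This is only the gradient stage of Canny;
    the real algorithm adds smoothing, non-maximum suppression, and
    hysteresis thresholding on top.
    """
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = gray.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(3):               # correlate with the two 3x3 kernels
        for j in range(3):
            patch = gray[i:i + h - 2, j:j + w - 2]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    mag = np.hypot(gx, gy)           # gradient magnitude
    if mag.max() == 0:
        return np.zeros_like(mag, dtype=bool)
    return mag > threshold * mag.max()

# A synthetic frame with a vertical step edge at column 5:
frame = np.zeros((10, 10))
frame[:, 5:] = 1.0
edges = sobel_edges(frame)           # True only near the step
```

The threshold fraction (0.25) is an arbitrary choice for the sketch; the project's actual thresholds came from OpenCV's Canny parameters.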
  7. System Training
       • System training is done with Neuroph, an open-source image recognition tool that takes images as input and produces a neural network
       • This neural network can be trained to recognize the gestures
       • The plug-in provided with the tool generates Java classes that integrate the network into our application
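Neuroph is a Java library, so its training loop lives outside our code. Purely to illustrate what "training a network on gesture images" means, this NumPy sketch trains a single logistic unit to separate two toy "gesture" patterns; the data, sizes, and learning rate are all made up for the example:

```python
import numpy as np

# Toy stand-in for gesture training data: 4-pixel "images",
# class 0 = left-heavy pattern, class 1 = right-heavy pattern.
X = np.array([[1.0, 1.0, 0.0, 0.0],
              [1.0, 0.8, 0.1, 0.0],
              [0.0, 0.0, 1.0, 1.0],
              [0.1, 0.0, 0.9, 1.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
w = rng.normal(size=4) * 0.1         # weights of a single logistic unit
b = 0.0
lr = 0.5
for _ in range(500):                 # gradient descent on the logistic loss
    p = sigmoid(X @ w + b)
    grad = p - y                     # dLoss/dlogit for cross-entropy
    w -= lr * (X.T @ grad) / len(y)
    b -= lr * grad.mean()

pred = (sigmoid(X @ w + b) > 0.5).astype(int)
```

Neuroph's image recognition networks are multi-layer perceptrons doing the same kind of iterative weight updates, just over full image vectors rather than four pixels.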
  8. Website Navigation
       • The default page is shown to the user
       • The user makes a gesture
       • The system recognizes it
       • The website navigates
       • The Facebook profile is loaded
       • The user can then use other gestures to navigate through additional webpages
  9. Website Navigation (contd.)
       • After the initial gesture is recognized, the user is navigated to a personal profile page with additional options
       • The user can make gestures to perform any of the following actions:
       • Chat
       • Write on wall
       • Like a post
       • Poke a person
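The mapping from a recognized gesture to one of these profile-page actions can be kept in a simple dispatch table. A minimal Python sketch; the gesture labels and handler bodies are hypothetical placeholders, not the project's actual identifiers:

```python
# Placeholder handlers for the four profile-page actions from the slide.
def chat(user):
    return f"opened chat with {user}"

def write_on_wall(user):
    return f"wrote on {user}'s wall"

def like_post(user):
    return f"liked {user}'s latest post"

def poke(user):
    return f"poked {user}"

# Hypothetical label-to-action table: which finger count maps to which
# action is an assumption made for this example.
ACTIONS = {
    "one_finger": chat,
    "two_fingers": write_on_wall,
    "three_fingers": like_post,
    "four_fingers": poke,
}

def handle_gesture(label, user):
    """Dispatch a recognized gesture label; unknown labels are ignored."""
    action = ACTIONS.get(label)
    return action(user) if action else None
```

Keeping the table separate from the handlers makes it easy to rebind gestures or add new ones after retraining.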
  10. Implementation Details
       • The application is implemented using the following:
       • OpenCV libraries for the gesture recognition code
       • Java to capture the image and convert it into a BufferedImage for easy processing
       • The Neuroph tool to train the system
       • The output from Neuroph is the recognized gesture, for which we have actions defined
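The capture-and-convert step (webcam frame → BufferedImage → recognizer input) was done in Java; the equivalent preprocessing can be sketched in NumPy as a grayscale conversion plus normalization. The Rec. 601 luminance weights below are a standard choice, not necessarily what the project used:

```python
import numpy as np

# Rec. 601 luminance weights for RGB -> grayscale (an assumed choice).
LUMA = np.array([0.299, 0.587, 0.114])

def frame_to_input(rgb):
    """Turn an H x W x 3 uint8 frame into a flat feature vector in [0, 1],
    mirroring the BufferedImage preprocessing step before recognition."""
    gray = rgb.astype(float) @ LUMA   # weighted sum over the color channel
    return (gray / 255.0).ravel()

# A 2x2 all-white frame maps to a vector of ones.
white = np.full((2, 2, 3), 255, dtype=np.uint8)
vec = frame_to_input(white)
```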
  11. Results: Home Page
  12. Results: Personal Page of a user
  13. Results: Opening Chat with a user
  14. Results: Writing on the Wall of a user
  15. Results: "Like" a user's post
  16. Results: "Poke" a user
  17. Limitations
       • A persistent error rate in gesture recognition
       • It is a lo-fi prototype of what can be done on a larger scale; further improvements are possible
       • Gesture recognition depends on the available light
  18. Future Additions
       • Improve hand gesture recognition, making the system more refined and gestures more easily recognizable
       • Integrate the system into other applications, such as Google Maps, to get the address of a particular place
       • Many more gestures can be added and trained in the system
       • A real chat window can be added in the future
  19. Credits & References
       • Prof. Suya You, for all the support and knowledge of various user interface designs
       • Vijayakumar Gopalakrishnan, TA, for giving the initial idea and helping us realize the project through to completion
       • Neuroph and related documentation for gesture recognition (http://neuroph.sourceforge.net/documentation.html)
  20. Thank You