
MIE Presentation

Interactive Screen is a 2010 graduation project at Ain Shams University, and an extension of the Interactive Wall 2009 project.

Interactive Screen won first place in the MIE (Made In Egypt) competition organized by the IEEE.

For any details, please contact:



  1. Faculty of Computer and Information Science, Ain Shams University
     Section 1: Multi-Touch Interactive Screen
     "Touch Your Vision"
  2. Supervisors:
     Professor Dr. Mohammed Roushdy, Faculty of Computer and Information Sciences
     Dr. Haythem El-Messiry, Faculty of Computer and Information Sciences
     T.A. Ahmad Salah, Faculty of Computer and Information Sciences
     Sponsors:
  3. Teamwork:
  4. Agenda:
     Introduction
     Interactive Screen vs. Other Systems
     Market Research & Customer Needs
     Physical Environment
     System Framework
     System Modules
     Applications
     Limitations
     Future Work
     References
  5. Introduction:
     Problem definition: humans interact naturally with one another through motion; it can be annoying and impractical to use hardware equipment to interact with someone or something.
     Overview:
     - Projector and two-camera system.
  6. HCI (Human-Computer Interaction) system.
  7. Interaction through hand gestures (shapes).
  8. Extension of Interactive Wall '09.
     Introduction:
     Motivation:
     1. Interactive Wall 2009.
     2. Multi-touch technology.
     3. A large touch screen at an appropriate cost.
     4. Flexibility.
  9. A future without annoying input devices, and pride in being part of accomplishing such a dream.
     Develop the Interactive Screen system to handle more features and gestures and to overcome its limitations, so that users are satisfied with its usability and flexibility of use.
  10. Time Plan / Milestones:
      - Segmentation modules: May 2010
      - Multi-hand tracking: April 2010
  11. Automatic hand detection: April 2010
  12. Z-depth module: April 2010
  13. Dynamic gesture module: May 2010
      Physical Environment:
      Simple components construct a new environment for interactive screens that overcomes the limitations of other systems.
      Traditional environment: environment limitations.
      Proposed physical environment: mirror, interactive screen, projector, z-depth camera, camera.
  14. Physical Environment: alternative solutions vs. the proposed solution.
  15. Interactive Screen vs. other systems:
      - Microsoft Surface
  16. DiamondTouch.
  17. Touch screens.
  18. Cost.
  19. No need to touch the screen.
  20. Gesture recognition.
  21. Dynamic gesture recognition.
  22. Bare hands.
  23. No sensors, pure image processing.
      Market Research:
      In 1991, the first smart whiteboard appeared. Over 1.6 million smart whiteboards have been installed throughout the world. Surveys indicate that interactive whiteboards benefit student engagement, learner motivation, and knowledge retention.
  24. Framework:
      Input → Calibration → Segmentation → System Controlling → Hand Detection → Multi-Hand Tracking → Touch Detection → Gesture Recognition → Event → Interface
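The stages above form a sequential pipeline, each module consuming the previous module's output. A minimal sketch of that wiring; the `process_frame` helper and the stand-in stages are hypothetical, not the project's actual code:

```python
def process_frame(frame, stages):
    """Run one captured frame through the framework's stages in order;
    each stage receives the previous stage's output."""
    out = frame
    for stage in stages:
        out = stage(out)
    return out

# Hypothetical stand-in stages, listed in the framework's order.
pipeline = [
    lambda img: img,  # calibration: warp camera image to screen space
    lambda img: img,  # segmentation: binary foreground mask
    lambda img: img,  # hand detection, tracking, touch, gestures ...
]
```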
  25. Calibration: applies a geometric calibration using the four calibration points acquired by the configuration module.
      Experimental results.
  26. Segmentation: its main task is to generate a binary image, from the captured frame, that represents the foreground.
      Experimental results: simple background, complex background.
  27. Hand Detection: responsible for detecting the hand position automatically at any location, given a certain entry gesture (open hand).
      Experimental results.
  28. Multi-Hand Tracking: responsible for keeping track of the user's hands, knowing the actual position of each hand at all times.
      Experimental results.
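Tracking several hands at once can be done by greedily matching each existing track to its nearest new detection; a pure-Python sketch in which the data layout and the distance gate are assumptions, not the project's implementation:

```python
import math

def update_tracks(tracks, detections, max_dist=80.0):
    """Greedy nearest-neighbour association: each track claims its
    closest unmatched detection within `max_dist` pixels; leftover
    detections open new tracks. tracks: {track_id: (x, y)}."""
    next_id = max(tracks, default=-1) + 1
    unmatched = list(detections)
    for tid, pos in list(tracks.items()):
        if not unmatched:
            break
        best = min(unmatched, key=lambda d: math.dist(pos, d))
        if math.dist(pos, best) <= max_dist:
            tracks[tid] = best
            unmatched.remove(best)
    for det in unmatched:
        tracks[next_id] = det
        next_id += 1
    return tracks
```

A track that finds no detection within the gate simply keeps its last position here; dropping stale tracks is left out of the sketch.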
  29. Touch Detection: its main task is to decide whether or not the user touched the screen.
      Experimental results.
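Assuming the z-depth camera views the screen edge-on and reports the fingertip's pixel distance from the screen plane, the touch decision can be a simple threshold; adding hysteresis keeps a hovering finger from flickering between touched and released. Both thresholds below are illustrative:

```python
class TouchDetector:
    """Touch is declared when the fingertip's distance to the screen
    plane drops below `press_px`, and released only once it rises past
    `release_px`; the gap between the two is the hysteresis band."""

    def __init__(self, press_px=6.0, release_px=12.0):
        self.press_px = press_px
        self.release_px = release_px
        self.touching = False

    def update(self, depth_px):
        if self.touching:
            self.touching = depth_px < self.release_px
        else:
            self.touching = depth_px < self.press_px
        return self.touching
```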
  30. Gesture Recognition: responsible for recognizing the shape of the hand.
      - Static
  31. Dynamic
      Experimental results.
  32. Applications: smart whiteboard application
      - In classrooms
  33. In meeting rooms.
      Limitations:
      - Non-skin colors are required for the user's upper clothing.
  34. The user must wear long sleeves.
  35. The user must enter the system with a certain gesture.
      Future Work:
      - Multi-user system.
      - Body tracking.
      - Detecting any hand shape.
  36. References
      [1] Mennat-Allah Mostafa Mohammad, Nada Sherif Abd El Galeel, Rana Mohammad Ali Roshdy, Sarah Ismail Ibrahim, "Multi Touch Interactive Surface", Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt, 2009.
      [2] Kai Briechle, Uwe D. Hanebeck, "Template Matching using Fast Normalized Cross Correlation", Institute of Automatic Control Engineering, Technische Universität München, 80290 München, Germany, 2001.
      [3] Rafael C. Gonzalez, Richard E. Woods, "Digital Image Processing", third edition, Pearson, 2008.
      [4] Gary Bradski, Adrian Kaehler, "Learning OpenCV", O'Reilly Media, 2008.
      [5] Alan M. McIvor, "Background Subtraction Techniques", in Proc. of Image and Vision Computing, Auckland, New Zealand, 2000.
      [6] Francesca Gasparini, Raimondo Schettini, "Skin Segmentation using Multiple Thresholding", Milan, Italy, 2007.
  37. References (continued)
      [7] Hideki Koike, Masataka Toyoura, Kenji Oka, Yoichi Sato, "3-D Interaction with Wall-Sized Display", IEEE Computer Society, 2008.
      [8] Mahdi Mirzabaki, "A New Method for Depth Detection Using Interpolation Functions Using a Single Camera", International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. 35, part 3, pages 724-726, 2004.
      [9] Patrick Horain, Mayank Bomb, "3D Model Based Gesture Acquisition Using a Single Camera", Proceedings of the Sixth IEEE Workshop on Applications of Computer Vision, 2002.
      [10] Z. Černeková, N. Nikolaidis, I. Pitas, "Single Camera Pointing Gesture Recognition Using Spatial Features and Support Vector Machines", EUSIPCO, Poznań, 2007.
  38. Any questions?
      Thank you.