Multi-touch Interactive Screen, MIE Competition

Multi-touch Interactive Screen is a 2010 graduation project of the Computer Science department, Ain Shams University, extending the 2009 Interactive Wall project. It won first place in the Made in Egypt Competition, IEEE Egypt Section.


Slide notes:
  • Check the name of the project.
  • Separate into two slides.
  • According to many surveys, people are impressed by HCI technologies; these findings made us enthusiastic to take part in developing this area of technology at a lower cost.
  • Light is off; shadow.
  • Cost: $1,000. If they do not see the environment, we must add a picture of the real physical setup.
  • Smart whiteboard, smart wall, touch screen: cost, gesture recognition, no need to touch the screen, bare hands, dynamic gesture recognition.
  • The Interactive Screen could be used in meeting rooms and classrooms for presentations.
  • All the modules: overview and sample output here, then the demos. Final scene: the framework is on the right-hand side with the current module highlighted; the overview and demo are on the left-hand side of the slide.
  • Change it in the whole presentation.

    1. 1. Faculty of Computer and Information Sciences, Ain Shams University. Section 1
    2. 2. Supervisors: Prof. Dr. Mohammed Roushdy, Faculty of Computer and Information Sciences; Dr. Haythem El-Messiry, Faculty of Computer and Information Sciences; T.A. Ahmad Salah, Faculty of Computer and Information Sciences. Sponsors:
    3. 3. Teamwork: Abu-Bakr Taha Abdel Khalek Hadeel Mahmoud Mohammed Hager Abdel Motaal Mohammed Mahmoud Fayez El-Khateeb Yasmeen Abdel Naby Aly
    4. 4. Agenda: <ul><li>Introduction. </li></ul><ul><li>Interactive Screen vs. Other Systems. </li></ul><ul><li>Market Research & Customer Needs. </li></ul><ul><li>Physical Environment. </li></ul><ul><li>System Framework. </li></ul><ul><li>System Modules. </li></ul><ul><li>Applications. </li></ul><ul><li>Limitations. </li></ul><ul><li>Future Work. </li></ul><ul><li>Final Demo. </li></ul><ul><li>References. </li></ul>
    5. 5. <ul><li>Overview: </li></ul><ul><li>Projector and two-camera system. </li></ul><ul><li>HCI (Human-Computer Interaction) system. </li></ul><ul><li>Interacts with hand gestures (shapes). </li></ul><ul><li>Extension of Interactive Wall '09. </li></ul>Introduction: Problem Definition: Humans naturally interact with one another using motions; it can be annoying and impractical to use hardware equipment to interact with someone or something.
    6. 6. Introduction: Motivation: 1. Interactive Wall 2009. 2. Multi-touch technology. 3. Large touch screens at an appropriate cost. 4. Flexibility.
    7. 7. A future without annoying input devices, and the pride of being part of accomplishing such a dream: develop the Interactive Screen system to handle more features and gestures and to overcome its limitations, so that users are satisfied with its usability and flexibility of use.
    8. 8. <ul><li>Time Plan: </li></ul><ul><li>Milestones: </li></ul><ul><li>. Segmentation modules. May-2010 </li></ul><ul><ul><li>Multi-hand tracking. April-2010 </li></ul></ul><ul><ul><li>Automatic hand detection. April-2010 </li></ul></ul><ul><ul><li>Z-Depth Module. April-2010 </li></ul></ul><ul><ul><li>Dynamic gesture Module. May-2010 </li></ul></ul>
    9. 9. <ul><li>Physical Environment: </li></ul><ul><li>Simple components construct a new interactive-screen environment that overcomes the limitations of other systems. </li></ul><ul><li>Environment limitations </li></ul>Traditional Environment
    10. 10. Proposed Physical Environment:
    11. 11. Physical Environment: alternative solutions vs. the proposed solution
    12. 12. Interactive Screen vs. Other Systems <ul><li>Microsoft Surface. </li></ul><ul><li>DiamondTouch. </li></ul><ul><li>Touch screens. </li></ul><ul><li>Cost. </li></ul><ul><li>No need to touch the screen. </li></ul><ul><li>Gesture recognition. </li></ul><ul><li>Dynamic gesture recognition. </li></ul><ul><li>Bare hands. </li></ul><ul><li>No sensors, pure image processing. </li></ul>
    13. 13. <ul><li>In 1991, the first smart whiteboard appeared. </li></ul><ul><li>Over 1.6 million smart whiteboards have been installed throughout the world. </li></ul><ul><li>Surveys indicate that interactive whiteboards benefit student engagement, learner motivation and knowledge retention. </li></ul>Market Research
    14. 14. Framework Calibration Segmentation Hand Detection Multi hand Tracking Touch Detection Gesture Recognition Event Interface
    15. 15. Applies a geometric calibration using the four calibration points acquired by the configuration module. Calibration Segmentation Hand Detection Multi hand Tracking Touch Detection Gesture Recognition Event Interface
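A minimal sketch of such a calibration step, assuming a planar homography is fitted to the four calibration points with the direct linear transform (the function names and numpy-only approach are illustrative, not the project's actual code):

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 homography mapping the four calibration
    points seen by the camera (src) onto screen coordinates (dst),
    via the direct linear transform (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # the homography is the null-space vector of A
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def warp_point(H, x, y):
    """Map a camera pixel into calibrated screen coordinates."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]
```

In practice a library routine such as OpenCV's findHomography would do this fit; the sketch only shows the underlying linear algebra.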
    16. 16. Experimental Results Simple background Complex background Its main task is to generate a binary image, from the captured image, that represents the foreground Calibration Segmentation Hand Detection Multi hand Tracking Touch Detection Gesture Recognition Event Interface
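One common way to produce such a binary foreground image is background subtraction against a stored reference frame (cf. reference [5]); this sketch assumes a fixed per-pixel difference threshold, and the value 30 is an arbitrary illustration:

```python
import numpy as np

def segment_foreground(frame, background, thresh=30):
    """Background subtraction: pixels that differ from the stored
    background frame by more than `thresh` grey levels become
    foreground (1); everything else is background (0)."""
    diff = np.abs(frame.astype(int) - background.astype(int))
    return (diff > thresh).astype(np.uint8)
```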
    17. 17. Experimental Results Responsible for detecting the hand position automatically at any location, given a certain gesture (open hand) Calibration Segmentation Hand Detection Multi hand Tracking Touch Detection Gesture Recognition Event Interface
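Detecting the open-hand pose can be sketched with normalized cross-correlation template matching, in the spirit of reference [2]; the exhaustive pixel search and the 0.8 acceptance score below are illustrative assumptions:

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation of one image patch with the template."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return (p * t).sum() / denom if denom > 0 else 0.0

def detect_hand(image, template, score_min=0.8):
    """Slide the open-hand template over the image and return the
    best-matching (x, y) position, or None if nothing scores above
    `score_min`."""
    th, tw = template.shape
    best, best_pos = score_min, None
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            s = ncc(image[y:y + th, x:x + tw], template)
            if s > best:
                best, best_pos = s, (x, y)
    return best_pos
```

Reference [2] speeds this search up with sum tables; the naive double loop here is only for clarity.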
    18. 18. Experimental Results Responsible for keeping track of the user's hand, knowing its actual position at all times Calibration Segmentation Hand Detection Multi hand Tracking Touch Detection Gesture Recognition Event Interface
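Tracking several hands between frames can be sketched as greedy nearest-neighbour association of new detections to existing tracks; the track-id bookkeeping and the 50-pixel gating distance are assumptions for illustration:

```python
import math

def track_hands(tracks, detections, max_dist=50.0):
    """Greedy nearest-neighbour data association: match each known
    track (id -> last position) to the closest unclaimed detection,
    then open new tracks for the detections left over."""
    free = list(detections)
    updated = {}
    for tid, (px, py) in tracks.items():
        if not free:
            break
        best = min(free, key=lambda d: math.hypot(d[0] - px, d[1] - py))
        if math.hypot(best[0] - px, best[1] - py) <= max_dist:
            updated[tid] = best
            free.remove(best)
    # unmatched detections start new tracks
    next_id = max(tracks, default=-1) + 1
    for d in free:
        updated[next_id] = d
        next_id += 1
    return updated
```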
    19. 19. Experimental Results The main task is to decide whether the user touched the screen or not Calibration Segmentation Hand Detection Multi hand Tracking Touch Detection Gesture Recognition Event Interface
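With two cameras both calibrated to the screen plane, one plausible touch test (an assumption for illustration, not necessarily the project's method) is that the fingertip's two projections coincide only when the finger actually rests on the surface:

```python
def is_touching(p_cam1, p_cam2, disparity_max=5.0):
    """Decide touch from the disparity between the fingertip position
    reported by the two cameras, both warped into screen coordinates.
    On the screen plane the two projections agree; a hovering finger
    produces a large disparity."""
    dx = p_cam1[0] - p_cam2[0]
    dy = p_cam1[1] - p_cam2[1]
    return (dx * dx + dy * dy) ** 0.5 <= disparity_max
```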
    20. 20. Experimental Results <ul><li>Responsible for recognizing the shape of the hand. </li></ul><ul><li>Static. </li></ul><ul><li>Dynamic. </li></ul>Calibration Segmentation Hand Detection Multi hand Tracking Touch Detection Gesture Recognition Event Interface
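Static gesture recognition can be sketched as nearest-template matching over the segmented binary hand mask; the pixel-difference score and the template dictionary are illustrative assumptions:

```python
import numpy as np

def classify_gesture(mask, templates):
    """Compare the segmented hand mask against stored gesture masks
    and return the label whose template differs in the fewest pixels."""
    return min(templates,
               key=lambda name: np.count_nonzero(mask != templates[name]))
```

Dynamic gestures would additionally compare the sequence of classified shapes (or hand trajectory) over time; that part is omitted here.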
    21. 21. <ul><li>Smart whiteboard application </li></ul><ul><li>In classrooms. </li></ul><ul><li>In meeting rooms. </li></ul>Application
    22. 22. Limitations <ul><li>The user must enter the system with a certain gesture. </li></ul><ul><li>The user's upper clothing must not be skin-coloured. </li></ul><ul><li>The user must wear long sleeves. </li></ul>
    23. 23. <ul><ul><ul><li>Multi-user system. </li></ul></ul></ul><ul><ul><ul><li>Body Tracking. </li></ul></ul></ul><ul><ul><ul><li>Detecting any shape of hand. </li></ul></ul></ul>Future work
    24. 24. Demo
    25. 25. [1] Mennat-Allah Mostafa Mohammad, Nada Sherif Abd El Galeel, Rana Mohammad Ali Roshdy, Sarah Ismail Ibrahim, Multi Touch Interactive Surface, Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt, 2009. [2] Kai Briechle, Uwe D. Hanebeck, Template Matching using Fast Normalized Cross Correlation, Institute of Automatic Control Engineering, Technische Universität München, 80290 München, Germany, 2001. [3] Rafael C. Gonzalez, Richard E. Woods, Digital Image Processing, third edition, Pearson, 2008. [4] Gary Bradski, Adrian Kaehler, Learning OpenCV, O'Reilly Media, 2008. [5] Alan M. McIvor, Background Subtraction Techniques, in Proc. of Image and Vision Computing, Auckland, New Zealand, 2000. [6] Francesca Gasparini, Raimondo Schettini, Skin Segmentation Using Multiple Thresholding, Milano, Italy, 2007. References
    26. 26. [7] Hideki Koike, Masataka Toyoura, Kenji Oka, Yoichi Sato, 3-D Interaction with Wall-Sized Display, IEEE Computer Society, 2008. [8] Mahdi Mirzabaki, A New Method for Depth Detection Using Interpolation Functions Using a Single Camera, International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, 2004, vol. 35, part 3, pages 724–726. [9] Patrick Horain, Mayank Bomb, 3D Model Based Gesture Acquisition Using a Single Camera, Proceedings of the Sixth IEEE Workshop on Applications of Computer Vision, 2002. [10] Z. Černeková, N. Nikolaidis, I. Pitas, Single Camera Pointing Gesture Recognition Using Spatial Features and Support Vector Machines, EUSIPCO, Poznań, 2007. References
