Interactive Screen is a 2010 graduation project at Ain Shams University; it is an extension of the Interactive Wall project (2009).
Interactive Screen won first place in the MIE (Made In Egypt) competition organized by the IEEE.
For any details, please contact:
Hajer.Mohammed@gmail.com
hadeel.m.yusef@gmail.com
yasmeen.abdel.naby@gmail.com
algorithmist.abubakr@gmail.com
Interactive Screen System for Multi-Touch Gesture Recognition
1. Multi-Touch Interactive Screen — Touch Your Vision. Faculty of Computer and Information Sciences, Ain Shams University.
2. Supervisors: Prof. Dr. Mohammed Roushdy, Faculty of Computer and Information Sciences; Dr. Haythem El-Messiry, Faculty of Computer and Information Sciences; T.A. Ahmad Salah, Faculty of Computer and Information Sciences. Sponsors:
4. Agenda: Introduction. Interactive Screen vs. Other Systems. Market Research & Customer Needs. Physical Environment. System Framework. System Modules. Applications. Limitations. Future Work. References.
9. Our vision is a future without cumbersome input devices, and we are proud to be part of accomplishing such a dream. Our objective is to develop the Interactive Screen system to handle more features and gestures and to overcome its current limitations, so that users are satisfied with its usability and flexibility.
24. Framework modules: Input, Calibration, Segmentation, System Controlling, Hand Detection, Multi-Hand Tracking, Touch Detection, Gesture Recognition, Event Interface.
25. Calibration: applies a geometric calibration using the four calibration points acquired by the configuration module. Experimental results are shown.
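A geometric calibration from four points is commonly implemented as a planar homography mapping camera coordinates to screen coordinates. The sketch below is an illustrative reconstruction, not the project's actual code; the function names and the point values in the usage note are hypothetical.

```python
import numpy as np

def homography_from_points(src, dst):
    """Solve the 8-DOF projective transform mapping 4 src points to 4 dst points."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two rows of the linear system A h = b.
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.extend([u, v])
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, x, y):
    """Map a camera-image point into screen coordinates."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]
```

In use, the four calibration points would be the corners of the projected screen as seen by the camera, mapped to the corners of the display resolution (e.g. 1024x768).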
26. Segmentation: its main task is to generate a binary image from the captured frame that represents the foreground. Experimental results are shown for both simple and complex backgrounds.
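One standard way to produce such a binary foreground image is background subtraction with a threshold (reference [5] surveys the technique). This is a minimal sketch under that assumption, not the deck's actual segmentation code:

```python
import numpy as np

def foreground_mask(frame, background, threshold=30):
    """Binary image: 255 where the frame differs from the stored background."""
    # Signed arithmetic avoids uint8 wraparound in the difference.
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return (diff > threshold).astype(np.uint8) * 255
```

Complex backgrounds typically need a more robust model (e.g. a running average or per-pixel statistics) than a single stored frame.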
27. Hand Detection: responsible for automatically detecting the hand at any position, given a certain gesture (open hand). Experimental results are shown.
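Detecting a fixed gesture such as an open hand is often done by template matching with normalized cross-correlation, which reference [2] accelerates; the slow reference version below illustrates the idea. This is a sketch, not the project's implementation, and the function names are hypothetical.

```python
import numpy as np

def ncc(image, template):
    """Zero-mean normalized cross-correlation score map (naive version)."""
    th, tw = template.shape
    t = template - template.mean()
    tn = np.sqrt((t * t).sum())
    out = np.zeros((image.shape[0] - th + 1, image.shape[1] - tw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            w = image[i:i + th, j:j + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz * wz).sum()) * tn
            out[i, j] = (wz * t).sum() / denom if denom > 0 else 0.0
    return out

def detect_hand(image, template):
    """Top-left corner of the best match for the open-hand template."""
    scores = ncc(image, template)
    return np.unravel_index(np.argmax(scores), scores.shape)
```

The fast variant in [2] precomputes running sums so the per-window mean and norm cost O(1) instead of O(template size).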
28. Multi-Hand Tracking: responsible for keeping track of the user's hands, knowing the actual position of each hand at all times. Experimental results are shown.
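A simple way to maintain hand identity across frames is greedy nearest-neighbour assignment of each frame's detections to existing tracks. The sketch below assumes that approach; it is illustrative only, and the data layout is an assumption.

```python
def update_tracks(tracks, detections, max_dist=50.0):
    """Assign detections to the nearest existing track within max_dist pixels.

    tracks: dict track_id -> (x, y) from the previous frame.
    detections: list of (x, y) for the current frame.
    Unmatched detections start new tracks (a new hand entering the scene).
    """
    used = set()
    new_tracks = {}
    for tid, (tx, ty) in tracks.items():
        best, best_d = None, max_dist
        for i, (dx, dy) in enumerate(detections):
            if i in used:
                continue
            d = ((tx - dx) ** 2 + (ty - dy) ** 2) ** 0.5
            if d < best_d:
                best, best_d = i, d
        if best is not None:
            used.add(best)
            new_tracks[tid] = detections[best]
    next_id = max(tracks, default=-1) + 1
    for i, det in enumerate(detections):
        if i not in used:
            new_tracks[next_id] = det
            next_id += 1
    return new_tracks
```

Tracks that find no detection simply disappear here; a production tracker would usually keep them alive for a few frames to ride out occlusions.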
29. Touch Detection: its main task is to decide whether the user touched the screen or not. Experimental results are shown.
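The deck does not spell out the touch test, but a common single-camera heuristic (and the deck lists shadows among its limitations, suggesting shadow analysis) is that a fingertip and its cast shadow converge as the finger approaches the surface. The sketch below assumes that heuristic; names and the threshold are hypothetical.

```python
def is_touch(finger_tip, shadow_tip, merge_threshold=5.0):
    """Declare a touch when the fingertip and its shadow converge.

    A hand hovering above the surface casts a shadow separated from it;
    the gap shrinks toward zero pixels at the moment of contact.
    """
    dx = finger_tip[0] - shadow_tip[0]
    dy = finger_tip[1] - shadow_tip[1]
    return (dx * dx + dy * dy) ** 0.5 <= merge_threshold
```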
36. References
[1] Mennat-Allah Mostafa Mohammad, Nada Sherif Abd El Galeel, Rana Mohammad Ali Roshdy, Sarah Ismail Ibrahim, Multi-Touch Interactive Surface, Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt, 2009.
[2] Kai Briechle, Uwe D. Hanebeck, Template Matching Using Fast Normalized Cross Correlation, Institute of Automatic Control Engineering, Technische Universität München, 80290 München, Germany, 2001.
[3] Rafael C. Gonzalez, Richard E. Woods, Digital Image Processing, Third Edition, Pearson, 2008.
[4] Gary Bradski, Adrian Kaehler, Learning OpenCV, O'Reilly Media, 2008.
[5] Alan M. McIvor, Background Subtraction Techniques, in Proc. of Image and Vision Computing, Auckland, New Zealand, 2000.
[6] Francesca Gasparini, Raimondo Schettini, Skin Segmentation Using Multiple Thresholding, Milano, Italy, 2007.
37. References (cont.)
[7] Hideki Koike, Masataka Toyoura, Kenji Oka, Yoichi Sato, 3-D Interaction with Wall-Sized Display, IEEE Computer Society, 2008.
[8] Mahdi Mirzabaki, A New Method for Depth Detection Using Interpolation Functions Using a Single Camera, International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, 2004, Vol. 35, Part 3, pp. 724-726.
[9] Patrick Horain, Mayank Bomb, 3D Model Based Gesture Acquisition Using a Single Camera, Proceedings of the Sixth IEEE Workshop on Applications of Computer Vision, 2002.
[10] Z. Černeková, N. Nikolaidis, I. Pitas, Single Camera Pointing Gesture Recognition Using Spatial Features and Support Vector Machines, EUSIPCO, Poznań, 2007.
According to many surveys, people are impressed by HCI technologies; these findings made us very enthusiastic to participate in developing this area of technology at low cost.
Limitations: the system fails when the light is off; shadows interfere with detection.
Cost: about $1000. (Speaker note: if the audience cannot see the environment, a picture of the real physical environment must be added.)
Compared against smart whiteboards, smart walls, and touch screens on cost and gesture recognition, the Interactive Screen's advantages are: no need to touch the screen, bare-hand interaction, and dynamic gesture recognition.
We found that the Interactive Screen could be used in meeting rooms and classrooms for presentations.
All the modules: overview, sample output, and demos. Final scene: the framework is shown on the right-hand side of the slide with the current module highlighted, and the overview and demo are on the left-hand side.