Interactive Wall (Multi Touch Interactive Surface)
A graduation project at the Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt.

Interactive Wall allows users to interact with the computer using hand gestures. The application uses an optical camera to detect and track the hands with image processing techniques. The desktop is projected on a wall using a projector, giving the user the freedom of interacting with the computer directly, without any physical input device.

  • Our goal is the development of a more natural interface: a camera-projector system using hand gesture analysis. A development framework gives any application built on it the ability to let the user interact with it using hand gestures. A stream of video is captured using a low-cost webcam, then processed in the framework to extract the hand position and recognize the gesture; finally, the application handles the fired events.
  • 1. Initialization (k=0). We search for the object in the whole image, since its position is not known in advance; a large initial error tolerance can be assumed. 2. Prediction (k>0). Using the Kalman filter we predict the relative position of the object; the predicted position is used as the search center for finding the object. 3. Correction (k>0). We locate the object (in the neighborhood of the point predicted in the previous stage) and use its real position (the measurement) to correct the state with the Kalman filter. Steps 2 and 3 repeat while object tracking runs. The Kalman filter equations fall into two groups: time update equations, which project the current state and error covariance estimates forward in time to obtain the a priori estimates for the next time step, and measurement update equations, which provide the feedback, incorporating a new measurement into the a priori estimate to obtain an improved a posteriori estimate.
  • Transcript

    • 1. Multi-touch Interactive Surface
      Supervisors:
      Professor Dr. Mohammed Roushdy
      Dr. Haythem El-Messiry
      T.A. Ahmad Salah
      1
    • 2. Team Members
      2
    • 3. Agenda
      Introduction
      Physical Environment and Framework
      Project Modules and Applications
      Challenges
      Conclusion and Future work
      Tools and References
      3
    • 4. Motivation
      • A more natural and direct way of Human Computer Interaction (HCI).
      • 5. Current Multi-touch devices are:
      Expensive
      Heavy
      Fragile
      Consume space
      4
    • 6. Problem Definition
      • It would be more comfortable, effective, and user friendly if the user could interact directly with the display device without any hardware equipment, using only hand gestures.
      5
      • Our goal is to deliver an interactive surface characterized by low cost, efficiency and ease of use in real life applications.
    • Overview
      • Optical camera-projector system
      • 7. Generic Framework for Human Computer Interaction (HCI) using hand gestures.
      6
    • 8. Physical Environment
      Physical environment consists of:
      A projector.
      A webcam placed over the projector’s lens capturing the projected surface.
      7
    • 9. 1.25 m
      0.8 m
      8
    • 10. Physical Environment
      Surface
      2.15 m
      Camera
      Projector
      9
    • 11. Framework
      10
      Controller
      Configuration Module
      Input Module
      Hand Tracking
      Hand Segmentation
      Hand gesture Recognition
      Interface
    • 12. Controller Module
      11
    • 13. Detect Corners
      Controller Module
      Color Mapping
      Search for hand in entry point
      Segmentation
      Construct the search window
      Track the hand
      Fire Event
      Gesture Recognition
      12
    • 14. Configuration Module
      13
    • 15. Colors Mapping
      • Maps the colors between the desktop and the captured image colors.
      • 16. A set of colors is projected and captured for the color calibration process.
      Desktop Colors
      Projected Colors
      14
    • 17. Corner Detection
      • The four corners of the image are automatically detected using the FAST corner detection algorithm.
      15
    • 18. Input Module
      16
    • 19.
      • Calibrate captured image according to the four calibration points.
      Geometric Calibration
      17
      Captured Image
      Calibrated image
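Geometric calibration of the kind described above can be sketched as a projective (homography) warp fitted to the four detected corner points. This is a minimal sketch, assuming a direct linear solve of the eight homography unknowns; the corner coordinates below are hypothetical, and the deck does not specify how the warp is actually computed.

```python
import numpy as np

def homography_from_corners(src, dst):
    """Solve the 3x3 projective transform mapping four source points
    to four destination points (direct linear solve, 8 unknowns)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, x, y):
    """Apply the homography to one pixel coordinate."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Four detected corners of the projected desktop in the captured frame
# (hypothetical values), mapped onto an upright 640x480 desktop rectangle.
corners = [(102, 80), (530, 95), (560, 400), (90, 420)]
desktop = [(0, 0), (640, 0), (640, 480), (0, 480)]
H = homography_from_corners(corners, desktop)
```

In a full pipeline the same transform would be applied to every pixel (or, equivalently, the inverse warp sampled per destination pixel) to produce the calibrated image.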
    • 20. Hand Tracking Module
      18
    • 21. Kalman Filter
      • The Kalman filter algorithm is essentially a set of recursive equations that implement a predictor-corrector estimator.
      • 22. Steps:
      Initialization
      Prediction
      Correction
      19
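The three steps above can be sketched as a constant-velocity predict/correct loop for the hand centroid. The noise covariances, the initial error tolerance, and the measurement track below are illustrative assumptions, not values from the project.

```python
import numpy as np

# Constant-velocity Kalman filter over the state (x, y, vx, vy).
F = np.array([[1, 0, 1, 0],   # state transition, dt = 1 frame
              [0, 1, 0, 1],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], float)
Hm = np.array([[1, 0, 0, 0],  # we only measure the hand position
               [0, 1, 0, 0]], float)
Q = np.eye(4) * 1e-2          # process noise (assumed)
R = np.eye(2) * 1.0           # measurement noise (assumed)

x = np.zeros(4)               # step 1: initialization
P = np.eye(4) * 100.0         # large initial error tolerance

def step(z):
    """One predict/correct cycle given a measured hand position z."""
    global x, P
    x_pred = F @ x                         # step 2: prediction (time update)
    P_pred = F @ P @ F.T + Q
    S = Hm @ P_pred @ Hm.T + R
    K = P_pred @ Hm.T @ np.linalg.inv(S)   # Kalman gain
    x = x_pred + K @ (z - Hm @ x_pred)     # step 3: correction (measurement update)
    P = (np.eye(4) - K @ Hm) @ P_pred
    return x[:2]                           # filtered position = next search center

# Feed a short straight-line hand track (illustrative measurements).
for z in [(10, 10), (12, 11), (14, 12), (16, 13)]:
    est = step(np.array(z, float))
```

The returned estimate is what the controller would use as the center of the next search window.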
    • 23. Hand Segmentation Module
      20
    • 24. Skin Color Detection
      • The hand is affected by the projector’s light, which generates varying texture patterns on the hand’s surface and rules out any skin detection algorithm.
      21
      Captured image
      Skin detection applied
    • 25. Subtraction using Color Calibration
      • Subtract the captured image from the desktop image.
      • 26. Convert the colors of the desktop image to those of the captured image.
      22
    • 27. Color Calibration
      • Get the colors’ training set: each desktop color and its corresponding projected color.
      • Divide each pair of images into regions (3x3).
      • 28. Calculate the transformation matrix A for each region.
      • 29. b = A * x, where b is the calibrated color, A is the transformation matrix, and x is the desktop color.
      23
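The per-region fit of the transformation matrix A can be sketched as a least-squares solve over the color training pairs. The training colors and the simulated projector response below are hypothetical; the deck only specifies that b = A * x is computed per region of a 3x3 grid.

```python
import numpy as np

# Training pairs for one region: desktop colors and their captured
# counterparts. The captured side is simulated here by a dimming/tinting
# matrix M_true (an assumption standing in for real projector behavior).
desktop_rgb = np.array([[255, 0, 0], [0, 255, 0], [0, 0, 255],
                        [255, 255, 255], [0, 0, 0], [128, 128, 128]], float)
M_true = np.array([[0.80, 0.05, 0.00],
                   [0.02, 0.70, 0.03],
                   [0.00, 0.04, 0.75]])
captured_rgb = desktop_rgb @ M_true.T

# Least-squares fit of A for this region: minimize ||desktop @ A.T - captured||.
A, *_ = np.linalg.lstsq(desktop_rgb, captured_rgb, rcond=None)
A = A.T

def calibrate(x):
    """b = A * x: predict the captured color of a desktop color."""
    return A @ x
```

One such matrix per region lets the desktop image be converted into "expected captured" colors before subtraction.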
    • 30. Segmentation Results
      Largest Blob
      Extraction
      Desktop
      Captured
      Segmented
      Global
      thresholding
      24
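The subtraction, global thresholding, and largest-blob extraction steps above can be sketched as follows. The tiny 8x8 "images" and the threshold value are illustrative assumptions, not project data.

```python
from collections import deque

# Toy grey-level frames: a uniform desktop and a captured frame with a
# 2x2 bright region (the "hand") plus a single noise pixel.
desktop  = [[50] * 8 for _ in range(8)]
captured = [[50] * 8 for _ in range(8)]
for r, c in [(1, 1), (1, 2), (2, 1), (2, 2),   # large blob (hand)
             (6, 6)]:                          # small noise blob
    captured[r][c] = 200

THRESH = 60  # global threshold on the difference image (assumed value)
mask = [[1 if abs(captured[r][c] - desktop[r][c]) > THRESH else 0
         for c in range(8)] for r in range(8)]

def largest_blob(mask):
    """4-connected component labelling; return the biggest blob's pixels."""
    seen, best = set(), []
    for r in range(len(mask)):
        for c in range(len(mask[0])):
            if mask[r][c] and (r, c) not in seen:
                blob, q = [], deque([(r, c)])
                seen.add((r, c))
                while q:
                    y, x = q.popleft()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < len(mask) and 0 <= nx < len(mask[0]) \
                           and mask[ny][nx] and (ny, nx) not in seen:
                            seen.add((ny, nx)); q.append((ny, nx))
                if len(blob) > len(best):
                    best = blob
    return best

hand = largest_blob(mask)
```

Keeping only the largest blob discards isolated threshold noise, leaving the hand region for the later morphological steps.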
    • 31. Blob Analysis
      • A heuristic method to extract the hand from the arm is applied using morphological operations.
      • 32. A bounding box (60 * 60) is constructed around the largest blob.
      Original
      Closed
      Original - Closed
    • 33. Hand Gesture Recognition Module
      26
    • 34. Hand Gesture Recognition
      27
      Contour
      Tracing
      Contour
      Re-sampling
      EFD
      DB
      EFD
      Gesture
      Type
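The contour re-sampling stage in the pipeline above can be sketched as uniform arc-length re-sampling, so every traced contour reaches the EFD step with a fixed number of points. The sample count used below is an assumption; the deck does not specify it.

```python
import numpy as np

def resample_contour(points, n=64):
    """Re-sample a closed contour to n points spaced evenly by arc length."""
    pts = np.asarray(points, float)
    closed = np.vstack([pts, pts[:1]])            # close the contour
    seg = np.linalg.norm(np.diff(closed, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])   # cumulative arc length
    t = np.linspace(0.0, s[-1], n, endpoint=False)
    x = np.interp(t, s, closed[:, 0])
    y = np.interp(t, s, closed[:, 1])
    return np.stack([x, y], axis=1)

square = [(0, 0), (4, 0), (4, 4), (0, 4)]         # toy traced contour
pts = resample_contour(square, n=16)
```

A fixed point count makes descriptors from different gestures directly comparable against the database entries.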
    • 35. Elliptical Fourier Descriptors
      • Elliptical Fourier descriptors are a parametric representation of closed contours based on harmonically related ellipses.
      28
      • Any closed contour can be constructed from an infinite set of Elliptical Fourier descriptors.
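The descriptors themselves can be computed with the Kuhl–Giardina formulation, the standard way to obtain elliptic Fourier coefficients from a closed contour. This is a sketch; the project's exact implementation is not shown in the deck.

```python
import numpy as np

def elliptic_fourier_descriptors(contour, order=4):
    """Kuhl-Giardina elliptic Fourier coefficients of a closed contour.
    Returns an (order, 4) array with rows (a_n, b_n, c_n, d_n)."""
    pts = np.asarray(contour, float)
    d = np.diff(np.vstack([pts, pts[:1]]), axis=0)  # edges of the closed polygon
    dt = np.linalg.norm(d, axis=1)                  # edge lengths
    t = np.concatenate([[0.0], np.cumsum(dt)])      # cumulative arc length
    T = t[-1]                                       # total perimeter
    coeffs = np.zeros((order, 4))
    for n in range(1, order + 1):
        w = 2.0 * n * np.pi
        phi1, phi2 = w * t[:-1] / T, w * t[1:] / T
        k = T / (2.0 * n ** 2 * np.pi ** 2)
        coeffs[n - 1] = [
            k * np.sum(d[:, 0] / dt * (np.cos(phi2) - np.cos(phi1))),  # a_n
            k * np.sum(d[:, 0] / dt * (np.sin(phi2) - np.sin(phi1))),  # b_n
            k * np.sum(d[:, 1] / dt * (np.cos(phi2) - np.cos(phi1))),  # c_n
            k * np.sum(d[:, 1] / dt * (np.sin(phi2) - np.sin(phi1))),  # d_n
        ]
    return coeffs

# Sanity check: the first harmonic of a circle of radius 5 is (almost)
# the circle itself: a1 ~ 5, d1 ~ 5, b1 ~ 0, c1 ~ 0.
theta = np.linspace(0.0, 2.0 * np.pi, 100, endpoint=False)
circle = np.stack([5.0 * np.cos(theta), 5.0 * np.sin(theta)], axis=1)
efd = elliptic_fourier_descriptors(circle, order=2)
```

Truncating at a small order gives a compact, rotation- and scale-normalizable feature vector to match against the gesture database.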
    • Training
      Training Set :
      60 training images from each gesture
      29
    • 36. Testing Results
      30
    • 37. Interface Module
      31
    • 38. Interface Module
      • A gesture event is fired whenever the framework recognizes a gesture; the event contains the position and the gesture type.
      • 39. The interface handles the raised events.
      • 40. The user can map the gestures to events.
      32
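The event flow described above can be sketched as a small handler registry: the framework fires a gesture event carrying position and type, and the application maps gesture types to its own actions. All class, method, and gesture names here are illustrative, not the project's actual API.

```python
# Minimal sketch of the interface module's event model (assumed names).
class GestureEvent:
    def __init__(self, gesture, x, y):
        self.gesture, self.x, self.y = gesture, x, y

class Interface:
    def __init__(self):
        self._handlers = {}

    def map_gesture(self, gesture, handler):
        """Let the user bind a gesture type to an application action."""
        self._handlers[gesture] = handler

    def fire(self, event):
        """Called by the framework when a gesture is recognized."""
        handler = self._handlers.get(event.gesture)
        if handler:
            handler(event)

log = []
ui = Interface()
ui.map_gesture("pinch", lambda e: log.append(("zoom", e.x, e.y)))
ui.fire(GestureEvent("pinch", 120, 80))   # framework side
```

The registry is what makes the mapping user-configurable: rebinding a gesture is a single `map_gesture` call, with no change to the framework.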
    • 41. Main Gestures
      33
    • 42. Applications
      Puzzle game
      34
      Painter
      Image Viewer
    • 43. Demo
      35
    • 44. Challenges
      • Controller Module:
      Multi hands tracking
      • Gesture Recognition Module:
      Similar gestures
      • Segmentation Module
      Dark and complex backgrounds
      Arm Extraction
      36
    • 45. Conclusion
      • Human Computer Interaction is still an open field.
      • 46. Image processing can be very powerful if used in the appropriate environment.
      37
    • 47. Future Work
      • Using the depth (Z) axis besides the X and Y axes for determining the hand position.
      • 48. Multi hands’ and multi users’ interaction.
      • 49. Interactive Wall can be used with a surface other than a projected wall; for example, a large screen.
      • 50. Recognize dynamic gestures.
      38
    • 51. Tools
      Software
      • Microsoft Visual Studio 2008
      • 52. MatLab
      • 53. OpenCV
      Hardware
      • Optical Camera
      • 54. Projector
      39
    • 55. References
      • Edward Rosten, Reid Porter, and Tom Drummond, Faster and better: a machine learning approach to corner detection, Los Alamos National Laboratory, Los Alamos, New Mexico, USA, 87544, Cambridge University, Cambridge University Engineering Department, Trumpington Street, Cambridge, UK, CB2 1PZ, October 14, 2008.
       
      • Yongwon Jeong and Richard J. Radke, Reslicing axially-sampled 3D shapes using elliptic Fourier descriptors, Department of Electrical, Computer, and Systems Engineering, Rensselaer Polytechnic Institute, USA, 2007.
       
      • Louis Patrick Nicoli, Automatic Target Recognition of Synthetic Aperture Radar Images using Elliptical Fourier Descriptors, Florida Institute of Technology, Melbourne, Florida, August 2007.
       
      • G. Amayeh, G. Bebis, A. Erol, and M. Nicolescu,A New Approach to Hand-Based Authentication, Computer Vision Laboratory, University of Nevada, Reno, 2007.
      • 56. Asanterabi Malima, Erol Özgür, and Müjdat Çetin, A Fast algorithm for vision-based hand gesture recognition for robot control, Faculty of Engineering and Natural Science, Sabancı University, Tuzla, İstanbul, Turkey, 2006.
      40
    • 57. References
      • Greg Welch and Gary Bishop, An Introduction to the Kalman Filter, Department of Computer Science University of North Carolina at Chapel Hill, NC 27599-3175, Updated: Monday July 24, 2006.
      • 58. E. Rosten and T. Drummond, Machine learning for high-speed corner detection, European Conference on Computer Vision, May 2006.
       
      • Rafael C. Gonzalez, Richard E. Woods, Digital Image Processing, Second Edition, 2006.
      • 59. Erik Cuevas, Daniel Zaldivar and Raul Rojas, Kalman filter for vision tracking, Freie Universität Berlin, Institut für Informatik, Takustr. 9, D-14195 Berlin, Germany; Universidad de Guadalajara, Av. Revolucion No. 1500, C.P. 44430, Guadalajara, Jal., Mexico, August 10, 2005.
       
      • Jason J. Corso, Techniques for vision based Human computer interaction, A dissertation submitted to The Johns Hopkins University in conformity with the requirements for the degree of Doctor of Philosophy, Baltimore, Maryland, August 2005.
      41
    • 60. References
      • Marcelo Bernardes Vieira, Luiz Velho, Asla Sá, Paulo Cezar Carvalho, A Camera Projector System for Real-Time 3D Video, Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’05), Instituto de Matemática Pura e Aplicada, Est. Dona Castorina 110, Rio de Janeiro, Brazil, 2005.
      • 61. Ngon T. Truong, Jae-Gyun Gwag, Yong-Jin Park, and Suk-Ha Lee, Genetic Diversity of Soybean Pod Shape Based on Elliptical Fourier Descriptors, Dep. of Plant Science, Seoul National University, Seoul 151-742, Korea; Dep. of Crop Sciences, Can Tho University, Can Tho, Viet Nam; Genetic Resources Div., National Institute of Agricultural Biotechnology, Suwon 441-707, Korea, 2005.
      • 62. E. Rosten and T. Drummond, Fusing Points and Lines for High Performance Tracking, ICCV, 2005.
       
      • Attila Licsár, Tamás Szirányi, Dynamic Training of Hand Gesture Recognition System, Proceedings of the 17th International Conference on Pattern Recognition (ICPR’04), University of Veszprém, Department of Image Processing and Neurocomputing, H-8200 Veszprém, Egyetem u. 10, Hungary; Analogical & Neural Computing Laboratory, Computer & Automation Research Institute, Hungarian Academy of Sciences, H-1111 Budapest, Kende u. 13-17, Hungary, 2004.
       
      42
    • 63. References
      • Attila Licsár, Tamás Szirányi, Lecture Notes in Computer Science, University of Veszprém, Department of Image Processing and Neurocomputing, H-8200 Veszprém, Egyetem u. 10, Hungary; Analogical & Neural Computing Laboratory, Computer & Automation Research Institute, Hungarian Academy of Sciences, H-1111 Budapest, Kende u. 13-17, Hungary, 2004.
      • 64. Stephen Wolf, Color Correction Matrix for Digital Still and Video Imaging Systems, U.S. Department of Commerce, December 2003.
      • 65. Qing Chen, Evaluation of OCR Algorithms for Images with Different Spatial Resolutions and Noises, School of Information Technology and Engineering, Faculty of Engineering, University of Ottawa, Ottawa, Canada, 2003.
      • 66. Vladimir Vezhnevets, Vassili Sazonov, Alla Andreeva, A Survey on Pixel-Based Skin Color Detection Techniques, Graphics and Media Laboratory, Faculty of Computational Mathematics and Cybernetics, Moscow State University, Moscow, Russia, 2003.
       
      • Yasushi Hamada, Nobutaka Shimada, Yoshiaki Shirai, Hand Shape Estimation Using Sequence of Multi-Ocular Images Based on Transition Network, Department of Computer-Controlled Mechanical System, Osaka University, Japan, 2002.
      • 67. Dengsheng Zhang and Guojun Lu, A Comparative Study on Shape Retrieval Using Fourier Descriptors with Different Shape Signatures, Gippsland School of Computing and Information Technology, Monash University, Churchill, Victoria 3842, Australia, 2001.
      43
    • 68. References
      • Douglas Chai and AbdElsalam Bouzerdoum, A Bayesian approach to skin color classification in YCbCr color space, School of Engineering and Mathematics, Edith Cowan University, Australia, 2000.
      • 69. Kenny Teng, Jeremy Ng, Shirlene Lim, Computer Vision Based Sign Language Recognition for Numbers.
      • 70. Nguyen Dang Binh, Enokida Shuichi, Toshiaki Ejima, Real-Time Hand Tracking and Gesture Recognition System, GVIP 05 Conference, 19-21 December 2005, CICC, Cairo, Egypt; Intelligence Media Laboratory, Kyushu Institute of Technology, 680-4 Kawazu, Iizuka, Fukuoka 820, Japan.
       
      • G. Amayeh, G. Bebis, A. Erol, and M. Nicolescu, A New Approach to Hand-Based Authentication, Computer Vision Laboratory, University of Nevada, Reno.
      • 71. A. M. Hamad, Fawzia Shaaban, Mona Gabr, Noha Sayed, Rabab Hussien, Robot Vision, Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt, 2008.
      44
    • 72. Thank You !
      45