Bodily Interaction Lecture 1 Slides

Published on 24.03.2011
Lecture slides of the Bodily Interaction course at Media Lab Helsinki.

  1. Bodily Interaction. Lecture 1, 24.03.2011. Ferhat Şen. (Friday, 25 March 2011)
  2. Course: develop gesture-based Natural User Interfaces (NUI) in various interactive application platforms for sonic, musical, 2D, or 3D interfaces.
  3. Course in general: the course does not teach Processing, Pure Data, or Unity as such; the goal is to develop applications involving bodily interaction using these programming environments and the Kinect sensor. The prerequisite is knowing the basics of one of them; you will learn more about them by doing.
  4. Schedule:
     • 24th Mar 2011, 12-14: Introduction
     • 28th Mar 2011, 13-15: Kinect with Processing (joint session with Processing Club)
     • 31st Mar 2011, 12-14: Kinect with Pure Data (4th-floor Mac classroom)
     • WS week: optional sessions on Unity
     • 11th Apr 2011, 13-15: Skeletal tracking with Processing and Pure Data
     • 14th Apr 2011, 12-14: Small project presentation / concept idea presentation / group formation
     • 18th Apr 2011, 13-15: Hands-on project work / tutoring
     • 21st Apr 2011, 12-14: Hands-on project work / tutoring
     • 28th Apr 2011, 12-14: Hands-on project work / tutoring
     • 2nd May 2011, 13-15: Final project presentation (group work)
     Paja is booked for you for weeks 16-18, between 9-17 (except 18.04.2011).
  5. Small project: a very simple project, done in Processing, PD, Unity, or QC, with some interaction with body data; done individually and presented in class.
  6. Final project: concept development. Possible directions: virtual reality, augmented reality, puppetry, multiuser interactive environments, sonic/musical, 2D, or 3D applications, augmented interactive dance performance, gestural musical instruments (from image viewer to media art), architectural projection. Groups of 2 people (strongly suggested). Exhibition in a public space in TaiK or a venue outside; Spring Demoday 26.05.
  7. Programming: Processing? Pure Data? Unity? Any other? Level of knowledge? (0-5)
  8. User Interface: the system by which users (people) interact (communicate) with a machine.
  9. Components of UI: the user interface includes hardware (physical) and software (logical) components. These provide a means of input, allowing the users to manipulate the system, and/or output, allowing the system to indicate the effects of the users' manipulation.
  10. Interaction: the means by which the user inputs changes to the system, and the feedback supplied by the system.
  11. Interaction: How do you do? How do you feel? How do you know?
  12. Timeline of UIs: Command-line Interface
  13. (image slide)
  14. Timeline of UIs: Command-line Interface → Graphical User Interface
  15. The WIMP paradigm: Window, Icon, Menu, Pointer
  16. (image slide)
  17. Timeline of UIs: Command-line Interface → Graphical User Interface (WIMP) → the post-WIMP era: Tangible User Interface, Gesture-Based Interface, Natural User Interface
  18. (images: tangible, gestural)
  19. Natural interaction: the experience shifts from human-computer toward human-human. People communicate through gestures, expressions, and movements; people discover by looking around and manipulating physical stuff.
  20. Why? Less cognitive load; simpler for certain applications (typing, changing slides, navigating in 3D VR); direct manipulation.
  21. Bodily interaction: the whole body in context; multi-modality (the human is multi-sensory); the user as an input modality; direct input from the user's sensory modalities; manipulating digital data with the body.
  22. Bodily input modalities: skeletal (hand, fingers, head, leg, feet); sonic (speech, voice characteristics such as volume and frequency, body-made sounds); biofeatures (breath, sweat, heartbeat, blood pressure); touch; facial muscular activation; presence in space; multiple bodies.
  23. Put-That-There, 1979: voice and gesture at the graphics interface, a pioneering multimodal application combining speech and gesture recognition. Demos: Put-That-There, MIT, 1979; Put-That-There, MIT, 1983. Richard A. Bolt, "Put-That-There": Voice and Gesture at the Graphics Interface (pdf), SIGGRAPH '80.
  24. Kinect sensor: IR laser projector, IR camera
  25. How does it work? (watch the video)
  26. Kinect's IR grid (watch the video)
  27. (image slide)
  28. (image slide)
  29. Two methods to track: depth filtering and skeletal tracking
  30. Depth filtering
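Depth filtering as used here can be sketched in a few lines: keep only the depth pixels that fall inside a near/far band (for example, the region where a player stands) and treat everything else as background. This is a minimal illustration, not tied to any real Kinect library; the array layout (one depth value per pixel, in millimetres, 0 meaning "no reading") is an assumption.

```java
// Minimal depth-filtering sketch (illustrative, not a real Kinect API):
// keep pixels whose depth lies inside a near/far band, zero the rest.
public class DepthFilter {
    // depthMM: one depth value per pixel, in millimetres
    static int[] filter(int[] depthMM, int nearMM, int farMM) {
        int[] out = new int[depthMM.length];
        for (int i = 0; i < depthMM.length; i++) {
            int d = depthMM[i];
            // outside the band -> treated as background (0)
            out[i] = (d >= nearMM && d <= farMM) ? d : 0;
        }
        return out;
    }
}
```

In a Processing sketch the same loop would run over the sensor's depth image each frame, and the surviving pixels would drive the interaction.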
  31. Skeletal tracking (watch the video)
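Skeletal tracking reports named joint positions (about 20 per active player, per the specs later in the deck), and gestures are then derived by comparing joints. A toy example, with joint names and the coordinate convention (y grows upward, metres) chosen here for illustration:

```java
// Toy gesture detection on skeletal-tracking output: a hand counts as
// "raised" once its joint is clearly above the head joint.
// Coordinate convention (y up, in metres) is an assumption.
public class RaisedHand {
    static boolean isRaised(float headY, float handY, float marginM) {
        // the margin avoids flicker when hand and head are level
        return handY > headY + marginM;
    }
}
```

Real trackers deliver full 3D joint coordinates every frame; the same comparison idea extends to arbitrary joint pairs and axes.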
  32. (image slide)
  33. Technical specs of Kinect:
     • Field of view: horizontal 57 degrees, vertical 43 degrees; physical tilt range ±27 degrees; depth sensor range 1.2 m - 3.5 m (up to 10 m)
     • Skeletal tracking system: tracks up to 6 people, including 2 active players, with 20 joints per active player
     • Works in complete darkness
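The field-of-view figures above translate directly into how much space an installation can cover: at distance d, a field of view of angle fov spans a width of 2 · d · tan(fov / 2). A quick sanity-check sketch (values approximate):

```java
// Width covered by a given field of view at a given distance:
// width = 2 * d * tan(fov / 2). With the 57-degree horizontal FOV,
// a player 2 m away is seen across roughly 2.17 m.
public class Coverage {
    static double widthAt(double distanceM, double fovDegrees) {
        return 2.0 * distanceM * Math.tan(Math.toRadians(fovDegrees / 2.0));
    }
}
```

This is useful when planning the exhibition space: the usable interaction zone is the intersection of this cone with the 1.2 m - 3.5 m depth range.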
  34. (image slide)
  35. (images: IR projector, IR camera)
  36. Examples: http://nui.mlog.taik.fi/interesting-projects
