This document outlines the structure and goals of a course on developing gesture-based natural user interfaces using the Kinect sensor. The course will introduce programming environments like Processing, Pure Data, and Unity. Students will complete small individual projects and larger group projects involving bodily interaction. Topics will include skeletal tracking, Kinect functionality, and application examples. The goal is for students to learn how to integrate gesture recognition and body tracking into interactive sonic, musical, 2D, or 3D applications.
2. Course
• develop gesture-based Natural User Interfaces (NUI)
• in various interactive application platforms for sonic, musical, 2D or 3D interfaces
Friday, 4 November 2011
3. Course In General
• the course is not about teaching Processing, Pure Data or Unity
• it is about developing applications involving bodily interaction using these programming environments and the Kinect sensor
• the prerequisite is knowing the basics of one of them; you will learn more about them by doing
4. Schedule
• 1 Introduction
• 2 Kinect with Processing
• 3 Kinect with Pure Data
• 4 Skeletal Tracking with Processing, Pure Data and Kinect SDK
• 1 Small Project Presentation
• 2 Concept Idea Presentation/Group Formation
• 3 Hands on Project Work/Tutoring
• 4 Hands on Project Work/Tutoring
• 1 Hands on Project Work/Tutoring
• 2 Hands on Project Work/Tutoring
• 3 Hands on Project Work/Tutoring
• 4 Final Project Presentation
5. Small Project
• Very simple project
• Done in Processing, PD, Unity or QC
• Some interaction with body data
• Individually
• Presented in the class
6. Final Project
• Concept Development
• Sonic/Musical, 2D, 3D applications (from image viewer to media art)
• Example directions: Virtual Reality, Augmented Reality, Puppetry, Multiuser interactive environment, Augmented interactive dance performance, Gestural musical instrument, Architectural Projection
• Groups of 2 people (strongly suggested)
• Exhibition
• Public space in TaiK
• Venue outside (public space outside)
• Spring 2011 Demoday
7. Programming
• Processing?
• Pure Data?
• Unity?
• Any other?
• Level of knowledge? (0-3)
• Online Survey
8. User Interface
the system by which users (people) interact (communicate) with a machine.
9. Components of UI
• The user interface includes
• hardware (physical) and
• software (logical) components
• It provides a means of:
• Input, allowing the users to manipulate a system, and/or
• Output, allowing the system to indicate the effects of the users' manipulation
10. Interaction
• the means by which the user inputs changes to the system, and the feedback the system supplies in response
11. Interaction
• How do you do?
• How do you feel?
• How do you know?
17. Timeline of UIs
• Command-line Interface
• Graphical User Interface (WIMP)
• Post-WIMP Era
• Tangible User Interface
• Gesture-Based Interface
• Natural User Interface
19. Natural Interaction
• Experience (Human-Computer --> Human-Human)
• People communicate through
• gestures
• expressions
• movements
• People discover by
• looking around
• manipulating physical stuff
20. Why?
• Less cognitive load
• Simpler (for certain applications)
• typing???
• changing slide, navigating in 3D VR
• Direct manipulation
21. Bodily Interaction
• whole body in context
• multi-modality
• human = multi-sensory?
• user as an input modality
• direct input from user's sensory modalities
• manipulate digital data with body
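As a concrete illustration of manipulating digital data with the body, a minimal sketch of a joint-based gesture check, assuming a hypothetical `Joint` position type (real skeletal-tracking APIs such as OpenNI/NITE or the Kinect SDK expose richer structures) and a made-up 5 cm jitter margin:

```java
public class HandRaiseDetector {
    // Minimal 3D joint position (hypothetical; stands in for the
    // joint data a skeletal-tracking API would provide).
    static class Joint {
        final double x, y, z; // meters, y pointing up
        Joint(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
    }

    // A "hand raised" gesture: the hand joint is above the head joint
    // by at least a small margin (to ignore tracking jitter).
    public static boolean isHandRaised(Joint hand, Joint head) {
        final double margin = 0.05; // 5 cm, an assumed jitter threshold
        return hand.y > head.y + margin;
    }

    public static void main(String[] args) {
        Joint head = new Joint(0.0, 1.6, 2.0);
        System.out.println(isHandRaised(new Joint(0.2, 1.8, 2.0), head)); // hand above head
        System.out.println(isHandRaised(new Joint(0.2, 1.1, 2.0), head)); // hand at waist
    }
}
```

In a real application the same threshold test would run once per tracking frame, driving a sonic or visual response.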
23. Put-that-There, 1979
• a pioneering multimodal application that combined speech and gesture recognition at the graphics interface
• Put-that-There, MIT, 1979
• Put-that-There, MIT, 1983
• Richard A. Bolt, "Put-That-There": Voice and Gesture at the Graphics Interface, SIGGRAPH '80
24. Kinect Sensor
• IR laser projector and IR camera
25. Step 1: Depth Sensing
Kinect’s IR-Grid
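The depth stream delivers raw 11-bit disparity values rather than distances. A minimal sketch of converting them to approximate meters, using a community-derived calibration formula (an approximation, not an official Microsoft/PrimeSense specification):

```java
public class KinectDepth {
    // Convert an 11-bit raw disparity value (0-2047) from the Kinect's
    // depth stream into an approximate distance in meters.
    // Constants are from a community calibration of the sensor.
    public static double rawToMeters(int raw) {
        if (raw >= 2047) return 0.0; // 2047 marks "no reading"
        return 0.1236 * Math.tan(raw / 2842.5 + 1.1863);
    }

    public static void main(String[] args) {
        // Sample a few raw values across the sensor's useful range;
        // larger raw disparity values map to larger distances.
        for (int raw : new int[] {600, 800, 1000}) {
            System.out.printf("raw %d -> %.2f m%n", raw, rawToMeters(raw));
        }
    }
}
```

Libraries such as OpenNI can deliver depth already converted to millimeters, in which case no such formula is needed in application code.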
35. Technical Specs of Kinect
• Field of View
• Horizontal field of view: 57 degrees
• Vertical field of view: 43 degrees
• Physical tilt range: ±27 degrees
• Depth sensor range: 1.2 m – 3.5 m (up to 10 m)
• Skeletal Tracking System
• Tracks up to 6 people, including 2 active players
• Tracks 20 joints per active player
• Works in complete darkness; has problems in direct sunlight
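The field-of-view figures translate directly into how wide an interaction space the sensor covers: at distance d, the visible width is 2·d·tan(fov/2). A small sketch of that geometry (the method name and print format are illustrative):

```java
public class KinectCoverage {
    // Width of the area visible to the sensor at a given distance
    // (meters), from the horizontal field of view in degrees:
    // w = 2 * d * tan(fov / 2).
    public static double coverageWidth(double distanceM, double fovDeg) {
        return 2.0 * distanceM * Math.tan(Math.toRadians(fovDeg / 2.0));
    }

    public static void main(String[] args) {
        // At the near (1.2 m) and far (3.5 m) ends of the depth range,
        // using the Kinect's 57-degree horizontal field of view
        System.out.printf("at 1.2 m: %.2f m wide%n", coverageWidth(1.2, 57.0));
        System.out.printf("at 3.5 m: %.2f m wide%n", coverageWidth(3.5, 57.0));
    }
}
```

So a full-body installation placed near the far end of the range has roughly a 3.8 m wide stage to work with, which matters when planning multiuser pieces.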