A Natural User Interface for 3D Environments
Eric Maslowski, Theodore Hall; UROP Student: Rachael Miller
Navigating the digital world efficiently requires a natural user interface that provides an intuitive way of
working with information such as research data, three-dimensional images, and simulations. This project
applies the Microsoft Kinect as a mode of interaction with the Michigan Immersive Digital Experience Nexus.
Objective
Analyzing and understanding complex data often requires advanced 3D visualization,
but this type of information can be hard to navigate using only a mouse and
keyboard. What might an interface designed specifically for 3D data look like?
Using the Microsoft Kinect and the Michigan Immersive Digital Experience Nexus,
we have begun developing this interface, giving the user the ability to
intuitively interact with data using only their physical body.
The input systems that we use today, such as joysticks and
computer mice, do not provide a natural mode of
interaction with complex information. This problem is
amplified when we use these two-dimensional input systems
to navigate within a three-dimensional space, as in the
University of Michigan 3D Lab’s Michigan Immersive Digital
Experience Nexus (MIDEN), an immersive virtual reality
space. In an effort to improve interactivity in the MIDEN, the
Microsoft Kinect has been applied as a way of representing
the physical body in a virtual space. By analyzing the data
received from the Kinect, we have created a real-time
digital model of the body. This model serves as an avatar
that corresponds to the user’s location in space, allowing
them to interact with virtual objects. Because the MIDEN
offers the user perspective and depth perception, interaction
feels more natural than maneuvering an avatar on a screen;
the user can reach out and directly “touch” objects.
As a supplement to physical interaction, a gesture-based
user interface provides the user greater control in
simulations (e.g., navigation and selection). Because the user
works with the hands rather than other, more restrictive input
systems, the experience becomes more immersive, and the user
can focus on data analysis, training, or whatever other goals
they may have.
Concept
We wanted to make interaction with the MIDEN feel as
natural and immersive as possible, but still allow the user to
control the virtual environment. By augmenting our
direct-contact-based system with gesture controls, we have
provided the tools to move through and interact with
simulations, data, and models while freeing the user from
restrictive peripheral controllers and trackers.
Stage One: Integrate the Kinect with the MIDEN (With Rob Soltesz)
1. Examine space and plan system design
a. Identify ideal location for the Kinect
i. Kinects were best placed in the lower corners of the room
ii. This arrangement is unobtrusive, but still provides reliable tracking
b. Introduce a second Kinect to improve accuracy
2. Develop integration software
a. Implement a client-server system to receive data from both Kinects (see the sketch after this list)
i. Because a single process cannot call on multiple Kinects, each device
must be managed by a separate process
ii. Each process passes data through a socket to the rendering engine
b. Consolidate data for an improved skeleton
c. Convert data to the coordinate system of the MIDEN
d. Place digital body in appropriate location
i. Each data point corresponds to the location of a virtual object
ii. These objects meet to form a model of the body in its current position
3. Place Kinects and calibrate system
a. Hide Kinects, cables, etc. from viewer
b. Measure offsets and angles of devices within the space
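To make step 2 concrete, here is a minimal sketch of one per-Kinect client process. It is an assumption-laden illustration, not the production code: the actual system is built against the Kinect SDK and the 3D Lab's Jugular rendering engine, and the port number, calibration values, and read_skeleton stub below are hypothetical placeholders. Each process owns one device, converts joint positions into MIDEN coordinates using the offset and angle measured in step 3b, and streams the result to the rendering engine over a socket.

```python
# Hypothetical per-Kinect client process (one of two). The device API is
# stubbed out; everything Kinect-specific would replace read_skeleton().
import json
import math
import socket

# Calibration for this sensor, measured in step 3b (placeholder values):
# its position within the MIDEN and its yaw about the vertical axis.
SENSOR_OFFSET = (-1.2, 0.0, 1.5)   # meters, in MIDEN coordinates
SENSOR_YAW = math.radians(30.0)    # rotation about the vertical axis

def to_miden(point):
    """Rotate a Kinect-space point by the sensor's yaw, then translate
    it by the sensor's offset, yielding MIDEN coordinates."""
    x, y, z = point
    c, s = math.cos(SENSOR_YAW), math.sin(SENSOR_YAW)
    xr = c * x + s * z
    zr = -s * x + c * z
    ox, oy, oz = SENSOR_OFFSET
    return (xr + ox, y + oy, zr + oz)

def read_skeleton():
    """Stub for the device API: returns {joint_name: (x, y, z)}."""
    return {"head": (0.0, 1.7, 2.0), "hand_right": (0.3, 1.2, 1.8)}

def main():
    # The rendering engine listens on a socket; one connection per sensor,
    # so each Kinect can live in its own process.
    with socket.create_connection(("localhost", 5005)) as engine:
        while True:
            joints = {n: to_miden(p) for n, p in read_skeleton().items()}
            engine.sendall((json.dumps(joints) + "\n").encode())

if __name__ == "__main__":
    main()
```

On the server side, the rendering engine would read one line per frame from each connection and tag it with the sensor it came from before consolidation.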
Stage Two: Implement Gesture Recognition
1. Lay out gesture recognition architecture
a. User’s pose is compared to an array of trigger poses
b. Trigger poses activate and deactivate gestures
c. Dynamic gestures can be performed while triggers are active
2. Identify ideal trigger poses
a. Poses must be simple, distinct, and comfortable
b. Avoid false positives and failed recognition
3. Build pose recognition software (see the sketch after this list)
a. Triggers must work dynamically, for all different body types
b. Joint angles and relative positions work best
i. Hand over head, 90° arm angles
c. Poses are compared to a known gesture library
4. Link gestures to commands within the MIDEN
a. Move forward, rotate, etc.
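As a sketch of steps 1 through 3, the checks below rely on joint angles and relative joint positions rather than absolute coordinates, so the same trigger works across body types. The joint names, the tolerance, and the trigger-to-gesture mapping are assumptions for illustration, not the poster's actual gesture library.

```python
# Hypothetical trigger-pose checks built on joint angles and relative
# positions, so they scale across different body types.
import math

def angle_deg(a, b, c):
    """Angle at joint b (degrees) between segments b->a and b->c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(p * q for p, q in zip(v1, v2))
    norm = (math.sqrt(sum(p * p for p in v1)) *
            math.sqrt(sum(q * q for q in v2)))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def hand_over_head(joints):
    # Relative position: compare joint heights, not absolute coordinates.
    return joints["hand_right"][1] > joints["head"][1]

def right_arm_at_90(joints, tol=15.0):
    # Joint angle: elbow bent to roughly 90 degrees, within a tolerance.
    elbow = angle_deg(joints["shoulder_right"],
                      joints["elbow_right"],
                      joints["wrist_right"])
    return abs(elbow - 90.0) < tol

# Trigger poses gate dynamic gestures: while a trigger is active, the
# corresponding gesture (e.g., Move, Rotate) is interpreted each frame.
TRIGGERS = {"track_me": hand_over_head, "toggle_mode": right_arm_at_90}

def active_triggers(joints):
    return [name for name, test in TRIGGERS.items() if test(joints)]
```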
Current Work
We now have a working demonstration of our
application in the MIDEN that allows the user to
interact directly with virtual objects, just as they would
in a physical environment. To create this physical
interactivity, we have placed two Kinect devices in
front of the user to increase the interactive space
within the MIDEN. We then consolidate the data given
by each camera to create a single, improved skeleton
that can be used as a guide for placing a set of invisible
PhysX capsules within the simulation. These capsules
are overlaid on the user’s body, giving the user an avatar
in Jugular’s virtual space that can interact with other
objects under realistic physics. Additionally, we have
implemented a small array of simple gesture controls,
such as taking control of the system and navigating.
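The poster does not spell out the consolidation rule; one plausible reading, sketched below under that assumption, is a per-joint weighted average in which each sensor's estimate is weighted by its tracking confidence, so a joint occluded from one Kinect is filled in mostly by the other. Both skeletons are assumed to be already converted into MIDEN coordinates.

```python
# Hypothetical aligned-frame consolidation: fuse two skeletons joint by
# joint, weighting each sensor's estimate by its tracking confidence.
def consolidate(skel_left, skel_right):
    """Each input maps joint name -> ((x, y, z), confidence in [0, 1])."""
    fused = {}
    for name in skel_left.keys() & skel_right.keys():
        (pl, wl) = skel_left[name]
        (pr, wr) = skel_right[name]
        total = wl + wr
        if total == 0.0:
            continue  # neither sensor tracked this joint reliably
        fused[name] = tuple((wl * a + wr * b) / total
                            for a, b in zip(pl, pr))
    return fused
```

The fused joints would then drive the placement of the invisible PhysX capsules described above.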
Future Plans
In the near future, we plan to expand the gesture
recognition system and include feedback to make the
environment feel more responsive. We are also
considering implementing a small visible interface
that would give the user a wider array of available
actions. Our goal is to create a simple, intuitive mode
of interaction with the virtual world, allowing the user
to focus on their task, rather than learning to use the
technology. The interface will be applied to the
navigation of large datasets that are too complex to
interpret without 3D visualization techniques.
Special thanks:
Rob Soltesz, for being a teammate, project sounding board,
and guinea pig
Kaelynn Clolinger and Noah Rice, for providing test
subjects of varied heights
[Figure: data-flow diagram. Frames from the Left Kinect and Right Kinect each
receive skeletal smoothing, then pass through aligned-frame consolidation; the
consolidated skeleton is used to place PhysX capsules and to check for trigger
gestures (Track Me, Toggle Mode, Move, Rotate), after which the desired action
is completed to produce the final result. Insets show the scene before Kinect
and with Kinect.]