A Natural User Interface for 3D Environments
Eric Maslowski, Theodore Hall; UROP Student: Rachael Miller
Navigating the digital world efficiently requires a natural user interface that provides an intuitive way of working with information such as research data, three-dimensional images, and simulations. This project applies the Microsoft Kinect as a mode of interaction with the Michigan Immersive Digital Experience Nexus.

Objective
Analyzing and understanding complex data often requires advanced 3D visualization, but this type of information can be hard to navigate using only a mouse and keyboard. What might an interface designed specifically for 3D data look like? Using the Microsoft Kinect and the Michigan Immersive Digital Experience Nexus, we have begun developing this interface, giving the user the ability to interact intuitively with data using only their physical body.

Concept
The input systems that we use today, such as joysticks and
computer mice, do not provide a natural mode of
interaction with complex information. This problem is
amplified when we use these two-dimensional input systems to navigate within a three-dimensional space, as in the
University of Michigan 3D Lab’s Michigan Immersive Digital
Experience Nexus (MIDEN), an immersive virtual reality
space. In an effort to improve interactivity in the MIDEN, the
Microsoft Kinect has been applied as a way of representing
the physical body in a virtual space. By analyzing the data
received from the Kinect, we have created a real-time digital model of the body. This model acts as an avatar that corresponds to the user’s location in space, allowing the user to interact with virtual objects. Because the MIDEN
offers the user perspective and depth perception, interaction
feels more natural than maneuvering an avatar on a screen;
the user can reach out and directly “touch” objects.
As a supplement to physical interaction, a gesture-based user interface gives the user greater control within simulations (e.g., navigation, selection). Because the hands replace other, more restrictive input devices, the experience becomes more immersive, and the user can focus on data analysis, training, or whatever other goals they may have.
Stage One: Integrate the Kinect with the MIDEN (with Rob Soltesz)
1. Examine space and plan system design
a. Identify ideal location for the Kinect
i. Kinects were best placed in the lower corners of the room
ii. This arrangement is unobtrusive, but still provides reliable tracking
b. Introduce a second Kinect to improve accuracy
2. Develop integration software
a. Implement client-server system to receive data from both Kinects (see the sketch after this list)
i. Because a single process cannot call on multiple Kinects, each device
must be managed by a separate process
ii. Each process passes data through a socket to the rendering engine
b. Consolidate data for an improved skeleton
c. Convert data to the coordinate system of the MIDEN
d. Place digital body in appropriate location
i. Each data point corresponds to the location of a virtual object
ii. These objects meet to form a model of the body in its current position
3. Place Kinects and calibrate system
a. Hide Kinects, cables, etc. from viewer
b. Measure offsets and angles of devices within the space
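To make the data flow concrete, here is a minimal sketch of steps 2a-2d under stated assumptions: a hypothetical skeleton_frames() wrapper around the Kinect SDK, a JSON-over-TCP message format, and simple per-joint averaging for consolidation. The real system feeds the 3D Lab's Jugular rendering engine, whose API is not shown.

# Minimal sketch of the two-process pipeline. `kinect.skeleton_frames()` and
# the message format are hypothetical stand-ins, not the poster's actual code.
import json
import math
import socket

ENGINE_ADDR = ("localhost", 9000)  # assumed rendering-engine socket endpoint

def stream_skeleton(kinect, device_id):
    """Client process: one per Kinect, since a single process cannot
    drive both devices (step 2a.i). Forwards every skeleton frame."""
    with socket.create_connection(ENGINE_ADDR) as sock:
        for frame in kinect.skeleton_frames():  # hypothetical SDK wrapper
            msg = {
                "device": device_id,
                "joints": {j: [p.x, p.y, p.z] for j, p in frame.joints.items()},
            }
            sock.sendall((json.dumps(msg) + "\n").encode("utf-8"))

def consolidate(left_joints, right_joints):
    """Server side (step 2b): merge the two skeletons into one improved
    estimate by averaging each joint reported by both devices."""
    shared = left_joints.keys() & right_joints.keys()
    return {j: [(a + b) / 2.0 for a, b in zip(left_joints[j], right_joints[j])]
            for j in shared}

def to_miden(point, offset, yaw):
    """Step 2c: convert a Kinect-space point to MIDEN coordinates using the
    measured offset and yaw angle of the device (the calibration in 3b)."""
    x, y, z = point
    c, s = math.cos(yaw), math.sin(yaw)
    return [offset[0] + c * x + s * z, offset[1] + y, offset[2] - s * x + c * z]

The one-process-per-device split mirrors step 2a.i: each client owns exactly one Kinect, and the engine-side server is free to merge whatever aligned frames arrive.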
Current Work
We now have a working demonstration of our
application in the MIDEN that allows the user to
interact directly with virtual objects, just as they would
in a physical environment. To create this physical
interactivity, we have placed two Kinect devices in
front of the user to increase the interactive space
within the MIDEN. We then consolidate the data given
by each camera to create a single, improved skeleton that is used as a guide for placing a set of invisible PhysX capsules within the simulation. These capsules are overlaid on the user’s body, giving the user an avatar in Jugular’s virtual space that can interact with other objects under realistic physics. Additionally, we have implemented a small set of simple gesture controls, such as taking control of the system and navigating.
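As an illustration of the capsule overlay described above, the following sketch places one invisible capsule per bone segment of the consolidated skeleton. engine.add_capsule and the joint names are assumptions standing in for the Jugular/PhysX calls actually used.

# Sketch only: overlay one invisible physics capsule per bone so the avatar
# can push and touch scene objects. `engine.add_capsule` is an assumed API.
BONES = [
    ("shoulder_left", "elbow_left"), ("elbow_left", "wrist_left"),
    ("shoulder_right", "elbow_right"), ("elbow_right", "wrist_right"),
    ("hip_left", "knee_left"), ("knee_left", "ankle_left"),
    ("hip_right", "knee_right"), ("knee_right", "ankle_right"),
]  # abridged; a full rig also covers the spine, neck, hands, and feet

def place_capsules(engine, skeleton, radius=0.06):
    """Re-fit the capsules to the consolidated skeleton each frame."""
    for parent, child in BONES:
        a, b = skeleton[parent], skeleton[child]
        # A capsule is defined by its two endpoints and a radius; it stays
        # invisible but still collides with objects in the simulation.
        engine.add_capsule(endpoints=(a, b), radius=radius, visible=False)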
Future Plans
In the near future, we plan to expand the gesture
recognition system and include feedback to make the
environment feel more responsive. We are also
considering implementing a small visible interface
that would give the user a wider array of available
actions. Our goal is to create a simple, intuitive mode
of interaction with the virtual world, allowing the user
to focus on their task, rather than learning to use the
technology. The interface will be applied to the
navigation of large datasets that are too complex to
interpret without 3D visualization techniques.
Stage Two: Implement Gesture Recognition
We wanted to make interaction with the MIDEN feel as
natural and immersive as possible, but still allow the user to
control the virtual environment. By augmenting our
direct-contact-based system with gesture controls, we have
provided the tools to move through and interact with
simulations, data, and models while freeing the user from
restrictive peripheral controllers and trackers.
1. Lay out gesture recognition architecture
a. User’s pose is compared to an array of trigger poses
b. Trigger poses activate and deactivate gestures
c. Dynamic gestures can be performed while triggers are active
2. Identify ideal trigger poses
a. Poses must be simple, distinct, and comfortable
b. Avoid false positives and failed recognition
3. Build pose recognition software
a. Triggers must work dynamically, for all different body types
b. Joint angles and relative positions work best (see the sketch below)
i. Hand over head, 90° arm angles
c. Poses are compared to known gesture library
4. Link gestures to commands within MIDEN
a. Move forward, rotate, etc.
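For steps 3b and 3c, a rough sketch of body-size-invariant trigger detection from joint angles and relative positions follows. The joint names, tolerances, and the pairing of poses with the Track Me and Rotate gestures are illustrative assumptions, not the poster's actual thresholds.

# Sketch only: body-size-invariant trigger detection from joint positions.
# Joint names follow Kinect conventions; tolerance values are guesses.
import math

def angle(a, b, c):
    """Angle at joint b (degrees) between segments b->a and b->c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    norm = math.dist(a, b) * math.dist(c, b)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def is_track_me(skeleton):
    """Example trigger: a hand raised above the head (y is up)."""
    return skeleton["hand_right"][1] > skeleton["head"][1]

def is_rotate(skeleton, tol=15.0):
    """Example trigger: the right arm bent to roughly 90° at the elbow."""
    bend = angle(skeleton["shoulder_right"], skeleton["elbow_right"],
                 skeleton["wrist_right"])
    return abs(bend - 90.0) <= tol

Because both checks compare joints against each other rather than against fixed room coordinates, they work regardless of the user's height, which is what makes joint angles and relative positions the best fit for step 3a.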
Special thanks:
Rob Soltesz, for being a teammate, project sounding board, and guinea pig
Kaelynn Clolinger and Noah Rice, for providing test
subjects of varied heights
[Flow diagram: Left Kinect and Right Kinect -> Apply skeletal smoothing -> Aligned Frame Consolidation -> Place PhysX capsules -> Check for trigger gestures (Track Me, Toggle Mode, Move, Rotate) -> Complete desired action -> Final Result. Accompanying photos: Before Kinect / With Kinect.]