GRAND CHALLENGES FOR
MIXED REALITY
Mark Billinghurst
mark.billinghurst@auckland.ac.nz
October 19th 2021
1967 – IBM 1401 – half of the computers in the world, $10,000/month to run
Sketchpad (1963)
• Ivan Sutherland
• First interactive graphics
• Pen based input
The ultimate display would, of course, be a room within
which the computer can control the existence of matter. A
chair displayed in such a room would be good enough to sit
in. Handcuffs displayed in such a room would be confining,
and a bullet displayed in such a room would be fatal.
Sutherland, Ivan. "The ultimate display." (1965).
Sutherland Display (1968)
https://www.youtube.com/watch?v=NtwZXGprxag
Star Trek – HoloDeck (1974)
Star Wars – Hologram (1977)
My First VR Experience - 1990
• Silicon Graphics Reality Engine
• 500,000 polygons/second
• VPL Eyephone HMD
• 320 x 240 resolution
• Magnetic tracking
• Glove input
• Expensive - $250,000+
Star Wars – Collaborative AR
1999 – Shared Space Demo
• Face-to-face collaborative AR, like the Star Wars concept
Shared Space Demo
Marker Based Tracking: ARToolKit (1999)
https://github.com/artoolkit
1998: SGI O2 vs. 2008: Nokia N95
• SGI O2 (1998): CPU 300 MHz, HDD 9 GB, RAM 512 MB, VGA camera at 30 fps, graphics 500K poly/sec
• Nokia N95 (2008): CPU 332 MHz, HDD 8 GB, RAM 128 MB, VGA camera at 30 fps, graphics 2M poly/sec
By 2008, phones had the same hardware as used in the Shared Space demo
2005: Mobile AR version of Shared Space
• AR Tennis
• Shared AR content
• Two user game
• Audio + haptic feedback
• Bluetooth networking
ARTennis Demo
VR and AR Today
• Large growing market
• > $25 Billion USD in 2020
• Hundreds of millions of users
• Many available devices
• HMD, phones, tablets, HUDs
• Robust developer tools
• Vuforia, MRTK, Unity, etc
• Large number of applications
• > 150K developers, > 100K apps
• Strong research/business communities
• ISMAR, IEEE VR, AWE conferences, AugmentedReality.org, etc
https://www.youtube.com/watch?v=aUPMDwypBkA
https://www.youtube.com/watch?v=xRSF31dbLBU
Future Visions of VR: Ready Player One
• https://www.youtube.com/watch?v=LiK2fhOY0nE
Today vs. Tomorrow (VR in 2021 → VR in 2045)
• Graphics: High quality → Photo-realistic
• Display: 110–150 degrees → Total immersion
• Interaction: Handheld controller / some gesture → Full gesture/body/gaze
• Navigation: Limited movement → Natural
• Multiuser: Few users → Millions of users
https://www.youtube.com/watch?v=gg-ZakMEwDU
“.. the technologies that will significantly
affect our lives over the next 10 years
have been around for a decade. The
future is with us ... The trick is learning
how to spot it”
October 2004
Bill Buxton
Key Technologies for MR Systems
• Display
• Stimulate visual, hearing/touch sense
• Tracking
• Changing viewpoint, registered content
• Interaction
• Supporting user input
DISPLAY
• Past
• Bulky head-mounted displays
• Current
• Handheld, lightweight head-mounted displays
• Future
• Projected AR
• Wide-FOV see-through displays
• Retinal displays
• Contact lenses
Evolution in Displays
North Focals (2020)
• https://www.bynorth.com
• Socially acceptable smart glasses
• $599 USD, small field of view
• Ring input device
Wide FOV See-Through (3+ years)
• Waveguide techniques
• Wider FOV
• Thin see through
• Socially acceptable
• Pinlight Displays
• LCD panel + point light sources
• 110 degree FOV
• UNC/Nvidia
Lumus DK40
Maimone, A., Lanman, D., Rathinavel, K., Keller, K., Luebke, D., & Fuchs, H. (2014). Pinlight displays:
wide field of view augmented reality eyeglasses using defocused point light sources. In ACM SIGGRAPH
2014 Emerging Technologies (p. 20). ACM.
https://www.youtube.com/watch?v=P407DFm0PFQ
Pinlight Display Demo
Kura Gallium Glasses (2020)
• https://www.kura.tech/
• "Pinpoint" wide field-of-view effect: 150° FOV
• Resolution in the 4K–8K range
• Highly transparent display
Contact Lens (15 + years)
• Contact Lens only
• Unobtrusive
• Significant technical challenges
• Power, data, resolution
• Babak Parviz (2008)
• MojoVision
• Mojo Lens
• Prototype smart contact lens
• https://www.mojo.vision/
http://spectrum.ieee.org/biomedical/bionics/augmented-reality-in-a-contact-lens/
https://www.youtube.com/watch?v=TzVAMRe3kmA
INTERACTION
Evolution of Interaction
• Past
• Limited interaction
• Viewpoint manipulation
• Present
• Screen-based, simple gesture
• Tangible interaction
• Future
• Natural gesture, Multimodal
• Intelligent Interfaces
• Physiological/Sensor based
Natural Gesture
• Freehand gesture input
• Depth sensors for gesture capture
• Move beyond simple pointing
• Rich two handed gestures
• E.g. Microsoft Research hand tracker
• 3D hand tracking, 30 fps, single sensor
• Commercial Systems
• Hololens2, Oculus, Intel, MagicLeap, etc
Sharp, T., Keskin, C., Robertson, D., Taylor, J., Shotton, J., ... & Izadi, S. (2015). Accurate, Robust, and Flexible Real-time Hand Tracking. In Proc. CHI 2015.
https://www.youtube.com/watch?v=LblxKvbfEoo
Multi-Scale Gesture
• Combine different gesture types
• In-air gestures – natural but imprecise
• Micro-gesture – fine scale gestures
• Gross motion + fine tuning interaction
Ens, B., Quigley, A., Yeo, H. S., Irani, P., Piumsomboon, T., & Billinghurst, M. (2018). Counterpoint:
Exploring Mixed-Scale Gesture Interaction for AR Applications. In Extended Abstracts of the 2018 CHI
Conference on Human Factors in Computing Systems (p. LBW120). ACM.
https://www.youtube.com/watch?v=TRfqNtt1VxY&t=23s
Eye Tracking Input
• HMDs with integrated eye-tracking
• Hololens2, MagicLeap One
• Research questions
• How can eye gaze be used for interaction?
• What interaction metaphors are natural?
Eye Gaze Interaction Methods
• Gaze for interaction
• Implicit vs. explicit input
• Exploring different gaze interaction
• Duo reticles – use eye saccade input
• Hardware
• HTC Vive + Pupil Labs integrated eye-tracking
Piumsomboon, T., Lee, G., Lindeman, R. W., & Billinghurst, M. (2017, March). Exploring natural eye-gaze-based
interaction for immersive virtual reality. In 3D User Interfaces (3DUI), 2017 IEEE Symposium on (pp. 36-39). IEEE.
Duo-Reticles (DR)
• Two reticles: a Real-time (Eye-gaze) Reticle (RR) that follows the current gaze, and an Inertial Reticle (IR) that trails behind it
• When RR and IR are aligned (stages A-1 to A-3), an alignment timer counts down; when it completes, the selection is made
Duo-Reticles (DR) – Video 1
Physiological Sensor Input
• Using physiological sensors for implicit input
• Systems that recognize user intent/activity
• EEG
• Measuring brain activity
• EMG
• Measuring muscle activity
https://www.youtube.com/watch?v=K34p7RwjWt0
HP Reverb G2 Omnicept
• Wide FOV, high resolution, best in class VR display
• Eye tracking, heart rate, pupillometry, and face camera
NextMind
• EEG attachment for AR/VR HMD
• 9 dry EEG electrodes
• https://www.next-mind.com/
https://www.youtube.com/watch?v=yfzDcfQpdp0
Multimodal Input
• Combine gesture and speech input
• Gesture good for qualitative input
• Speech good for quantitative input
• Support combined commands
• “Put that there” + pointing
• E.g. HIT Lab NZ multimodal input
• 3D hand tracking, speech
• Multimodal fusion module
• Tasks completed faster with MMI, with fewer errors
Billinghurst, M., Piumsomboon, T., & Bai, H. (2014). Hands in Space: Gesture Interaction with
Augmented-Reality Interfaces. IEEE computer graphics and applications, (1), 77-80.
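The "put that there" idea can be sketched as time-aligned fusion: resolve each deictic word to the pointing event nearest in time. This is a minimal illustration under invented timestamps, not the HIT Lab NZ fusion module:

```python
def fuse(speech_events, point_events, max_gap=0.5):
    """Minimal multimodal fusion sketch: `speech_events` are (time, word)
    pairs, `point_events` are (time, target) pairs. Each deictic word
    ('that', 'this', 'there', 'here') is resolved to the pointing target
    closest in time, provided it falls within `max_gap` seconds."""
    DEICTIC = {"that", "this", "there", "here"}
    resolved = []
    for t_word, word in speech_events:
        if word in DEICTIC:
            # pick the temporally nearest pointing event
            t_pt, target = min(point_events, key=lambda p: abs(p[0] - t_word))
            resolved.append(target if abs(t_pt - t_word) <= max_gap else None)
        else:
            resolved.append(word)
    return resolved

speech = [(0.0, "put"), (0.4, "that"), (1.2, "there")]
points = [(0.5, "red_block"), (1.1, "table")]
print(fuse(speech, points))   # ['put', 'red_block', 'table']
```

Real fusion engines also handle speech-recognition lattices and gesture ambiguity; the time-window matching shown here is just the core alignment step.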
Intelligent Interfaces
• Move from explicit to implicit input
• Recognize user behaviour
• Provide adaptive feedback
• Move beyond check-lists of actions
• E.g. AR + Intelligent Tutoring
• Constraint based ITS + AR
• PC Assembly (Westerfield, 2015)
• 30% faster, 25% better retention
Westerfield, G., Mitrovic, A., & Billinghurst, M. (2015). Intelligent Augmented Reality Training for
Motherboard Assembly. International Journal of Artificial Intelligence in Education, 25(1), 157-172.
https://www.youtube.com/watch?v=vifHh4WjEFE
TRACKING
Evolution of Tracking
• Past
• Location based, marker based,
• magnetic/mechanical
• Present
• Image based, hybrid tracking
• Future
• Ubiquitous
• Model based
• Environmental
Model Based Tracking
• Track from known 3D model
• Use depth + colour information
• Match input to model template
• Use CAD model of targets
• Recent innovations
• Learn models online
• Tracking from cluttered scene
• Track from deformable objects
Hinterstoisser, S., Lepetit, V., Ilic, S., Holzer, S., Bradski, G., Konolige, K., & Navab, N. (2013). Model based training, detection
and pose estimation of texture-less 3D objects in heavily cluttered scenes. In Computer Vision–ACCV 2012 (pp. 548-562).
Deformable Object Tracking
https://www.youtube.com/watch?v=t2vqsitWLKs
Environmental Tracking (3+ yrs)
• Environment capture
• Use depth sensors to capture scene & track from model
• InfiniTAM (www.robots.ox.ac.uk/~victor/infinitam/)
• Real time scene capture, dense or sparse capture, open source
• iPad Pro LiDAR
• Scene scanning up to 5m
https://www.youtube.com/watch?v=qZY6y1IVIfw
Fusion4D (2016)
• Real-time capture and dynamic reconstruction
• RGBD sensors + incremental reconstruction
Dou, M., Khamis, S., Degtyarev, Y., Davidson, P., Fanello, S. R., Kowdle, A., ... & Izadi, S. (2016). Fusion4d:
Real-time performance capture of challenging scenes. ACM Transactions on Graphics (TOG), 35(4), 1-13.
Fusion4D Demo
https://www.youtube.com/watch?v=rnz0Kt36mOQ
Wide Area Outdoor Tracking
• Process
• Combine panoramas into point cloud model (offline)
• Initialize camera tracking from point cloud
• Update pose by aligning camera image to point cloud
• Accurate to 25 cm, 0.5 degree over very wide area
Ventura, J., & Hollerer, T. (2012). Wide-area scene mapping for mobile visual tracking. In Mixed
and Augmented Reality (ISMAR), 2012 IEEE International Symposium on (pp. 3-12). IEEE.
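The pose-update step above can be illustrated with a toy translation-only alignment between map points and observed points. Real systems like Ventura & Hollerer's solve full 6-DoF pose with robust feature matching; this least-squares sketch only shows the alignment idea:

```python
def estimate_translation(map_pts, obs_pts):
    """Toy pose update: given corresponding 2D points from the point-cloud
    map and the current camera observation, estimate the camera offset as
    the least-squares (mean) translation between the two point sets."""
    n = len(map_pts)
    dx = sum(o[0] - m[0] for m, o in zip(map_pts, obs_pts)) / n
    dy = sum(o[1] - m[1] for m, o in zip(map_pts, obs_pts)) / n
    return (dx, dy)

cloud = [(0.0, 0.0), (2.0, 0.0), (0.0, 3.0)]
seen  = [(1.0, 0.5), (3.0, 0.5), (1.0, 3.5)]   # cloud shifted by (1.0, 0.5)
print(estimate_translation(cloud, seen))        # (1.0, 0.5)
```

In practice the correspondences come from feature matching against the point cloud, and the solve is repeated each frame to keep the pose registered.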
Wide Area Outdoor Tracking
https://www.youtube.com/watch?v=8ZNN0NeXV6s
AR Cloud Based Tracking
• AR Cloud
• a machine-readable 1:1 scale model of the real world
• processing recognition/tracking data in the cloud
• The cloud can be created from the input of multiple devices
• Store key visual features in the cloud; stitch together features from multiple devices
• Retrieve for tracking/interaction
• AR Cloud Companies
• 6D.ai, Vertical.ai, Ubiquity6, etc
6D.ai Demo
• https://www.youtube.com/watch?v=AwwU14gllS0
PERCEPTION AND
NEUROSCIENCE
AR/VR as Perceptual Phenomena
• Virtual Reality
• Do I perceive myself as being in the Virtual Environment?
• Sense of Presence
• Augmented Reality
• Is that virtual object part of my real world?
• Sense of Object Presence
• What perceptual cues create a sense of Presence/Object Presence?
• How can Presence/Object Presence be measured?
Measuring Presence
• Presence is very subjective, so how can it be measured?
• Subjective Measures
• Self report questionnaire
• University College London Questionnaire (Slater 1999)
• Witmer and Singer Presence Questionnaire (Witmer 1998)
• ITC Sense Of Presence Inventory (Lessiter 2000)
• Continuous measure
• Person moves slider bar in VE depending on Presence felt
• Objective Measures
• Behavioural
• reflex/flinch measure, startle response
• Physiological measures
• change in heart rate, skin conductance, skin temperature
Presence Slider
Using Neuro-Physiological Presence Measures
• Put people in High Presence/Low Presence VR
• Measure physiological cues
• EEG, ECG, GSR
• Measure subjective cues
• Presence surveys
• SUS, Witmer-Singer
• Correlate subjective and physiological results
Dey, A., Phoon, J., Saha, S., Dobbins, C., & Billinghurst, M. (2020, November). A Neurophysiological Approach for
Measuring Presence in Immersive Virtual Environments. In 2020 IEEE International Symposium on Mixed and Augmented
Reality (ISMAR) (pp. 474-485). IEEE.
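Correlating the subjective and physiological results comes down to something like a Pearson correlation across participants. The sketch below uses invented data (the scores and heart-rate values are not from the study):

```python
from math import sqrt

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length
    series, e.g. subjective presence scores vs. a physiological measure."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: SUS presence scores vs. heart-rate change (bpm) for
# six participants; in a high-presence VE both tend to rise together.
sus_scores = [2.1, 3.0, 3.4, 4.2, 4.8, 5.5]
hr_change  = [1.0, 2.5, 2.0, 4.0, 5.5, 6.0]
r = pearson(sus_scores, hr_change)
print(round(r, 2))   # a strong positive correlation
```

A high r between questionnaire scores and a physiological signal is what justifies using that signal as an objective presence proxy.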
Results
• Significant difference in subjective presence scores between HP/LP VE
• Significant difference in EEG power and heart rate between HP/LP VE
EEG Power Heart Rate
Perception Based Graphics
• Eye Physiology
• Cones in the eye's centre (fovea) = colour vision; rods in the periphery = motion and B+W vision
• Foveated Rendering
• Use eye tracking to render at the highest resolution where the user is looking
• Reduces the required graphics throughput
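The core of foveated rendering is picking a shading level from the angular distance between a pixel and the gaze point. The thresholds and pixels-per-degree figure below are illustrative, not taken from any shipping headset:

```python
import math

def shading_level(gaze, pixel, fovea_deg=5.0, mid_deg=15.0, px_per_deg=40.0):
    """Return a render-resolution level for `pixel` given the current gaze
    point (both in screen pixels): full resolution inside the foveal region,
    half in the mid-periphery, quarter beyond that."""
    # angular eccentricity of the pixel from the gaze point
    ecc_deg = math.dist(gaze, pixel) / px_per_deg
    if ecc_deg <= fovea_deg:
        return 1.0        # full resolution at the fovea
    elif ecc_deg <= mid_deg:
        return 0.5        # half resolution in the mid-periphery
    return 0.25           # quarter resolution in the far periphery

gaze = (960, 540)
print(shading_level(gaze, (970, 540)))    # near the gaze point -> 1.0
print(shading_level(gaze, (1400, 540)))   # mid-periphery -> 0.5
print(shading_level(gaze, (100, 540)))    # far periphery -> 0.25
```

GPU implementations apply the same idea per tile via variable-rate shading rather than per pixel, but the gaze-eccentricity lookup is the same.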
Foveated Rendering
• https://www.youtube.com/watch?v=lNX0wCdD2LA
Making AR Content Appear Real
• Appearance
• Lighting
• Audio cues
• Occlusion
• Touch/Haptic feedback
• Etc..
Key Perceptual Issues in AR
• Classification of Perceptual Issues
• Environment, Capturing, Augmentation
• Display device, User
Kruijff, E., Swan, J. E., & Feiner, S. (2010, October). Perceptual issues in augmented reality revisited.
In 2010 IEEE International Symposium on Mixed and Augmented Reality (pp. 3-12). IEEE.
Example Perceptual Issues
SOCIAL AND ETHICAL ISSUES
Social Acceptance
• People don’t want to look silly
• Only 12% of 4,600 adults would be willing to wear AR glasses
• 20% of mobile AR browser users experience social issues
• Acceptance more due to Social than Technical issues
• Needs further study (ethnographic, field tests, longitudinal)
TAT AugmentedID
https://www.youtube.com/watch?v=tb0pMeg1UN0
Ethical Issues
• Persuasive Technology
• Affecting emotions
• Behaviour modification
• Privacy Concerns
• Facial recognition
• Space capture
• Personal data
• Safety Concerns
• Sim sickness, Distraction
• Long term effects
Pase, S. (2012). Ethical considerations in augmented reality applications. In Proceedings of the International Conference on
e-Learning, e-Business, Enterprise Information Systems, and e-Government (EEE) (p. 1). The Steering Committee of The
World Congress in Computer Science, Computer Engineering and Applied Computing (WorldComp).
COLLABORATION
Using AR/VR for Enhanced Collaboration
• Changing perspective, Sharing views
• Copying spaces, Changing scale
• Copy bodies, Sharing non-verbal cues
AR for Remote Collaboration
• Camera + Processing + AR Display + Connectivity
• First person Ego-Vision Collaboration
Local AR view vs. remote expert view
Shared Sphere – 360 Video Sharing
• Host user shares a live 360 video sphere with the guest user
Lee, G. A., Teo, T., Kim, S., & Billinghurst, M. (2017). Mixed reality collaboration through sharing a
live panorama. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (pp. 1-4).
https://www.youtube.com/watch?v=FFF6qP5Ap44
3D Live Scene Capture
• Use cluster of RGBD sensors
• Fuse together 3D point cloud
View Sharing Evolution
• Increased immersion
• Improved scene understanding
• Better collaboration
2D video → 360 panorama → 3D scene capture
Sharing Virtual Communication Cues
• Collaboration between AR and VR
• Gaze Visualization Conditions
• Baseline, FoV, Head-gaze, Eye-gaze
https://www.youtube.com/watch?v=K_afCWZtExk
Multi-Scale Collaboration
• Changing the user’s virtual body scale
Sharing a View
Sharing: Separating Cues from Body
• What happens when you can’t see your colleague/agent?
Piumsomboon, T., Lee, G. A., Hart, J. D., Ens, B., Lindeman, R. W., Thomas, B. H., & Billinghurst, M. (2018, April). Mini-me: An adaptive
avatar for mixed reality remote collaboration. In Proceedings of the 2018 CHI conference on human factors in computing systems (pp. 1-13).
Collaborator in view vs. collaborator out of view
Mini-Me Communication Cues in MR
• When the user loses sight of a collaborator, a Mini-Me avatar appears
• A miniature avatar placed in the real world
• The Mini-Me points to shared objects and shows communication cues
• Redirected gaze and gestures
https://www.youtube.com/watch?v=YrdCg8zz57E
Scaling Up
• Supporting Large Groups of People
• Social VR spaces
• Large scale events
• Hybrid Interfaces
• AR/VR users with desktop/mobile
• Persistent virtual worlds
Facebook Horizon/Workrooms
• Key features
• Avatar creation, multi-user social VR
• Large scale 3D virtual spaces
• Meeting support, using real devices
• In world content creation/authoring
https://www.youtube.com/watch?v=Is8eXZco46Q
Hybrid Interfaces - Spatial
https://www.youtube.com/watch?v=PG3tQYlZ6JQ
Research Issues
• Avatars
• How to easily create?
• How realistic should they be?
• How can you communicate social cues?
• Hybrid Interfaces
• How can you provide equity across different devices?
• Social Presence
• How can you objectively measure Social Presence?
• How can AR/VR cues be used to increase Social Presence?
Collaboration Technology Trends
1. Improved Content Capture
• Move from sharing faces to sharing places
2. Increased Network Bandwidth
• Sharing natural communication cues
3. Implicit Understanding
• Recognizing behaviour and emotion
Natural Collaboration + Implicit Understanding + Experience Capture → Empathic Computing
Empathic Computing
Can we develop systems
that allow us to share what
we are seeing, hearing and
feeling with others?
Piumsomboon, T., Lee, Y., Lee, G. A., Dey, A., & Billinghurst, M. (2017). Empathic Mixed
Reality: Sharing What You Feel and Interacting with What You See. In Ubiquitous Virtual
Reality (ISUVR), 2017 International Symposium on (pp. 38-41). IEEE.
Empathy Glasses (CHI 2016)
• Combine eye-tracking, display, and face expression sensing
• Implicit cues – eye gaze, face expression
Hardware: Pupil Labs eye tracker + Epson BT-200 glasses + AffectiveWear face-expression sensor
Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of
the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
Remote Collaboration
• Eye gaze pointer and remote pointing
• Face expression display
• Implicit cues for remote collaboration
https://www.youtube.com/watch?v=CdgWVDbMwp4
Brain Synchronization
• Measure EEG of people collaborating
• Brain activity synchronizes
• More synchronization = better collaboration
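One common way to quantify inter-brain synchrony is the phase-locking value (PLV) between two EEG phase series. This is a generic sketch of the measure, not the analysis pipeline used in the study above:

```python
import cmath
import math
import random

def plv(phases_a, phases_b):
    """Phase-locking value between two phase time-series (in radians):
    1.0 for perfectly phase-locked signals, near 0 for unrelated ones.
    Computed as the magnitude of the mean phase-difference phasor."""
    n = len(phases_a)
    s = sum(cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b))
    return abs(s) / n

n = 500
base = [0.2 * k for k in range(n)]
locked = [p + 1.0 for p in base]        # constant phase offset: fully locked
random.seed(0)
scrambled = [random.uniform(0, 2 * math.pi) for _ in range(n)]

print(round(plv(base, locked), 3))      # 1.0 (perfect synchrony)
print(plv(base, scrambled) < 0.3)       # True (low synchrony)
```

In an EEG pipeline the phases would come from band-pass filtering and a Hilbert transform per channel; PLV is then tracked over time to see synchrony grow as collaboration improves.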
Brain Synchronization
(Finger-pointing task: synchrony pre-training at session start vs. post-training at session end)
Brain Synchronization in VR
Technology Trends
• Advanced displays
• Wide FOV, high resolution
• Real time space capture
• 3D scanning, stitching, segmentation
• Natural gesture interaction
• Hand tracking, pose recognition
• Robust eye-tracking
• Gaze points, focus depth
• Emotion sensing/sharing
• Physiological sensing, emotion mapping
Empathic Tele-Existence
• Move from Observer to Participant
• Explicit to Implicit communication
• Experiential collaboration – doing together
The Metaverse
• Neal Stephenson's "Snow Crash"
• VR successor to the internet
• The Metaverse is the convergence of:
• 1) virtually enhanced physical reality
• 2) physically persistent virtual space
• Metaverse Roadmap
• http://metaverseroadmap.org/
Metaverse Dimensions
CONCLUSIONS
Conclusions
• AR/VR/MR is becoming commonly available
• Significant advances over 50+ years
• To achieve Sutherland's vision, research is needed in
• Display, Tracking, Input
• New MR technologies will enable this to happen
• Display devices, Interaction, Tracking technologies
• There are still significant areas for research
• Social Acceptance, Perception, Collaboration, Etc.
More Information
Billinghurst, M. (2021). Grand
Challenges for Augmented
Reality. Frontiers in Virtual Reality, 2, 12.
www.empathiccomputing.org
@marknb00
mark.billinghurst@auckland.ac.nz
Empathic Computing: Delivering the Potential of the MetaverseMark Billinghurst
 
Empathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseEmpathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseMark Billinghurst
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsMark Billinghurst
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARMark Billinghurst
 

More from Mark Billinghurst (8)

Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR Experiences
 
Empathic Computing: Delivering the Potential of the Metaverse
Empathic Computing: Delivering  the Potential of the MetaverseEmpathic Computing: Delivering  the Potential of the Metaverse
Empathic Computing: Delivering the Potential of the Metaverse
 
Empathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseEmpathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the Metaverse
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive Analytics
 
Metaverse Learning
Metaverse LearningMetaverse Learning
Metaverse Learning
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise AR
 

Recently uploaded

Pigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Scott Keck-Warren
 
Vertex AI Gemini Prompt Engineering Tips
Vertex AI Gemini Prompt Engineering TipsVertex AI Gemini Prompt Engineering Tips
Vertex AI Gemini Prompt Engineering TipsMiki Katsuragi
 
Story boards and shot lists for my a level piece
Story boards and shot lists for my a level pieceStory boards and shot lists for my a level piece
Story boards and shot lists for my a level piececharlottematthew16
 
Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Mattias Andersson
 
Commit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyCommit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyAlfredo García Lavilla
 
Developer Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLDeveloper Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLScyllaDB
 
My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024The Digital Insurer
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Patryk Bandurski
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):comworks
 
Powerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time ClashPowerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time Clashcharlottematthew16
 
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek SchlawackFwdays
 
DevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsDevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsSergiu Bodiu
 
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr LapshynFwdays
 
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticscarlostorres15106
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...shyamraj55
 
Scanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL CertsScanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL CertsRizwan Syed
 

Recently uploaded (20)

Pigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food Manufacturing
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024
 
Vertex AI Gemini Prompt Engineering Tips
Vertex AI Gemini Prompt Engineering TipsVertex AI Gemini Prompt Engineering Tips
Vertex AI Gemini Prompt Engineering Tips
 
Story boards and shot lists for my a level piece
Story boards and shot lists for my a level pieceStory boards and shot lists for my a level piece
Story boards and shot lists for my a level piece
 
Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?
 
Commit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyCommit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easy
 
Developer Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLDeveloper Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQL
 
My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024
 
Hot Sexy call girls in Panjabi Bagh 🔝 9953056974 🔝 Delhi escort Service
Hot Sexy call girls in Panjabi Bagh 🔝 9953056974 🔝 Delhi escort ServiceHot Sexy call girls in Panjabi Bagh 🔝 9953056974 🔝 Delhi escort Service
Hot Sexy call girls in Panjabi Bagh 🔝 9953056974 🔝 Delhi escort Service
 
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptxE-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):
 
Powerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time ClashPowerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time Clash
 
DMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special EditionDMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special Edition
 
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
 
DevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsDevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platforms
 
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
 
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
 
Scanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL CertsScanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL Certs
 

Grand Challenges for Mixed Reality

  • 1. GRAND CHALLENGES FOR MIXED REALITY Mark Billinghurst mark.billinghurst@auckland.ac.nz October 19th 2021
  • 2. 1967 – IBM 1401 – half of the computers in the world, $10,000/month to run
  • 3. Sketchpad (1963) • Ivan Sutherland • First interactive graphics • Pen based input
  • 4. The ultimate display would, of course, be a room within which the computer can control the existence of matter. A chair displayed in such a room would be good enough to sit in. Handcuffs displayed in such a room would be confining, and a bullet displayed in such a room would be fatal. Sutherland, Ivan. "The ultimate display." (1965).
  • 6. Star Trek – HoloDeck (1974)
  • 7. Star Wars – Hologram (1977)
  • 8. My First VR Experience - 1990 • Silicon Graphics Reality Engine • 500,000 polygons/second • VPL Eyephone HMD • 320 x 240 resolution • Magnetic tracking • Glove input • Expensive - $250,000+
  • 9. Star Wars – Collaborative AR
  • 10. 1999 – Shared Space Demo • Face to face collaborative AR like Star Wars concept
  • 12. Marker Based Tracking: ARToolKit (1999) https://github.com/artoolkit
  • 13. 1998 SGI O2: CPU 300 MHz, HDD 9 GB, RAM 512 MB, Camera VGA 30 fps, Graphics 500K poly/sec. 2008 Nokia N95: CPU 332 MHz, HDD 8 GB, RAM 128 MB, Camera VGA 30 fps, Graphics 2M poly/sec. By 2008 phones had the same hardware as used in the Shared Space demo
  • 14. 2005: Mobile AR version of Shared Space • AR Tennis • Shared AR content • Two user game • Audio + haptic feedback • Bluetooth networking
  • 16. VR and AR Today • Large growing market • > $25 Billion USD in 2020 • Hundreds of millions of users • Many available devices • HMD, phones, tablets, HUDs • Robust developer tools • Vuforia, MRTK, Unity, etc • Large number of applications • > 150K developers, > 100K apps • Strong research/business communities • ISMAR, IEEE VR, AWE conferences, AugmentedReality.org, etc
  • 19.
  • 20. Future Visions of VR: Ready Player One • https://www.youtube.com/watch?v=LiK2fhOY0nE
  • 21. Today vs. Tomorrow (VR in 2021 vs. VR in 2045): Graphics: high quality vs. photo-realistic; Display: 110-150 degrees vs. total immersion; Interaction: handheld controller/some gesture vs. full gesture/body/gaze; Navigation: limited movement vs. natural; Multiuser: few users vs. millions of users
  • 23.
  • 24. “.. the technologies that will significantly affect our lives over the next 10 years have been around for a decade. The future is with us ... The trick is learning how to spot it” October 2004 Bill Buxton
  • 25. Key Technologies for MR Systems • Display • Stimulate visual, hearing/touch sense • Tracking • Changing viewpoint, registered content • Interaction • Supporting user input
  • 27. • Past • Bulky Head mounted displays • Current • Handheld, lightweight head mounted • Future • Projected AR • Wide FOV see through • Retinal displays • Contact lens Evolution in Displays
  • 28. North Focals (2020) • https://www.bynorth.com • Socially acceptable smart glasses • $599 USD, small field of view • Ring input device
  • 29. Wide FOV See-Through (3+ years) • Waveguide techniques • Wider FOV • Thin see through • Socially acceptable • Pinlight Displays • LCD panel + point light sources • 110 degree FOV • UNC/Nvidia Lumus DK40 Maimone, A., Lanman, D., Rathinavel, K., Keller, K., Luebke, D., & Fuchs, H. (2014). Pinlight displays: wide field of view augmented reality eyeglasses using defocused point light sources. In ACM SIGGRAPH 2014 Emerging Technologies (p. 20). ACM.
  • 31. Kura Gallium Glasses (2020) • https://www.kura.tech/ • Pinpoint wide field of view effect - 150° FOV; • Resolution in the range 4K-8K; • Very transparent screen
  • 32. Contact Lens (15 + years) • Contact Lens only • Unobtrusive • Significant technical challenges • Power, data, resolution • Babak Parviz (2008) • MojoVision • Mojo Lens • Prototype smart contact lens • https://www.mojo.vision/ http://spectrum.ieee.org/biomedical/bionics/augmented-reality-in-a-contact-lens/
  • 33.
  • 36. Evolution of Interaction • Past • Limited interaction • Viewpoint manipulation • Present • Screen based, simple gesture • tangible interaction • Future • Natural gesture, Multimodal • Intelligent Interfaces • Physiological/Sensor based
  • 37. Natural Gesture • Freehand gesture input • Depth sensors for gesture capture • Move beyond simple pointing • Rich two handed gestures • Eg Microsoft Research Hand Tracker • 3D hand tracking, 30 fps, single sensor • Commercial Systems • Hololens2, Oculus, Intel, MagicLeap, etc Sharp, T., Keskin, C., Robertson, D., Taylor, J., Shotton, J., Leichter, D. K. C. R. I., ... & Izadi, S. (2015, April). Accurate, Robust, and Flexible Real-time Hand Tracking. In Proc. CHI (Vol. 8).
  • 39. Multi-Scale Gesture • Combine different gesture types • In-air gestures – natural but imprecise • Micro-gesture – fine scale gestures • Gross motion + fine tuning interaction Ens, B., Quigley, A., Yeo, H. S., Irani, P., Piumsomboon, T., & Billinghurst, M. (2018). Counterpoint: Exploring Mixed-Scale Gesture Interaction for AR Applications. In Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems (p. LBW120). ACM.
  • 41. Eye Tracking Input • HMDs with integrated eye-tracking • Hololens2, MagicLeap One • Research questions • How can eye gaze be used for interaction? • What interaction metaphors are natural?
  • 42. Eye Gaze Interaction Methods • Gaze for interaction • Implicit vs. explicit input • Exploring different gaze interaction • Duo reticles – use eye saccade input • Hardware • HTC Vive + Pupil Labs integrated eye-tracking Piumsomboon, T., Lee, G., Lindeman, R. W., & Billinghurst, M. (2017, March). Exploring natural eye-gaze-based interaction for immersive virtual reality. In 3D User Interfaces (3DUI), 2017 IEEE Symposium on (pp. 36-39). IEEE.
  • 43. Duo-Reticles (DR): an Inertial Reticle (IR) and a Real-time Reticle (RR, originally called the Eye-gaze Reticle). As the RR and IR are aligned, the alignment time counts down and the selection completes
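The alignment-plus-countdown behaviour of the duo-reticle technique can be sketched as a small update loop: while the eye-gaze reticle stays within a threshold of the inertial reticle, a dwell timer counts down and then triggers selection. The threshold and dwell values here are illustrative assumptions, not the parameters from the published study.

```python
import math

class DuoReticleSelector:
    """Sketch of alignment-based selection: when the real-time (eye-gaze)
    reticle stays close to the inertial reticle, a countdown runs and
    selection fires when it reaches zero."""

    def __init__(self, align_threshold_deg=2.0, dwell_time_s=0.8):
        self.align_threshold = align_threshold_deg  # assumed values
        self.dwell_time = dwell_time_s
        self.timer = dwell_time_s

    def update(self, rr, ir, dt):
        """rr, ir: (x, y) reticle positions in degrees; dt: frame time in s.
        Returns True on the frame the selection completes."""
        dist = math.hypot(rr[0] - ir[0], rr[1] - ir[1])
        if dist <= self.align_threshold:
            self.timer -= dt              # aligned: count down
            if self.timer <= 0:
                self.timer = self.dwell_time
                return True
        else:
            self.timer = self.dwell_time  # misaligned: reset countdown
        return False
```

Resetting the countdown on any misalignment is what makes the gesture deliberate: a passing glance never selects, only a sustained alignment does.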
  • 45. Physiological Sensor Input • Using physiological sensors for implicit input • Systems that recognize user intent/activity • EEG • Measuring brain activity • EMG • Measuring muscle activity
  • 47. HP Reverb G2 Omnicept • Wide FOV, high resolution, best in class VR display • Eye tracking, heart rate, pupillometry, and face camera
  • 48. NextMind • EEG attachment for AR/VR HMD • 9 dry EEG electrodes • https://www.next-mind.com/
  • 50. Multimodal Input • Combine gesture and speech input • Gesture good for qualitative input • Speech good for quantitative input • Support combined commands • “Put that there” + pointing • E.g. HIT Lab NZ multimodal input • 3D hand tracking, speech • Multimodal fusion module • Complete tasks faster with MMI, less errors Billinghurst, M., Piumsomboon, T., & Bai, H. (2014). Hands in Space: Gesture Interaction with Augmented-Reality Interfaces. IEEE computer graphics and applications, (1), 77-80.
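The fusion idea behind a "put that there" command can be sketched by resolving deictic words in the speech stream against time-stamped pointing events from the gesture tracker. The event format, target names, and matching window below are illustrative assumptions, not the HIT Lab NZ fusion module.

```python
# Minimal multimodal fusion sketch: deictic words ("that", "there") in a
# recognized speech command are replaced by whatever the user was
# pointing at closest in time to the word.

def fuse(speech_words, pointing_events, window=0.5):
    """speech_words: list of (time_s, word) from the speech recognizer.
    pointing_events: list of (time_s, target) from the gesture tracker.
    Returns the command string with deictic words resolved to targets
    pointed at within `window` seconds of the word."""
    resolved = []
    for t_word, word in speech_words:
        if word in ("that", "there", "this", "here"):
            # pick the pointing event nearest in time to the word
            best = min(pointing_events, key=lambda p: abs(p[0] - t_word))
            if abs(best[0] - t_word) <= window:
                resolved.append(best[1])
                continue
        resolved.append(word)
    return " ".join(resolved)

# "put that there" + two pointing gestures -> a fully grounded command
command = fuse([(0.0, "put"), (0.4, "that"), (0.9, "there")],
               [(0.35, "red_cube"), (0.95, "table")])
```

This illustrates why the combination works: speech carries the qualitative action ("put"), while the gesture supplies the quantitative referents the words alone cannot.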
  • 51. Intelligent Interfaces • Move to Implicit Input vs. Explicit • Recognize user behaviour • Provide adaptive feedback • Move beyond check-lists of actions • E.g. AR + Intelligent Tutoring • Constraint based ITS + AR • PC Assembly (Westerfield, 2015) • 30% faster, 25% better retention Westerfield, G., Mitrovic, A., & Billinghurst, M. (2015). Intelligent Augmented Reality Training for Motherboard Assembly. International Journal of Artificial Intelligence in Education, 25(1), 157-172.
  • 54. Evolution of Tracking • Past • Location based, marker based, • magnetic/mechanical • Present • Image based, hybrid tracking • Future • Ubiquitous • Model based • Environmental
  • 55. Model Based Tracking • Track from known 3D model • Use depth + colour information • Match input to model template • Use CAD model of targets • Recent innovations • Learn models online • Tracking from cluttered scene • Track from deformable objects Hinterstoisser, S., Lepetit, V., Ilic, S., Holzer, S., Bradski, G., Konolige, K., & Navab, N. (2013). Model based training, detection and pose estimation of texture-less 3D objects in heavily cluttered scenes. In Computer Vision–ACCV 2012 (pp. 548-562).
  • 57. Environmental Tracking (3+ yrs) • Environment capture • Use depth sensors to capture scene & track from model • InfiniTAM (www.robots.ox.ac.uk/~victor/infinitam/) • Real time scene capture, dense or sparse capture, open source • iPad Pro LiDAR • Scene scanning up to 5m
  • 59. Fusion4D (2016) • Real capture and dynamic reconstruction • RGBD sensors + incremental reconstruction Dou, M., Khamis, S., Degtyarev, Y., Davidson, P., Fanello, S. R., Kowdle, A., ... & Izadi, S. (2016). Fusion4d: Real-time performance capture of challenging scenes. ACM Transactions on Graphics (TOG), 35(4), 1-13.
  • 61. Wide Area Outdoor Tracking • Process • Combine panoramas into point cloud model (offline) • Initialize camera tracking from point cloud • Update pose by aligning camera image to point cloud • Accurate to 25 cm, 0.5 degree over very wide area Ventura, J., & Hollerer, T. (2012). Wide-area scene mapping for mobile visual tracking. In Mixed and Augmented Reality (ISMAR), 2012 IEEE International Symposium on (pp. 3-12). IEEE.
  • 62. Wide Area Outdoor Tracking https://www.youtube.com/watch?v=8ZNN0NeXV6s
  • 63. AR Cloud Based Tracking • AR Cloud • a machine-readable 1:1 scale model of the real world • processing recognition/tracking data in the cloud • Can create cloud from input from multiple devices • Store key visual features in cloud, Stitch features from multiple devices • Retrieve for tracking/interaction • AR Cloud Companies • 6D.ai, Vertical.ai, Ubiquity6, etc
  • 66. AR/VR as Perceptual Phenomena • Virtual Reality • Do I perceive myself as being in the Virtual Environment? • Sense of Presence • Augmented Reality • Is that virtual object part of my real world? • Sense of Object Presence • What perceptual cues create a sense of Presence/Object Presence? • How can Presence/Object Presence be measured?
  • 67. Measuring Presence • Presence is very subjective, so how can it be measured? • Subjective Measures • Self-report questionnaires • University College London Questionnaire (Slater 1999) • Witmer and Singer Presence Questionnaire (Witmer 1998) • ITC Sense Of Presence Inventory (Lessiter 2000) • Continuous measure • Person moves slider bar in VE depending on Presence felt • Objective Measures • Behavioural • reflex/flinch measure, startle response • Physiological measures • change in heart rate, skin conductance, skin temperature
  • 68. Using Neuro-Physiological Presence Measures • Put people in High Presence/Low Presence VR • Measure physiological cues • EEG, ECG, GSR • Measure subjective cues • Presence surveys • SUS, Witmer-Singer • Correlate subjective and physiological results Dey, A., Phoon, J., Saha, S., Dobbins, C., & Billinghurst, M. (2020, November). A Neurophysiological Approach for Measuring Presence in Immersive Virtual Environments. In 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) (pp. 474-485). IEEE.
  • 69. Results • Significant difference in subjective presence scores between HP/LP VE • Significant difference in EEG power and heart rate between HP/LP VE EEG Power Heart Rate
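The analysis step behind results like these — relating subjective presence scores to a physiological measure across participants — can be sketched with a plain Pearson correlation. The participant data below are made-up illustrative values, not the study's data.

```python
# Sketch: correlate per-participant presence questionnaire scores with a
# physiological measure (e.g. mean heart-rate change between conditions).

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# hypothetical data: presence score vs. heart-rate change, per participant
presence = [3.1, 4.5, 2.2, 5.0, 3.8]
hr_delta = [1.0, 2.2, 0.4, 2.9, 1.6]
r = pearson_r(presence, hr_delta)
```

A strong positive r across participants is the kind of evidence that lets a physiological signal stand in for questionnaire-based presence measures.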
  • 70. Perception Based Graphics • Eye Physiology • Cones in eye centre = colour vision, rods in periphery = motion, B+W • Foveated Rendering • Use eye tracking to draw highest resolution where user looking • Reduces graphics throughput
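The foveated-rendering idea can be sketched as a rule that drops shading resolution with angular distance (eccentricity) from the tracked gaze point. The band boundaries and degrees-per-pixel value below are illustrative assumptions, not those of any particular headset.

```python
import math

def shading_scale(pixel, gaze, deg_per_pixel=0.05):
    """Return the fraction of full resolution at which to shade a pixel,
    given its eccentricity from the gaze point.
    pixel, gaze: (x, y) screen coordinates in pixels."""
    ecc_deg = deg_per_pixel * math.hypot(pixel[0] - gaze[0],
                                         pixel[1] - gaze[1])
    if ecc_deg < 5:       # fovea: full resolution
        return 1.0
    elif ecc_deg < 20:    # near periphery: half resolution
        return 0.5
    else:                 # far periphery: quarter resolution
        return 0.25
```

Because only the small foveal region is shaded at full rate, most of the frame is rendered at a fraction of the cost — which is how eye tracking reduces graphics throughput without a visible quality loss.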
  • 72. Making AR Content Appear Real • Appearance • Lighting • Audio cues • Occlusion • Touch/Haptic feedback • Etc..
  • 73. Key Perceptual Issues in AR • Classification of Perceptual Issues • Environment, Capturing, Augmentation • Display device, User Kruijff, E., Swan, J. E., & Feiner, S. (2010, October). Perceptual issues in augmented reality revisited. In 2010 IEEE International Symposium on Mixed and Augmented Reality (pp. 3-12). IEEE.
  • 76. Social Acceptance • People don’t want to look silly • Only 12% of 4,600 adults would be willing to wear AR glasses • 20% of mobile AR browser users experience social issues • Acceptance more due to Social than Technical issues • Needs further study (ethnographic, field tests, longitudinal)
  • 78.
  • 79.
  • 80. Ethical Issues • Persuasive Technology • Affecting emotions • Behaviour modification • Privacy Concerns • Facial recognition • Space capture • Personal data • Safety Concerns • Sim sickness, Distraction • Long term effects Pase, S. (2012). Ethical considerations in augmented reality applications. In Proceedings of the International Conference on e-Learning, e-Business, Enterprise Information Systems, and e-Government (EEE) (p. 1). The Steering Committee of The World Congress in Computer Science, Computer Engineering and Applied Computing (WorldComp).
  • 82. Using AR/VR for Enhanced Collaboration • Changing perspective, Sharing views • Copying spaces, Changing scale • Copy bodies, Sharing non-verbal cues
  • 83. AR for Remote Collaboration • Camera + Processing + AR Display + Connectivity • First person Ego-Vision Collaboration
  • 84. AR View Remote Expert View
  • 85. Shared Sphere – 360 Video Sharing Shared Live 360 Video Host User Guest User Lee, G. A., Teo, T., Kim, S., & Billinghurst, M. (2017). Mixed reality collaboration through sharing a live panorama. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (pp. 1-4).
  • 87. 3D Live Scene Capture • Use cluster of RGBD sensors • Fuse together 3D point cloud
  • 88.
  • 89. View Sharing Evolution • Increased immersion • Improved scene understanding • Better collaboration 2D 360 3D
  • 90. Sharing Virtual Communication Cues • Collaboration between AR and VR • Gaze Visualization Conditions • Baseline, FoV, Head-gaze, Eye-gaze
  • 92. Multi-Scale Collaboration • Changing the user’s virtual body scale
  • 93.
  • 95. Sharing: Separating Cues from Body • What happens when you can’t see your colleague/agent? Piumsomboon, T., Lee, G. A., Hart, J. D., Ens, B., Lindeman, R. W., Thomas, B. H., & Billinghurst, M. (2018, April). Mini-me: An adaptive avatar for mixed reality remote collaboration. In Proceedings of the 2018 CHI conference on human factors in computing systems (pp. 1-13). Collaborating Collaborator out of View
  • 96. Mini-Me Communication Cues in MR • When lose sight of collaborator a Mini-Me avatar appears • Miniature avatar in real world • Mini-Me points to shared objects, show communication cues • Redirected gaze, gestures
  • 98. Scaling Up • Supporting Large Groups of People • Social VR spaces • Large scale events • Hybrid Interfaces • AR/VR users with desktop/mobile • Persistent virtual worlds
  • 99. Facebook Horizon/Workrooms • Key features • Avatar creation, multi-user social VR • Large scale 3D virtual spaces • Meeting support, using real devices • In world content creation/authoring
  • 101. Hybrid Interfaces - Spatial https://www.youtube.com/watch?v=PG3tQYlZ6JQ
  • 102. Research Issues • Avatars • How to easily create? • How realistic should they be? • How can you communicate social cues? • Hybrid Interfaces • How can you provide equity across difference devices? • Social Presence • How can you objectively measure Social Presence? • How use AR/VR cues to increase Social Presence?
  • 103. Collaboration Technology Trends 1. Improved Content Capture • Move from sharing faces to sharing places 2. Increased Network Bandwidth • Sharing natural communication cues 3. Implicit Understanding • Recognizing behaviour and emotion
  • 106. Empathic Computing Can we develop systems that allow us to share what we are seeing, hearing and feeling with others? Piumsomboon, T., Lee, Y., Lee, G. A., Dey, A., & Billinghurst, M. (2017). Empathic Mixed Reality: Sharing What You Feel and Interacting with What You See. In Ubiquitous Virtual Reality (ISUVR), 2017 International Symposium on (pp. 38-41). IEEE.
  • 107. Empathy Glasses (CHI 2016) • Combine together eye-tracking, display, face expression • Implicit cues – eye gaze, face expression + + Pupil Labs Epson BT-200 AffectiveWear Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
  • 108. Remote Collaboration • Eye gaze pointer and remote pointing • Face expression display • Implicit cues for remote collaboration
  • 110. Brain Synchronization • Measure EEG of people collaborating • Brain activity synchronizes • More synchronization = better collaboration
  • 115.
  • 116.
  • 117.
  • 118.
  • 119. Technology Trends • Advanced displays • Wide FOV, high resolution • Real time space capture • 3D scanning, stitching, segmentation • Natural gesture interaction • Hand tracking, pose recognition • Robust eye-tracking • Gaze points, focus depth • Emotion sensing/sharing • Physiological sensing, emotion mapping
  • 120. • Advanced displays • Real time space capture • Natural gesture interaction • Robust eye-tracking • Emotion sensing/sharing Empathic Tele-Existence
  • 121. Empathic Tele-Existence • Move from Observer to Participant • Explicit to Implicit communication • Experiential collaboration – doing together
  • 122. The Metaverse • Neal Stephenson’s “Snow Crash” • VR successor to the internet • The Metaverse is the convergence of: • 1) virtually enhanced physical reality • 2) physically persistent virtual space • Metaverse Roadmap • http://metaverseroadmap.org/
  • 125. Conclusions • AR/VR/MR is becoming commonly available • Significant advances over 50+ years • In order to achieve Sutherland’s vision, need research in • Display, Tracking, Input • New MR technologies will enable this to happen • Display devices, Interaction, Tracking technologies • There are still significant areas for research • Social Acceptance, Perception, Collaboration, Etc.
  • 126. More Information Billinghurst, M. (2021). Grand Challenges for Augmented Reality. Frontiers in Virtual Reality, 2, 12.