PERCEPTION
COMP 4010 Lecture Two
Mark Billinghurst
August 4th 2022
mark.billinghurst@unisa.edu.au
REVIEW
The Incredible Disappearing Computer
• 1960-70s: Room
• 1970-80s: Desk
• 1980-90s: Lap
• 1990-2000s: Hand
• 2010 - : Head
Rekimoto, J. and Nagao, K. 1995. The world through the computer: computer augmented interaction with real world environments.
Making Interfaces Invisible
(c) Internet of Things
Internet of Things (IoT)
• Embed computing and sensing in the real world
• Smart objects, sensors, etc.
Virtual Reality (VR)
• Users immersed in Computer Generated environment
• HMD, gloves, 3D graphics, body tracking
Augmented Reality (AR)
• Virtual Images blended with the real world
• See-through HMD, handheld display, viewpoint tracking, etc..
AR vs VR
Milgram’s Mixed Reality (MR) Continuum
[Diagram: continuum from Real World to Virtual World; Augmented Reality near the real end, Virtual Reality at the virtual end, Mixed Reality spanning everything between]
"...anywhere between the extrema of the virtuality continuum."
P. Milgram and A. F. Kishino, (1994) A Taxonomy of Mixed Reality Visual Displays
Extended Reality (XR)
[Diagram: the same continuum from Real World to Virtual World, with Extended Reality spanning from Augmented Reality to Virtual Reality and the Internet of Things at the Real World end]
Metaverse Components
• Four Key Components
• Virtual Worlds
• Augmented Reality
• Mirror Worlds
• Lifelogging
Ivan Sutherland (1960s)
Ivan Sutherland’s Head-Mounted Display (1968)
Super Cockpit (1965-80’s)
• US Airforce Research Program
• Wright Patterson Air Force Base
• Tom Furness III
• Multisensory
• Visual, auditory, tactile
• Head, eye, speech, and hand input
• Addressing pilot information overload
• Flight controls and tasks too complicated
• Research only
• big system, not safe for ejecting
VPL Research (1985 – 1999)
• First Commercial VR Company
• Jaron Lanier, Jean-Jacques Grimaud
• Provided complete systems
• Displays, software, gloves, etc
• DataGlove, EyePhone, AudioSphere
First Industrial Use of AR (1990’s)
• 1992: Tom Caudell at Boeing coined the term “AR.”
• Wire harness assembly application begun
• Led by Tom Caudell and David Mizell
Desktop VR - 1995
• Expensive - $150,000+
• 2 million polys/sec
• VGA HMD – 30 Hz
• Magnetic tracking
Mobile/Wearable Systems (1995)
• 1995 Navicam (Rekimoto)
• Handheld AR
• 1997 Touring Machine (Feiner)
• Backpack AR, GPS, see-through display
• 1998 Tinmith (Thomas, UniSA)
• Outdoor gaming, CAD
Rise of Commercial VR Companies
• W Industries/Virtuality (1985 - 97)
• Location based entertainment
• Virtuality VR Arcades
• Division (1989 – 1998)
• Turn key VR systems
• Visual programming tools
• Virtual i-O (1993 -1997)
• Inexpensive gamer HMDs
• Sense8 (1990 - 1998)
• WorldToolKit, WorldUp
• VR authoring tools
Mobile Phone AR (2005)
• Mobile Phones
• camera
• processor
• display
• AR on Mobile Phones
• Simple graphics
• Optimized computer vision
• Collaborative Interaction
2008 - Browser Based AR
• Flash + camera + 3D graphics
• ARToolKit ported to Flash
• High impact
• High marketing value
• Large potential install base
• 1.6 Billion web users
• Ease of development
• Lots of developers, mature tools
• Low cost of entry
• Browser, web camera
2008: Location Aware Phones
Nokia Navigator
Motorola Droid
VR Second Wave (2010 - )
• Palmer Luckey
• HMD hacker
• Mixed Reality Lab (MxR) intern
• Oculus Rift (2011 - )
• 2012 - $2.4 million kickstarter
• 2014 - $2B acquisition by Facebook
• $350 USD, 110° FOV
Desktop VR in 2016
• Graphics Desktop
• $1,500 USD
• >4 Billion poly/sec
• $600 HMD
• 1080x1200, 90Hz
• Optical tracking
• Room scale
Oculus Rift
Sony Morpheus
HTC/Valve Vive
2016 - Rise of Consumer HMDs
Social Mobile Camera AR Apps (2015 - )
• SnapChat - Lenses, World Lenses
• Cinco de Mayo lens > 225 million views
• Facebook - Camera Effects
• Google – Word Lens/Translate
Hololens (2016)
• Integrated system – Windows
• Stereo see-through display
• Depth sensing tracking
• Voice and gesture interaction
• Note: HoloLens 2 released in 2019
ARKit/ARcore (2017)
• Visual Inertial Odometry (VIO) systems
• Mobile phone pose tracked by
• Camera (Visual), Accelerometer & Gyroscope (Inertial)
• Features
• Plane detection, lighting detection, hardware optimisation
• Links
• https://developer.apple.com/arkit/
• https://developers.google.com/ar/
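ARKit's and ARCore's actual VIO pipelines are proprietary, but the core idea of fusing a smooth-but-drifting gyroscope with a noisy-but-drift-free accelerometer can be illustrated with a minimal complementary filter. This is a sketch of the inertial half of VIO only; all names and constants are illustrative, not from either SDK.

```python
import math

def complementary_filter(pitch_prev, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """One complementary-filter step for a single angle (pitch).

    gyro_rate: pitch angular rate in rad/s (gyroscope)
    accel_y, accel_z: accelerometer readings in m/s^2
    alpha: weight on the integrated gyro estimate vs. the
           accelerometer's gravity-direction estimate.
    """
    pitch_gyro = pitch_prev + gyro_rate * dt      # integrate gyro (drifts over time)
    pitch_accel = math.atan2(accel_y, accel_z)    # gravity direction (noisy, no drift)
    return alpha * pitch_gyro + (1 - alpha) * pitch_accel

# A stationary device: gyro reads ~0, gravity along +z.
# Start with a deliberately wrong estimate; the accelerometer term
# pulls the drifted value back toward the true pitch of 0.
pitch = 0.1  # radians
for _ in range(500):
    pitch = complementary_filter(pitch, gyro_rate=0.0,
                                 accel_y=0.0, accel_z=9.81, dt=0.01)
```

Real VIO additionally tracks visual features to correct position and yaw, which the IMU alone cannot observe.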
History Summary
• 1960’s – 80’s: Early Experimentation
• 1980’s – 90’s: Basic Research
• Tracking, displays
• 1995 – 2005: Tools/Applications
• Interaction, usability, theory
• 2005 - : Commercial Applications
• Mobile, Games, Medical, Industry
THE BUSINESS OF AR/VR
Why 2022 won’t be like 1996
• It’s not just VR anymore
• Huge amount of investment
• Inexpensive hardware platforms
• Easy to use content creation tools
• New devices for input and output
• Proven use cases – no more Hype!
• Most important: Focus on User Experience
Example: Pokemon GO
Killer Combo: brand + social + mobile + geo-location + AR
Pokemon GO Effect
• Fastest App to reach $500 million in Revenue
• Only 63 days after launch, > $1 Billion in 6 months
• Over 500 million downloads, > 25 million DAU
• Nintendo stock price up by 50% (gain of $9 Billion USD)
Augmented Reality in 2022
• Large growing market
• > $13 Billion USD in 2021
• Many available devices
• HMD, phones, tablets, HUDs
• Robust developer tools
• Vuforia, ARToolKit, Unity, Wikitude, etc
• Large number of applications
• > 150K developers, > 100K mobile apps
• Strong research/business communities
• ISMAR, AWE conferences, AugmentedReality.org, etc
Large Growing Industry
Conclusion
• AR/VR has a long history
• > 50 years of HMDs, simulators
• Key elements were in place by the early 1990’s
• Displays, tracking, input, graphics
• Strong support from military, government, universities
• First commercial wave failed in late 1990’s
• Too expensive, bad user experience, poor technology, etc
• We are now in second commercial wave
• Better experience, Affordable hardware
• Large commercial investment, Significant installed user base
• Will XR be a commercial success this time?
PERCEPTION
What is Reality?
How do We Perceive Reality?
• We understand the world through
our senses:
• Sight, Hearing, Touch, Taste, Smell
(and others..)
• Two basic processes:
• Sensation – Gathering information
• Perception – Interpreting information
Simple Sensing/Perception Model
Goal of Virtual Reality
“.. to make it feel like you’re actually in a place that
you are not.”
Palmer Luckey
Co-founder, Oculus
Creating the Illusion of Reality
• Fooling human perception by using
technology to generate artificial sensations
• Computer generated sights, sounds, smell, etc
Reality vs. Virtual Reality
• In a VR system there are input and output devices
between human perception and action
Example: Birdly - http://www.somniacs.co/
• Create illusion of flying like a bird
• Multisensory VR experience
• Visual, audio, wind, haptic
Birdly Demo
https://www.youtube.com/watch?v=gHE6H62GHoM
PRESENCE
Presence ..
“The subjective experience of being in one place or
environment even when physically situated in another”
Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual environments: A presence
questionnaire. Presence: Teleoperators and virtual environments, 7(3), 225-240.
Immersion vs. Presence
• Immersion: describes the extent to which technology is capable of
delivering a vivid illusion of reality to the senses of a human participant.
• Presence: a state of consciousness, the (psychological) sense of being
in the virtual environment.
• So Immersion, defined in technical terms, is capable of producing a
sensation of Presence
• Goal of VR: Create a high degree of Presence
• Make people believe they are really in Virtual Environment
Slater, M., & Wilbur, S. (1997). A framework for immersive virtual environments (FIVE): Speculations on the role
of presence in virtual environments. Presence: Teleoperators and virtual environments, 6(6), 603-616.
How to Create Strong Presence?
• Use Multiple Dimensions of Presence
• Create rich multi-sensory VR experiences
• Include social actors/agents that interact with the user
• Have environment respond to the user
• What Influences Presence
• Vividness – ability to provide rich experience (Steuer 1992)
• Using Virtual Body – user can see themselves (Slater 1993)
• Internal factors – individual user differences (Sadowski 2002)
• Interactivity – how much users can interact (Steuer 1992)
• Sensory, Realism factors (Witmer 1998)
Five Key Technical Requirements for Presence
• Persistence
• > 90 Hz refresh, < 3 ms persistence, avoid retinal blur
• Optics
• Wide FOV > 90 degrees, comfortable eyebox, good calibration
• Tracking
• 6 DOF, 360 tracking, sub-mm accuracy, no jitter, good tracking volume
• Resolution
• Correct stereo, > 1K x 1K resolution, no visible pixels
• Latency
• < 20 ms latency, fuse optical tracking and IMU, minimize tracking loop
http://www.roadtovr.com/oculus-shares-5-key-ingredients-for-presence-in-virtual-reality/
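The five thresholds above can be collected into a simple checklist. The `vive_2016` spec dict below is a rough illustration based on the Desktop VR 2016 slide earlier, not official manufacturer figures.

```python
# The five presence requirements from the slide, as pass/fail checks.
REQUIREMENTS = {
    "refresh_hz":     lambda v: v >= 90,    # > 90 Hz refresh
    "persistence_ms": lambda v: v <= 3,     # < 3 ms persistence
    "fov_deg":        lambda v: v >= 90,    # wide FOV > 90 degrees
    "resolution_px":  lambda v: v >= 1000,  # > 1K x 1K per eye (per axis)
    "latency_ms":     lambda v: v <= 20,    # < 20 ms motion-to-photon
}

def failed_requirements(spec):
    """Return the names of the requirements the given HMD spec fails."""
    return [name for name, ok in REQUIREMENTS.items() if not ok(spec[name])]

# Illustrative 2016-era spec (see the Desktop VR 2016 slide):
vive_2016 = {"refresh_hz": 90, "persistence_ms": 2,
             "fov_deg": 110, "resolution_px": 1080, "latency_ms": 15}
failures = failed_requirements(vive_2016)
```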
Example: UNC Pit Room
• Key Features
• Training room and pit room
• Physical walking
• Fast, accurate, room scale tracking
• Haptic feedback – feel edge of pit, walls
• Strong visual and 3D audio cues
• Task
• Carry object across pit
• Walk across or walk around
• Dropping virtual balls at targets in pit
• http://wwwx.cs.unc.edu/Research/eve/walk_exp/
Typical Subject Behaviour
• Note – from another pit experiment
• https://www.youtube.com/watch?v=VVAO0DkoD-8
Richie’s Plank
• https://www.youtube.com/watch?v=4M92kfnpg-k
Why do people behave like this?
• Presence can be decomposed into two dimensions (Slater 2009):
• “Place Illusion” (PI): being in the place depicted in the VR environment
• perception in VR matches natural sensorimotor input
• Plausibility Illusion (Psi): the events in the VR environment are actually occurring
• VR environment responds to user actions
• When both PI and Psi are high, people respond realistically to events in the VR
Slater, M. (2009). Place illusion and plausibility can lead to realistic behaviour in immersive virtual
environments. Philosophical Transactions of the Royal Society B: Biological Sciences, 364(1535), 3549-3557.
Presence = PI + Psi + ??
Slater, M., Banakou, D., Beacco, A., Gallego, J., Macia-Varela, F., & Oliva, R. (2022). A Separate Reality: An
Update on Place Illusion and Plausibility in Virtual Reality. Frontiers in Virtual Reality, 81.
Four Illusions of Presence (Slater 2022)
• Place Illusion: being in the place
• Plausibility Illusion: events are real
• Body Ownership: seeing your body in VR
• Copresence/Social Presence: other people are in VR
Social Presence
• What makes a Person appear real?
• Interactivity
• Visual appearance
• Audio cues
• Touch
• Contextual cues
• Etc..
Oh, C. S., Bailenson, J. N., & Welch, G. F. (2018). A systematic review of social presence:
Definition, antecedents, and implications. Frontiers in Robotics and AI, 5, 114.
Object Presence
• What makes an object appear real?
• Touch/Haptic feedback
• Appearance
• Lighting
• Audio cues
• Occlusion
• Etc..
Benefits of High Presence
• Leads to greater engagement, excitement and satisfaction
• Increased reaction to actions in VR
• People more likely to behave like in the real world
• E.g. people scared of heights in real world will be scared in VR
• More natural communication (Social Presence)
• Use same cues as face-to-face conversation
• Note: The relationship between Presence and Performance is unclear
Measuring Presence
• Presence is very subjective so there is a lot of debate
among researchers about how to measure it
• Subjective Measures
• Self report questionnaire
• University College London Questionnaire (Slater 1999)
• Witmer and Singer Presence Questionnaire (Witmer 1998)
• ITC Sense Of Presence Inventory (Lessiter 2000)
• Continuous measure
• Person moves slider bar in VE depending on Presence felt
• Objective Measures
• Behavioural
• reflex/flinch measure, startle response
• Physiological measures
• change in heart rate, skin conductance, skin temperature Presence Slider
PERCEPTION
Motivation
• Understand: In order to create a strong sense of Presence
we need to understand the Human Perception system
• Stimulate: We need to be able to use technology to provide
real world sensory inputs, and create the VR illusion
[Diagram: VR Hardware → Human Senses]
Senses
• How an organism obtains information for perception:
• Sensation part of Somatic Division of Peripheral Nervous System
• Integration and perception requires the Central Nervous System
• Five major senses (but there are more..):
• Sight (Ophthalmoception)
• Hearing (Audioception)
• Taste (Gustaoception)
• Smell (Olfacoception)
• Touch (Tactioception)
Relative Importance of Each Sense
• Percentage of neurons in
brain devoted to each sense
• Sight – 30%
• Touch – 8%
• Hearing – 2%
• Smell - < 1%
• Over 60% of brain involved
with vision in some way
Other Lesser-Known Senses
• Proprioception = sense of body position
• what is your body doing right now
• Equilibrium = balance
• Acceleration
• Nociception = sense of pain
• Temperature
• Satiety = state of being fed or gratified to or beyond capacity
• Thirst
• Micturition = sense of bladder fullness (need to urinate)
Sight
The Human Visual System
• Purpose is to convert visual input to signals in the brain
The Human Eye
• Light passes through cornea and lens onto retina
• Photoreceptors in retina convert light into electrochemical signals
Photoreceptors – Rods and Cones
• Retina photoreceptors come in two types, Rods and Cones
• Rods – 125 million, periphery of retina, no colour detection, night vision
• Cones – 4-6 million, center of retina, colour vision, day vision
Human Horizontal and Vertical FOV
• Humans can see ~135° vertical FOV (60° above, 75° below)
• See up to ~210° horizontal FOV, ~115° stereo overlap
• Colour/stereo in centre, black and white/mono in periphery
Vergence + Accommodation
• Vergence: the eyes rotate in opposite directions so both point at the fixated object
• Accommodation: the eye’s lens changes shape to focus at the object’s distance
Vergence/Accommodation Demo
• https://www.youtube.com/watch?v=p_xLO7yxgOk
Vergence-Accommodation Conflict
• Looking at real objects, vergence and focal distance match
• In VR, vergence and accommodation can be mismatched
• Eyes accommodate to the HMD screen, but converge on the virtual object behind the screen
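The size of this conflict is often quantified as the difference between vergence distance and focal distance in diopters (1/metres). A minimal sketch with illustrative distances:

```python
def va_conflict_diopters(focal_dist_m, object_dist_m):
    """Vergence-accommodation mismatch in diopters (1/m).

    In an HMD the eyes accommodate to the fixed focal distance of the
    optics but converge on the virtual object's rendered depth; the
    diopter difference is a common measure of the conflict.
    """
    return abs(1.0 / focal_dist_m - 1.0 / object_dist_m)

# Optics focused at 2 m, virtual object rendered at 0.5 m:
# |1/2 - 1/0.5| = |0.5 - 2.0| = 1.5 D
conflict = va_conflict_diopters(2.0, 0.5)
```

Conflicts below roughly ±0.5 D are generally considered comfortable, which is why multi-focal-plane displays (like the Magic Leap design later in this lecture) try to keep the rendered depth near a supported focal plane.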
Visual Acuity
Visual Acuity Test Targets
• Ability to resolve details
• Several types of visual acuity
• detection, separation, etc
• Normal eyesight can see a 50 cent coin at 80m
• Corresponds to 1 arc min (1/60th of a degree)
• Max acuity = 0.4 arc min
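The coin claim can be checked with the visual-angle formula. Assuming a coin diameter of ~32 mm (an assumption, not stated on the slide), the angle subtended at 80 m comes out just above the 1 arc min limit:

```python
import math

def subtense_arcmin(size_m, distance_m):
    """Visual angle subtended by an object, in arc minutes."""
    return math.degrees(2 * math.atan(size_m / (2 * distance_m))) * 60

# A ~32 mm coin at 80 m subtends ~1.4 arc min, i.e. right around the
# ~1 arc min resolution limit of normal (20/20) eyesight.
angle = subtense_arcmin(0.032, 80.0)
```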
Stereo Perception/Stereopsis
• Eyes separated by IPD
• Inter pupillary distance
• 5 – 7.5cm (avg. 6.5cm)
• Each eye sees diff. image
• Separated by image parallax
• Images fused to create 3D
stereo view
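The vergence angle between the two lines of sight follows directly from the IPD and the fixation distance; binocular disparity between two depths is the difference of their vergence angles. A sketch assuming a 6.5 cm IPD:

```python
import math

def vergence_angle_deg(ipd_m, distance_m):
    """Angle between the two eyes' lines of sight when fixating a point."""
    return math.degrees(2 * math.atan(ipd_m / (2 * distance_m)))

near = vergence_angle_deg(0.065, 0.5)    # ~7.4 deg at 50 cm
far = vergence_angle_deg(0.065, 10.0)    # ~0.37 deg at 10 m
disparity = near - far                   # relative disparity between the depths
```

The rapid fall-off of the angle with distance is why stereopsis is a strong depth cue near the body but contributes little beyond roughly 10 m (see the depth-cue distance slide below).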
Depth Perception
• The visual system uses a range of different Stereoscopic
and Monocular cues for depth perception
• Stereoscopic cues:
• eye convergence angle
• disparity between left and right images
• diplopia
• Monocular cues:
• eye accommodation
• perspective
• atmospheric artifacts (fog)
• relative sizes
• image blur
• occlusion
• motion parallax
• shadows
• texture
Parallax can be more important for depth perception!
Stereoscopy is important for size and distance evaluation
Common Depth Cues
Depth Perception Distances
• i.e. convergence/accommodation used for depth perception < 10m
Properties of the Human Visual System
• visual acuity: 20/20 is ~1 arc min
• field of view: ~200° monocular, ~120° binocular, ~135° vertical
• resolution of eye: ~576 megapixels
• temporal resolution: ~60 Hz (depends on contrast, luminance)
• dynamic range: instantaneous 6.5 f-stops, adapt to 46.5 f-stops
• colour: everything in CIE xy diagram
• depth cues in 3D displays: vergence, focus, (dis)comfort
• accommodation range: ~8cm to ∞, degrades with age
Creating the Perfect Illusion
Cuervo, E., Chintalapudi, K., & Kotaru, M. (2018,
February). Creating the perfect illusion: What will it
take to create life-like virtual reality headsets?.
In Proceedings of the 19th International Workshop on
Mobile Computing Systems & Applications (pp. 7-12).
• Technology to create life-like VR HMDs
• Compared to current HMDs
• 6 − 10× higher pixel density
• 20 − 30× higher frame rate
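The pixel-density gap can be estimated from acuity alone: at ~1 arc min per pixel, the pixels needed scale as 60 per degree of FOV. A back-of-envelope sketch (small-angle approximation, illustrative numbers):

```python
def eye_limited_pixels(fov_deg, acuity_arcmin=1.0):
    """Pixels needed across a FOV so each pixel subtends <= the given acuity."""
    return int(fov_deg * 60 / acuity_arcmin)

# ~1 arc min acuity over a 100 degree horizontal FOV needs ~6000 px per
# eye, several times the ~1-2K panels of 2016-18 era HMDs, broadly
# consistent with the paper's "6-10x higher pixel density" estimate.
px = eye_limited_pixels(100)
```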
Comparison between Eyes and HMD
Hearing
Anatomy of the Ear
Auditory Thresholds
• Humans hear frequencies from 20 – 22,000 Hz
• Most everyday sounds from 80 – 90 dB
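Decibel values like the 80 – 90 dB range above are sound pressure levels relative to the 20 µPa threshold of hearing; a quick conversion sketch:

```python
import math

P0 = 20e-6  # reference pressure: 20 micropascals (threshold of hearing)

def spl_db(pressure_pa):
    """Sound pressure level in dB SPL relative to the hearing threshold."""
    return 20 * math.log10(pressure_pa / P0)

# 0.2 Pa is ten-thousand times the threshold pressure: 20*log10(1e4) = 80 dB.
level = spl_db(0.2)
```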
Sound Localization
• Humans have two ears
• localize sound in space
• Sound can be localized
using 3 coordinates
• Azimuth, elevation, distance
Sound Localization
https://www.youtube.com/watch?v=FIU1bNSlbxk
Sound Localization (Azimuth Cues)
Interaural Time Difference
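The slide does not give a formula, but the interaural time difference is commonly approximated with Woodworth's spherical-head model; the head radius and speed of sound below are assumed typical values, not measurements from the lecture.

```python
import math

def itd_seconds(azimuth_deg, head_radius_m=0.0875, c=343.0):
    """Woodworth's spherical-head approximation of interaural time difference.

    ITD = (r / c) * (theta + sin(theta)), with theta the source azimuth
    in radians, r the head radius, and c the speed of sound.
    """
    theta = math.radians(azimuth_deg)
    return (head_radius_m / c) * (theta + math.sin(theta))

# A source directly to one side (90 deg) arrives ~0.65 ms earlier at the
# nearer ear; straight ahead (0 deg) the ITD is zero, which is why
# front/back confusions occur (both give near-zero azimuth cues).
itd_90 = itd_seconds(90)
```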
HRTF (Elevation Cue)
• Pinna and head shape affect frequency intensities
• Sound intensities measured with microphones in ear and
compared to intensities at sound source
• Difference is HRTF, gives clue as to sound source location
Accuracy of Sound Localization
• People can locate sound
• Most accurately in front of them
• 2-3° error in front of head
• Least accurately to sides and behind head
• Up to 20° error to side of head
• Largest errors occur above/below elevations and behind head
• Front/back confusion is an issue
• Up to 10% of sounds presented in the front are perceived
coming from behind and vice versa (more in headphones)
Butean, A., Bălan, O., Negoi, I., Moldoveanu, F., & Moldoveanu, A. (2015). Comparative research on sound localization accuracy in the free-field and virtual auditory displays. In Conference Proceedings of eLearning and Software for Education (eLSE) (No. 01, pp. 540-548). Universitatea Nationala de Aparare Carol I.
Touch
Haptic Sensation
• Somatosensory System
• complex system of nerve cells that responds to
changes to the surface or internal state of the body
• Skin is the largest organ
• 1.3-1.7 square m in adults
• Tactile: Surface properties
• Receptors not evenly spread
• Most densely populated area is the tongue
• Kinesthetic: Muscles, Tendons, etc.
• Also known as proprioception
Cutaneous System
• Skin – heaviest organ in the body
• Epidermis outer layer, dead skin cells
• Dermis inner layer, with four kinds of mechanoreceptors
Mechanoreceptors
• Cells that respond to pressure, stretching, and vibration
• Slowly Adapting (SA), Rapidly Adapting (RA)
• Type I at surface – light, discriminative touch
• Type II deep in dermis – heavy and continuous touch
Receptor | Rate of Adapting | Stimulus Frequency | Receptive Field | Detection Function
Merkel discs | SA-I | 0 – 10 Hz | Small, well defined | Edges, intensity
Ruffini corpuscles | SA-II | 0 – 10 Hz | Large, indistinct | Static force, skin stretch
Meissner corpuscles | RA-I | 20 – 50 Hz | Small, well defined | Velocity, edges
Pacinian corpuscles | RA-II | 100 – 300 Hz | Large, indistinct | Acceleration, vibration
Spatial Resolution
• Sensitivity varies greatly
• Two-point discrimination
Body Site | Threshold Distance
Finger | 2-3mm
Cheek | 6mm
Nose | 7mm
Palm | 10mm
Forehead | 15mm
Foot | 20mm
Belly | 30mm
Forearm | 35mm
Upper Arm | 39mm
Back | 39mm
Shoulder | 41mm
Thigh | 42mm
Calf | 45mm
http://faculty.washington.edu/chudler/chsense.html
Proprioception/Kinaesthesia
• Proprioception (joint position sense)
• Awareness of movement and positions of body parts
• Due to nerve endings and Pacinian and Ruffini corpuscles at joints
• Enables us to touch nose with eyes closed
• Joints closer to body more accurately sensed
• Users know hand position accurate to 8cm without looking at them
• Kinaesthesia (joint movement sense)
• Sensing muscle contraction or stretching
• Cutaneous mechanoreceptors measuring skin stretching
• Helps with force sensation
AR TECHNOLOGY
Augmented Reality Definition
•Combines Real and Virtual Images
•Both can be seen at the same time
•Interactive in real-time
•The virtual content can be interacted with
•Registered in 3D
•Virtual objects appear fixed in space
Augmented Reality technology
•Combines Real and Virtual Images
•Needs: Display technology
•Interactive in real-time
•Needs: Input and interaction technology
•Registered in 3D
•Needs: Viewpoint tracking technology
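"Registered in 3D" means the tracked viewpoint is used to project virtual points into the user's view every frame. A minimal pinhole-projection sketch with illustrative intrinsics; for simplicity the point is assumed to already be in camera coordinates, so the rotation/translation from the tracker is omitted:

```python
def project_point(point_cam, f_px, cx, cy):
    """Project a 3D point (camera coordinates, metres) to pixel coordinates
    using a pinhole model: u = f*X/Z + cx, v = f*Y/Z + cy."""
    x, y, z = point_cam
    return (f_px * x / z + cx, f_px * y / z + cy)

# A virtual object 2 m in front of the camera and 0.5 m to the right,
# with an (illustrative) 800 px focal length and a 640x480 image:
u, v = project_point((0.5, 0.0, 2.0), f_px=800, cx=320, cy=240)
# u = 800*0.25 + 320 = 520, v = 240
```

If the viewpoint tracking is accurate, re-running this projection with each new camera pose keeps the virtual object visually "fixed in space" on the real world.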
Example: MagicLeap ML-1 AR Display
•Display
• Multi-layered Waveguide display
•Tracking
• Inside out SLAM tracking
•Input
• 6DOF wand, gesture input
MagicLeap Display
• Optical see through AR display
• Overlay graphics directly on real world
• 40° x 30° FOV, 1280 x 960 pixels/eye
• Waveguide based display
• Holographic optical element
• Very thin physical display
• Two sets of waveguides
• Different focal planes
• Overcomes vergence/accommodation problem
• Eye tracking for selecting focal plane
• Separate CPU/GPU unit
AR Vergence and Accommodation
• Fixed focal distance for OST displays
• Accommodation conflict between real and virtual object
Tracking
• Inside out tracking
• Sensors on the user’s head
• Using multiple sensors
• Time of Flight Depth Sensor
• IR dot projector
• Wide angle cameras
• Internal accelerometer (IMU)
• Creates 3D model of real world
• Tracks from model
Spatial Mapping (Hololens)
Input
• Multiple input methods
• Handheld Controller
• Multiple buttons, trackpad input
• 6 DOF magnetic tracking
• Eye gaze
• Integrated eye tracking
• Hand tracking
• Natural hand input
Hand Tracking
Eye Tracking
1: AR DISPLAYS
AR Display Technologies
• Classification (Bimber/Raskar 2005)
• Head attached
• Head mounted display/projector
• Body attached
• Handheld display/projector
• Spatial
• Spatially aligned projector/monitor
Bimber, O., & Raskar, R. (2005). Spatial augmented reality: merging real and virtual worlds. CRC press.
Display Taxonomy
HEAD MOUNTED DISPLAYS
Types of Head Mounted Displays
Occluded
See-thru
Multiplexed
Optical see-through Head-Mounted Display
[Diagram: virtual images from monitors combined with the real world via optical combiners]
View Through an Optical See-Through HMD
Optical Design - Birdbath
▪ Reflect off beam splitter
Optical Design – Curved Mirror
▪ Reflect off free-space curved mirror
Example: Meta2
• https://www.youtube.com/watch?v=e1W29w63W4g
Optical Design - Prism
Epson Moverio BT-300
▪ Stereo see-through display ($700)
▪ 1280 RGB x 720 pixels, 23 degree FOV, 30Hz, 69g
▪ Android Powered, separate controller
▪ VGA camera, GPS, gyro, accelerometer
Optical Design - Waveguide
• Use prisms/grating elements
Lumus Display
• https://www.youtube.com/watch?v=G2MtI7asLcA
Example: Sony Smart EyeGlasses
https://www.youtube.com/watch?v=kYPWaMsarss
Hololens Waveguide Display
AR HMDs
• Microsoft HoloLens2 - $3,500 USD
• Wearable computer, 47 degree FOV
• Waveguide displays, optical see-through
• Vuzix Blade - $1000 USD
• 30 degree FOV, optical see-through
• Self contained, Monocular, Android OS
• Epson BT 30C - $499 USD
• 25 degree FOV, see-through
• Tethered display, USB-C connector
Pros and Cons of Optical See-Through AR
• Pros
• Simpler design (cheaper)
• Direct view of real world
• No eye displacement
• Socially acceptable (glasses form factor)
• Cons
• Difficult to occlude real world
• Image washout outdoors/bright lights
• Wide field of view challenging
• Can’t delay the real world
Video see-through HMD
[Diagram: video cameras capture the real world; a combiner merges graphics with the video, which is shown on monitors]
View Through a Video See-Through HMD
Example: Varjo XR-1
• Wide field of view
• 87 degrees
• High resolution
• 1920 x 1080 pixel/eye
• 1440 x 1600 pixel insert
• Low latency stereo cameras
• 2 x 12 megapixel
• < 20 ms delay
• Integrated Eye Tracking
Varjo XR-1 Image Quality
• https://www.youtube.com/watch?v=L0sg-3EGbZs
Handheld AR
• Camera + display = handheld AR
• Mobile phone/Tablet display
Pros and Cons of Video See-Through AR
• Pros
• True occlusion
• Digitized image of real world
• Registration, calibration, matchable time delay
• Wide FOV is easier to support
• Cons
• Larger, bulkier hardware
• Can’t see real world with natural eyes
Multiplexed Display
Virtual Image ‘inset’ into Real World
Example: Google Glass
View Through Google Glass
See-Through Display Taxonomy (with example products)
• Monocular
• Video See-Through: e.g. smartphone- or tablet-based hand-held AR; also Google Glass in VST mode
• Optical See-Through: e.g. Lumos DK-40, Microvision Nomad, DigiLens DL40, TacEye ST, Vuzix M2000AR
• Binocular
• Video See-Through
• Single Camera, Monoscopic Overlays: e.g. Trivisio ARVision; Vuzix iWear VR920 with iWear CamAR
• Dual Camera, Stereoscopic Overlays: e.g. Canon COASTAR, Vuzix Wrap 1200DXAR
• Optical See-Through
• Monoscopic Overlays: possible, but no clear advantage
• Stereoscopic Overlays: e.g. Microsoft HoloLens, Epson Moverio BT-200, Vuzix STAR 1200XLD
More on Head Mounted Displays
• Karl Guttag Blog - https://kguttag.com/
HANDHELD AR
Handheld AR
• Camera + display = handheld AR
• Mobile phone/Tablet display
• Video see-through AR
User Perspective Rendering for AR
User-Perspective Hand-Held Display
Handheld display with
device perspective
Handheld display with
user perspective
Image: Domagoj Baričević
https://www.youtube.com/watch?v=z0nVgk1OxSc
SPATIAL AUGMENTED REALITY
Spatial Augmented Reality
• Project onto irregular surfaces
• Geometric Registration
• Projector blending, High dynamic range
• Book: Bimber, Raskar “Spatial Augmented Reality”
Lightform
• Depth sensor + projector
• Create 3D model of space
• Deform image mapping
• Content creation tools
• https://www.youtube.com/watch?v=AUJNxNkwEy0
Steerable Projector
Image: Claudio Pinhanez, IBM Research
Everywhere Projector Display
A steerable, tracked projector can
display images anywhere
Head Mounted Projector
• NVIS P-50 HMPD
• 1280x1024/eye
• Stereoscopic
• 50 degree FOV
• www.nvis.com
HMD vs. HMPD
Head Mounted Display Head Mounted Projected Display
Tilt5 - https://www.tiltfive.com/
• Stereo head worn projectors
• Interactive wand
• Roll-able retro-reflective sheet
• Designed for shared interaction
• Retroreflective roll-able mat
[Diagram: incident light producing diffusion from a Lambertian reflector (e.g. unfinished wood), reflection from a mirror reflector, and retro-reflection from a retro-reflector]
• https://www.youtube.com/watch?v=gNnBX1cW3L4
OTHER AR DISPLAYS
Video Monitor AR
[Diagram: video cameras feed a graphics combiner; the combined video is shown on a monitor viewed through stereo glasses]
Examples
Magic Mirror AR Experience
• See AR overlay of an image of yourself
• https://www.youtube.com/watch?v=Mr71jrkzWq8&t=2s
Other Types of AR Display
• Audio
• spatial sound
• ambient audio
• Tactile
• physical sensation
• Haptic
• virtual touch
Haptic Input
• AR Haptic Workbench
• CSIRO 2003 – Adcock et al.
Phantom
• Sensable Technologies (www.sensable.com)
• 6 DOF Force Feedback Device
AR Haptic Interface
• Phantom, ARToolKit, Magellan
Olfactory Display
MetaCookie: An olfactory display is
combined with visual augmentation of a
plain cookie to provide the illusion of a
flavored cookie (chocolate, in the inset).
Image: Takuji Narumi
• https://www.youtube.com/watch?v=3GnQE9cCf84
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au
Virtual Reality: Sensing the PossibilitiesVirtual Reality: Sensing the Possibilities
Virtual Reality: Sensing the Possibilities
Mark Billinghurst
 
Building AR and VR Experiences
Building AR and VR ExperiencesBuilding AR and VR Experiences
Building AR and VR Experiences
Mark Billinghurst
 
Virtual Reality & Augmented Reality
Virtual Reality & Augmented RealityVirtual Reality & Augmented Reality
Virtual Reality & Augmented Reality
Rajesh Yadav
 
C. VR intrduction_lecture for introduction to VR Lecture-1.pptx
C. VR intrduction_lecture for introduction to VR Lecture-1.pptxC. VR intrduction_lecture for introduction to VR Lecture-1.pptx
C. VR intrduction_lecture for introduction to VR Lecture-1.pptx
RajGopalMishra4
 
presentation1-180123jjjjjjjj150728_2.pdf
presentation1-180123jjjjjjjj150728_2.pdfpresentation1-180123jjjjjjjj150728_2.pdf
presentation1-180123jjjjjjjj150728_2.pdf
reler89973
 
Mobile AR Lecture1-introduction
Mobile AR Lecture1-introductionMobile AR Lecture1-introduction
Mobile AR Lecture1-introduction
Mark Billinghurst
 
Virtual Reality 2.0
Virtual Reality 2.0Virtual Reality 2.0
Virtual Reality 2.0
Mark Billinghurst
 
Mixed Reality in the Workspace
Mixed Reality in the WorkspaceMixed Reality in the Workspace
Mixed Reality in the Workspace
Mark Billinghurst
 
Future Directions for Augmented Reality
Future Directions for Augmented RealityFuture Directions for Augmented Reality
Future Directions for Augmented Reality
Mark Billinghurst
 
COMP 4010 - Lecture 8 AR Technology
COMP 4010 - Lecture 8 AR TechnologyCOMP 4010 - Lecture 8 AR Technology
COMP 4010 - Lecture 8 AR Technology
Mark Billinghurst
 
Application in Augmented and Virtual Reality
Application in Augmented and Virtual RealityApplication in Augmented and Virtual Reality
Application in Augmented and Virtual Reality
Mark Billinghurst
 
2016 AR Summer School - Lecture 5
2016 AR Summer School - Lecture 52016 AR Summer School - Lecture 5
2016 AR Summer School - Lecture 5
Mark Billinghurst
 
Comp 4010 2021 Lecture1-Introduction to XR
Comp 4010 2021 Lecture1-Introduction to XRComp 4010 2021 Lecture1-Introduction to XR
Comp 4010 2021 Lecture1-Introduction to XR
Mark Billinghurst
 
COMP 4010 - Lecture 1: Introduction to Virtual Reality
COMP 4010 - Lecture 1: Introduction to Virtual RealityCOMP 4010 - Lecture 1: Introduction to Virtual Reality
COMP 4010 - Lecture 1: Introduction to Virtual Reality
Mark Billinghurst
 
VRCAI 2011 Billinghurst Keynote
VRCAI 2011 Billinghurst KeynoteVRCAI 2011 Billinghurst Keynote
VRCAI 2011 Billinghurst Keynote
Mark Billinghurst
 
COMP 4010 - Lecture 7: Introduction to Augmented Reality
COMP 4010 - Lecture 7: Introduction to Augmented RealityCOMP 4010 - Lecture 7: Introduction to Augmented Reality
COMP 4010 - Lecture 7: Introduction to Augmented Reality
Mark Billinghurst
 
Beyond Reality (2027): The Future of Virtual and Augmented Reality
Beyond Reality (2027): The Future of Virtual and Augmented RealityBeyond Reality (2027): The Future of Virtual and Augmented Reality
Beyond Reality (2027): The Future of Virtual and Augmented Reality
Mark Billinghurst
 

Similar to 2022 COMP4010 Lecture2: Perception (20)

2016 AR Summer School - Lecture1
2016 AR Summer School - Lecture12016 AR Summer School - Lecture1
2016 AR Summer School - Lecture1
 
Lecture 9 AR Technology
Lecture 9 AR TechnologyLecture 9 AR Technology
Lecture 9 AR Technology
 
Introduction to Augmented Reality
Introduction to Augmented RealityIntroduction to Augmented Reality
Introduction to Augmented Reality
 
Virtual Reality: Sensing the Possibilities
Virtual Reality: Sensing the PossibilitiesVirtual Reality: Sensing the Possibilities
Virtual Reality: Sensing the Possibilities
 
Building AR and VR Experiences
Building AR and VR ExperiencesBuilding AR and VR Experiences
Building AR and VR Experiences
 
Virtual Reality & Augmented Reality
Virtual Reality & Augmented RealityVirtual Reality & Augmented Reality
Virtual Reality & Augmented Reality
 
C. VR intrduction_lecture for introduction to VR Lecture-1.pptx
C. VR intrduction_lecture for introduction to VR Lecture-1.pptxC. VR intrduction_lecture for introduction to VR Lecture-1.pptx
C. VR intrduction_lecture for introduction to VR Lecture-1.pptx
 
presentation1-180123jjjjjjjj150728_2.pdf
presentation1-180123jjjjjjjj150728_2.pdfpresentation1-180123jjjjjjjj150728_2.pdf
presentation1-180123jjjjjjjj150728_2.pdf
 
Mobile AR Lecture1-introduction
Mobile AR Lecture1-introductionMobile AR Lecture1-introduction
Mobile AR Lecture1-introduction
 
Virtual Reality 2.0
Virtual Reality 2.0Virtual Reality 2.0
Virtual Reality 2.0
 
Mixed Reality in the Workspace
Mixed Reality in the WorkspaceMixed Reality in the Workspace
Mixed Reality in the Workspace
 
Future Directions for Augmented Reality
Future Directions for Augmented RealityFuture Directions for Augmented Reality
Future Directions for Augmented Reality
 
COMP 4010 - Lecture 8 AR Technology
COMP 4010 - Lecture 8 AR TechnologyCOMP 4010 - Lecture 8 AR Technology
COMP 4010 - Lecture 8 AR Technology
 
Application in Augmented and Virtual Reality
Application in Augmented and Virtual RealityApplication in Augmented and Virtual Reality
Application in Augmented and Virtual Reality
 
2016 AR Summer School - Lecture 5
2016 AR Summer School - Lecture 52016 AR Summer School - Lecture 5
2016 AR Summer School - Lecture 5
 
Comp 4010 2021 Lecture1-Introduction to XR
Comp 4010 2021 Lecture1-Introduction to XRComp 4010 2021 Lecture1-Introduction to XR
Comp 4010 2021 Lecture1-Introduction to XR
 
COMP 4010 - Lecture 1: Introduction to Virtual Reality
COMP 4010 - Lecture 1: Introduction to Virtual RealityCOMP 4010 - Lecture 1: Introduction to Virtual Reality
COMP 4010 - Lecture 1: Introduction to Virtual Reality
 
VRCAI 2011 Billinghurst Keynote
VRCAI 2011 Billinghurst KeynoteVRCAI 2011 Billinghurst Keynote
VRCAI 2011 Billinghurst Keynote
 
COMP 4010 - Lecture 7: Introduction to Augmented Reality
COMP 4010 - Lecture 7: Introduction to Augmented RealityCOMP 4010 - Lecture 7: Introduction to Augmented Reality
COMP 4010 - Lecture 7: Introduction to Augmented Reality
 
Beyond Reality (2027): The Future of Virtual and Augmented Reality
Beyond Reality (2027): The Future of Virtual and Augmented RealityBeyond Reality (2027): The Future of Virtual and Augmented Reality
Beyond Reality (2027): The Future of Virtual and Augmented Reality
 

More from Mark Billinghurst

The Metaverse: Are We There Yet?
The  Metaverse:    Are   We  There  Yet?The  Metaverse:    Are   We  There  Yet?
The Metaverse: Are We There Yet?
Mark Billinghurst
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
Mark Billinghurst
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
Mark Billinghurst
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented Reality
Mark Billinghurst
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR Experiences
Mark Billinghurst
 
Empathic Computing: Delivering the Potential of the Metaverse
Empathic Computing: Delivering  the Potential of the MetaverseEmpathic Computing: Delivering  the Potential of the Metaverse
Empathic Computing: Delivering the Potential of the Metaverse
Mark Billinghurst
 
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationTalk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
Mark Billinghurst
 
Empathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseEmpathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader Metaverse
Mark Billinghurst
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive Analytics
Mark Billinghurst
 
Metaverse Learning
Metaverse LearningMetaverse Learning
Metaverse Learning
Mark Billinghurst
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise AR
Mark Billinghurst
 

More from Mark Billinghurst (11)

The Metaverse: Are We There Yet?
The  Metaverse:    Are   We  There  Yet?The  Metaverse:    Are   We  There  Yet?
The Metaverse: Are We There Yet?
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented Reality
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR Experiences
 
Empathic Computing: Delivering the Potential of the Metaverse
Empathic Computing: Delivering  the Potential of the MetaverseEmpathic Computing: Delivering  the Potential of the Metaverse
Empathic Computing: Delivering the Potential of the Metaverse
 
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationTalk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
 
Empathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseEmpathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader Metaverse
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive Analytics
 
Metaverse Learning
Metaverse LearningMetaverse Learning
Metaverse Learning
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise AR
 

Recently uploaded

Connector Corner: Automate dynamic content and events by pushing a button
Connector Corner: Automate dynamic content and events by pushing a buttonConnector Corner: Automate dynamic content and events by pushing a button
Connector Corner: Automate dynamic content and events by pushing a button
DianaGray10
 
Neuro-symbolic is not enough, we need neuro-*semantic*
Neuro-symbolic is not enough, we need neuro-*semantic*Neuro-symbolic is not enough, we need neuro-*semantic*
Neuro-symbolic is not enough, we need neuro-*semantic*
Frank van Harmelen
 
When stars align: studies in data quality, knowledge graphs, and machine lear...
When stars align: studies in data quality, knowledge graphs, and machine lear...When stars align: studies in data quality, knowledge graphs, and machine lear...
When stars align: studies in data quality, knowledge graphs, and machine lear...
Elena Simperl
 
Monitoring Java Application Security with JDK Tools and JFR Events
Monitoring Java Application Security with JDK Tools and JFR EventsMonitoring Java Application Security with JDK Tools and JFR Events
Monitoring Java Application Security with JDK Tools and JFR Events
Ana-Maria Mihalceanu
 
Transcript: Selling digital books in 2024: Insights from industry leaders - T...
Transcript: Selling digital books in 2024: Insights from industry leaders - T...Transcript: Selling digital books in 2024: Insights from industry leaders - T...
Transcript: Selling digital books in 2024: Insights from industry leaders - T...
BookNet Canada
 
Generating a custom Ruby SDK for your web service or Rails API using Smithy
Generating a custom Ruby SDK for your web service or Rails API using SmithyGenerating a custom Ruby SDK for your web service or Rails API using Smithy
Generating a custom Ruby SDK for your web service or Rails API using Smithy
g2nightmarescribd
 
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdfFIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
FIDO Alliance
 
Leading Change strategies and insights for effective change management pdf 1.pdf
Leading Change strategies and insights for effective change management pdf 1.pdfLeading Change strategies and insights for effective change management pdf 1.pdf
Leading Change strategies and insights for effective change management pdf 1.pdf
OnBoard
 
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered QualitySoftware Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality
Inflectra
 
From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...
From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...
From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...
Product School
 
FIDO Alliance Osaka Seminar: FIDO Security Aspects.pdf
FIDO Alliance Osaka Seminar: FIDO Security Aspects.pdfFIDO Alliance Osaka Seminar: FIDO Security Aspects.pdf
FIDO Alliance Osaka Seminar: FIDO Security Aspects.pdf
FIDO Alliance
 
Bits & Pixels using AI for Good.........
Bits & Pixels using AI for Good.........Bits & Pixels using AI for Good.........
Bits & Pixels using AI for Good.........
Alison B. Lowndes
 
Epistemic Interaction - tuning interfaces to provide information for AI support
Epistemic Interaction - tuning interfaces to provide information for AI supportEpistemic Interaction - tuning interfaces to provide information for AI support
Epistemic Interaction - tuning interfaces to provide information for AI support
Alan Dix
 
DevOps and Testing slides at DASA Connect
DevOps and Testing slides at DASA ConnectDevOps and Testing slides at DASA Connect
DevOps and Testing slides at DASA Connect
Kari Kakkonen
 
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
Product School
 
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
UiPathCommunity
 
JMeter webinar - integration with InfluxDB and Grafana
JMeter webinar - integration with InfluxDB and GrafanaJMeter webinar - integration with InfluxDB and Grafana
JMeter webinar - integration with InfluxDB and Grafana
RTTS
 
Accelerate your Kubernetes clusters with Varnish Caching
Accelerate your Kubernetes clusters with Varnish CachingAccelerate your Kubernetes clusters with Varnish Caching
Accelerate your Kubernetes clusters with Varnish Caching
Thijs Feryn
 
Securing your Kubernetes cluster_ a step-by-step guide to success !
Securing your Kubernetes cluster_ a step-by-step guide to success !Securing your Kubernetes cluster_ a step-by-step guide to success !
Securing your Kubernetes cluster_ a step-by-step guide to success !
KatiaHIMEUR1
 
UiPath Test Automation using UiPath Test Suite series, part 4
UiPath Test Automation using UiPath Test Suite series, part 4UiPath Test Automation using UiPath Test Suite series, part 4
UiPath Test Automation using UiPath Test Suite series, part 4
DianaGray10
 

Recently uploaded (20)

Connector Corner: Automate dynamic content and events by pushing a button
Connector Corner: Automate dynamic content and events by pushing a buttonConnector Corner: Automate dynamic content and events by pushing a button
Connector Corner: Automate dynamic content and events by pushing a button
 
Neuro-symbolic is not enough, we need neuro-*semantic*
Neuro-symbolic is not enough, we need neuro-*semantic*Neuro-symbolic is not enough, we need neuro-*semantic*
Neuro-symbolic is not enough, we need neuro-*semantic*
 
When stars align: studies in data quality, knowledge graphs, and machine lear...
When stars align: studies in data quality, knowledge graphs, and machine lear...When stars align: studies in data quality, knowledge graphs, and machine lear...
When stars align: studies in data quality, knowledge graphs, and machine lear...
 
Monitoring Java Application Security with JDK Tools and JFR Events
Monitoring Java Application Security with JDK Tools and JFR EventsMonitoring Java Application Security with JDK Tools and JFR Events
Monitoring Java Application Security with JDK Tools and JFR Events
 
Transcript: Selling digital books in 2024: Insights from industry leaders - T...
Transcript: Selling digital books in 2024: Insights from industry leaders - T...Transcript: Selling digital books in 2024: Insights from industry leaders - T...
Transcript: Selling digital books in 2024: Insights from industry leaders - T...
 
Generating a custom Ruby SDK for your web service or Rails API using Smithy
Generating a custom Ruby SDK for your web service or Rails API using SmithyGenerating a custom Ruby SDK for your web service or Rails API using Smithy
Generating a custom Ruby SDK for your web service or Rails API using Smithy
 
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdfFIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
 
Leading Change strategies and insights for effective change management pdf 1.pdf
Leading Change strategies and insights for effective change management pdf 1.pdfLeading Change strategies and insights for effective change management pdf 1.pdf
Leading Change strategies and insights for effective change management pdf 1.pdf
 
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered QualitySoftware Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality
 
From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...
From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...
From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...
 
FIDO Alliance Osaka Seminar: FIDO Security Aspects.pdf
FIDO Alliance Osaka Seminar: FIDO Security Aspects.pdfFIDO Alliance Osaka Seminar: FIDO Security Aspects.pdf
FIDO Alliance Osaka Seminar: FIDO Security Aspects.pdf
 
Bits & Pixels using AI for Good.........
Bits & Pixels using AI for Good.........Bits & Pixels using AI for Good.........
Bits & Pixels using AI for Good.........
 
Epistemic Interaction - tuning interfaces to provide information for AI support
Epistemic Interaction - tuning interfaces to provide information for AI supportEpistemic Interaction - tuning interfaces to provide information for AI support
Epistemic Interaction - tuning interfaces to provide information for AI support
 
DevOps and Testing slides at DASA Connect
DevOps and Testing slides at DASA ConnectDevOps and Testing slides at DASA Connect
DevOps and Testing slides at DASA Connect
 
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
 
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
 
JMeter webinar - integration with InfluxDB and Grafana
JMeter webinar - integration with InfluxDB and GrafanaJMeter webinar - integration with InfluxDB and Grafana
JMeter webinar - integration with InfluxDB and Grafana
 
Accelerate your Kubernetes clusters with Varnish Caching
Accelerate your Kubernetes clusters with Varnish CachingAccelerate your Kubernetes clusters with Varnish Caching
Accelerate your Kubernetes clusters with Varnish Caching
 
Securing your Kubernetes cluster_ a step-by-step guide to success !
Securing your Kubernetes cluster_ a step-by-step guide to success !Securing your Kubernetes cluster_ a step-by-step guide to success !
Securing your Kubernetes cluster_ a step-by-step guide to success !
 
UiPath Test Automation using UiPath Test Suite series, part 4
UiPath Test Automation using UiPath Test Suite series, part 4UiPath Test Automation using UiPath Test Suite series, part 4
UiPath Test Automation using UiPath Test Suite series, part 4
 

2022 COMP4010 Lecture2: Perception

  • 1. PERCEPTION COMP 4010 Lecture Two Mark Billinghurst August 4th 2022 mark.billinghurst@unisa.edu.au
  • 3. The Incredible Disappearing Computer 1960-70’s Room 1970-80’s Desk 1980-90’s Lap 1990-2000’s Hand 2010 - Head
  • 4. Rekimoto, J. and Nagao, K. 1995. The world through the computer: computer augmented interaction with real world environments. Making Interfaces Invisible (c) Internet of Things
  • 5. Internet of Things (IoT).. • Embed computing and sensing in real world • Smart objects, sensors, etc.. (c) Internet of Things
  • 6. Virtual Reality (VR) • Users immersed in Computer Generated environment • HMD, gloves, 3D graphics, body tracking
  • 7. Augmented Reality (AR) • Virtual Images blended with the real world • See-through HMD, handheld display, viewpoint tracking, etc..
  • 9. Milgram’s Mixed Reality (MR) Continuum Augmented Reality Virtual Reality Real World Virtual World Mixed Reality "...anywhere between the extrema of the virtuality continuum." P. Milgram and A. F. Kishino, (1994) A Taxonomy of Mixed Reality Visual Displays Internet of Things
  • 10. Extended Reality (XR) Augmented Reality Virtual Reality Real World Virtual World Mixed Reality Extended Reality Internet of Things
  • 11. Metaverse Components • Four Key Components • Virtual Worlds • Augmented Reality • Mirror Worlds • Lifelogging
  • 12. Ivan Sutherland (1960s) 1 2 Ivan Sutherland’s Head-Mounted Display (1968)
  • 13. Super Cockpit (1965-80’s) • US Air Force Research Program • Wright Patterson Air Force Base • Tom Furness III • Multisensory • Visual, auditory, tactile • Head, eye, speech, and hand input • Addressing pilot information overload • Flight controls and tasks too complicated • Research only • big system, not safe for ejecting
  • 14. VPL Research (1985 – 1999) • First Commercial VR Company • Jaron Lanier, Jean-Jacques Grimaud • Provided complete systems • Displays, software, gloves, etc • DataGlove, EyePhone, AudioSphere
  • 15. First Industrial Use of AR (1990’s) • 1992: Tom Caudell at Boeing coined the term “AR.” • Wire harness assembly application begun • Led by Tom Caudell and David Mizell
  • 16. Desktop VR - 1995 • Expensive - $150,000+ • 2 million polys/sec • VGA HMD – 30 Hz • Magnetic tracking
  • 17. Mobile/Wearable Systems (1995) • 1995 Navicam (Rekimoto) • Handheld AR • 1997 Touring Machine (Feiner) • Backpack AR, GPS, see-through display • 1998 Tinmith (Thomas, UniSA) • Outdoor gaming, CAD
  • 18. Rise of Commercial VR Companies • W Industries/Virtuality (1985 - 97) • Location based entertainment • Virtuality VR Arcades • Division (1989 – 1998) • Turn key VR systems • Visual programming tools • Virtual i-O (1993 -1997) • Inexpensive gamer HMDs • Sense8 (1990 - 1998) • WorldToolKit, WorldUp • VR authoring tools
  • 19. Mobile Phone AR (2005) • Mobile Phones • camera • processor • display • AR on Mobile Phones • Simple graphics • Optimized computer vision • Collaborative Interaction
  • 20. 2008 - Browser Based AR • Flash + camera + 3D graphics • ARToolKit ported to Flash • High impact • High marketing value • Large potential install base • 1.6 Billion web users • Ease of development • Lots of developers, mature tools • Low cost of entry • Browser, web camera
  • 21. 2008: Location Aware Phones Nokia Navigator Motorola Droid
  • 22. VR Second Wave (2010 - ) • Palmer Luckey • HMD hacker • Mixed Reality Lab (MxR) intern • Oculus Rift (2011 - ) • 2012 - $2.4 million kickstarter • 2014 - $2B acquisition by Facebook • $350 USD, 110° FOV
  • 23. Desktop VR in 2016 • Graphics Desktop • $1,500 USD • >4 Billion poly/sec • $600 HMD • 1080x1200, 90Hz • Optical tracking • Room scale
  • 24. Oculus Rift Sony Morpheus HTC/Valve Vive 2016 - Rise of Consumer HMDs
  • 25. Social Mobile Camera AR Apps (2015 - ) • SnapChat - Lenses, World Lenses • Cinco de Mayo lens > 225 million views • Facebook - Camera Effects • Google – Word Lens/Translate
  • 26. HoloLens (2016) • Integrated system – Windows • Stereo see-through display • Depth sensing tracking • Voice and gesture interaction • Note: HoloLens 2 coming September 2019
  • 27. ARKit/ARCore (2017) • Visual Inertial Odometry (VIO) systems • Mobile phone pose tracked by • Camera (Visual), Accelerometer & Gyroscope (Inertial) • Features • Plane detection, lighting detection, hardware optimisation • Links • https://developer.apple.com/arkit/ • https://developers.google.com/ar/
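The core idea behind VIO can be illustrated with a minimal sketch (this is not the ARKit/ARCore API): a complementary filter fuses fast but drifting IMU dead-reckoning with slower, drift-free camera pose estimates. All names and timings here are illustrative assumptions.

```python
def fuse_orientation(gyro_rates, dt, camera_yaw=None, estimate=0.0, alpha=0.98):
    """Toy 1-DOF visual-inertial fusion (complementary filter).

    gyro_rates: yaw rates in rad/s sampled at interval dt (high rate, drifts)
    camera_yaw: optional yaw from visual tracking (low rate, drift-free)
    alpha:      how strongly we trust the integrated IMU estimate
    """
    for rate in gyro_rates:
        estimate += rate * dt  # dead-reckon orientation from the IMU
    if camera_yaw is not None:
        # Each camera frame pulls the drifting estimate back toward
        # the visually observed value.
        estimate = alpha * estimate + (1 - alpha) * camera_yaw
    return estimate

# Ten gyro samples of 0.1 rad/s over 0.01 s each integrate to ~0.01 rad;
# a camera fix of 0.0 rad then nudges the estimate back toward zero.
drifted = fuse_orientation([0.1] * 10, dt=0.01)
corrected = fuse_orientation([0.1] * 10, dt=0.01, camera_yaw=0.0)
```

Real VIO systems fuse full 6-DOF pose with a Kalman-style filter and visual feature tracking, but the divide-and-correct structure is the same.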
  • 28. History Summary • 1960’s – 80’s: Early Experimentation • 1980’s – 90’s: Basic Research • Tracking, displays • 1995 – 2005: Tools/Applications • Interaction, usability, theory • 2005 - : Commercial Applications • Mobile, Games, Medical, Industry
  • 30. Why 2022 won’t be like 1996 • It’s not just VR anymore • Huge amount of investment • Inexpensive hardware platforms • Easy to use content creation tools • New devices for input and output • Proven use cases – no more Hype! • Most important: Focus on User Experience
  • 31. Example: Pokemon GO Killer Combo: brand + social + mobile + geo-location + AR
  • 32. Pokemon GO Effect • Fastest App to reach $500 million in Revenue • Only 63 days after launch, > $1 Billion in 6 months • Over 500 million downloads, > 25 million DAU • Nintendo stock price up by 50% (gain of $9 Billion USD)
  • 33. Augmented Reality in 2022 • Large growing market • > $13Billion USD in 2021 • Many available devices • HMD, phones, tablets, HUDs • Robust developer tools • Vuforia, ARToolKit, Unity, Wikitude, etc • Large number of applications • > 150K developers, > 100K mobile apps • Strong research/business communities • ISMAR, AWE conferences, AugmentedReality.org, etc
  • 41. Conclusion • AR/VR has a long history • > 50 years of HMDs, simulators • Key elements were in place by the early 1990’s • Displays, tracking, input, graphics • Strong support from military, government, universities • First commercial wave failed in late 1990’s • Too expensive, bad user experience, poor technology, etc • We are now in second commercial wave • Better experience, Affordable hardware • Large commercial investment, Significant installed user base • Will XR be a commercial success this time?
  • 44. How do We Perceive Reality? • We understand the world through our senses: • Sight, Hearing, Touch, Taste, Smell (and others..) • Two basic processes: • Sensation – Gathering information • Perception – Interpreting information
  • 46. Goal of Virtual Reality “.. to make it feel like you’re actually in a place that you are not.” Palmer Luckey Co-founder, Oculus
  • 47. Creating the Illusion of Reality • Fooling human perception by using technology to generate artificial sensations • Computer generated sights, sounds, smell, etc
  • 48. Reality vs. Virtual Reality • In a VR system there are input and output devices between human perception and action
  • 49. Example Birdly - http://www.somniacs.co/ • Create illusion of flying like a bird • Multisensory VR experience • Visual, audio, wind, haptic
  • 52. Presence .. “The subjective experience of being in one place or environment even when physically situated in another” Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual environments: A presence questionnaire. Presence: Teleoperators and virtual environments, 7(3), 225-240.
  • 53. Immersion vs. Presence • Immersion: describes the extent to which technology is capable of delivering a vivid illusion of reality to the senses of a human participant. • Presence: a state of consciousness, the (psychological) sense of being in the virtual environment. • So Immersion, defined in technical terms, is capable of producing a sensation of Presence • Goal of VR: Create a high degree of Presence • Make people believe they are really in Virtual Environment Slater, M., & Wilbur, S. (1997). A framework for immersive virtual environments (FIVE): Speculations on the role of presence in virtual environments. Presence: Teleoperators and virtual environments, 6(6), 603-616.
  • 54. How to Create Strong Presence? • Use Multiple Dimensions of Presence • Create rich multi-sensory VR experiences • Include social actors/agents that interact with the user • Have environment respond to the user • What Influences Presence • Vividness – ability to provide rich experience (Steuer 1992) • Using Virtual Body – user can see themselves (Slater 1993) • Internal factors – individual user differences (Sadowski 2002) • Interactivity – how much users can interact (Steuer 1992) • Sensory, Realism factors (Witmer 1998)
  • 55. Five Key Technical Requirements for Presence • Persistence • > 90 Hz refresh, < 3 ms persistence, avoid retinal blur • Optics • Wide FOV > 90 degrees, comfortable eyebox, good calibration • Tracking • 6 DOF, 360 tracking, sub-mm accuracy, no jitter, good tracking volume • Resolution • Correct stereo, > 1K x 1K resolution, no visible pixels • Latency • < 20 ms latency, fuse optical tracking and IMU, minimize tracking loop http://www.roadtovr.com/oculus-shares-5-key-ingredients-for-presence-in-virtual-reality/
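The < 20 ms latency target above can be motivated with a quick back-of-the-envelope calculation (a sketch, not from the lecture): during a head rotation, the angular registration error is roughly head speed multiplied by motion-to-photon latency.

```python
def angular_error_deg(head_speed_dps: float, latency_s: float) -> float:
    """Approximate angular registration error during a head turn:
    error ~= head rotation speed x motion-to-photon latency."""
    return head_speed_dps * latency_s

# A brisk 100 deg/s head turn with 20 ms latency mis-registers
# the image by about 2 degrees.
print(angular_error_deg(100, 0.020))
```

Halving latency halves the worst-case registration error, which is one reason tracking loops fuse fast IMU data with slower optical tracking.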
  • 56. Example: UNC Pit Room • Key Features • Training room and pit room • Physical walking • Fast, accurate, room scale tracking • Haptic feedback – feel edge of pit, walls • Strong visual and 3D audio cues • Task • Carry object across pit • Walk across or walk around • Dropping virtual balls at targets in pit • http://wwwx.cs.unc.edu/Research/eve/walk_exp/
  • 57. Typical Subject Behaviour • Note – from another pit experiment • https://www.youtube.com/watch?v=VVAO0DkoD-8
  • 59. Why do people behave like this? • Presence can be decomposed into two dimensions (Slater 2009): • “Place Illusion” (PI): being in the place depicted in the VR environment • perception in VR matches natural sensorimotor input • Plausibility Illusion (Psi): the events in the VR environment are actually occurring • VR environment responds to user actions • When both PI and Psi are high, people respond realistically to events in the VR Slater, M. (2009). Place illusion and plausibility can lead to realistic behaviour in immersive virtual environments. Philosophical Transactions of the Royal Society B: Biological Sciences, 364(1535), 3549-3557. Presence = PI + Psi + ??
  • 60. Slater, M., Banakou, D., Beacco, A., Gallego, J., Macia-Varela, F., & Oliva, R. (2022). A Separate Reality: An Update on Place Illusion and Plausibility in Virtual Reality. Frontiers in Virtual Reality, 81. Four Illusions of Presence (Slater 2022) • Place Illusion: being in the place • Plausibility Illusion: events are real • Body Ownership: seeing your body in VR • Copresence/Social Presence: other people are in VR
  • 61. Social Presence • What makes a Person appear real? • Interactivity • Visual appearance • Audio cues • Touch • Contextual cues • Etc.. Oh, C. S., Bailenson, J. N., & Welch, G. F. (2018). A systematic review of social presence: Definition, antecedents, and implications. Frontiers in Robotics and AI, 5, 114.
  • 62.
  • 63. Object Presence • What makes an object appear real? • Touch/Haptic feedback • Appearance • Lighting • Audio cues • Occlusion • Etc..
  • 64.
  • 65. Benefits of High Presence • Leads to greater engagement, excitement and satisfaction • Increased reaction to actions in VR • People more likely to behave like in the real world • E.g. people scared of heights in real world will be scared in VR • More natural communication (Social Presence) • Use same cues as face-to-face conversation • Note: The relationship between Presence and Performance is unclear
  • 66. Measuring Presence • Presence is very subjective so there is a lot of debate among researchers about how to measure it • Subjective Measures • Self report questionnaire • University College London Questionnaire (Slater 1999) • Witmer and Singer Presence Questionnaire (Witmer 1998) • ITC Sense Of Presence Inventory (Lessiter 2000) • Continuous measure • Person moves slider bar in VE depending on Presence felt • Objective Measures • Behavioural • reflex/flinch measure, startle response • Physiological measures • change in heart rate, skin conductance, skin temperature Presence Slider
  • 68. Motivation • Understand: In order to create a strong sense of Presence we need to understand the Human Perception system • Stimulate: We need to be able to use technology to provide real world sensory inputs, and create the VR illusion VR Hardware Human Senses
  • 69. Senses • How an organism obtains information for perception: • Sensation part of Somatic Division of Peripheral Nervous System • Integration and perception requires the Central Nervous System • Five major senses (but there are more..): • Sight (Ophthalmoception) • Hearing (Audioception) • Taste (Gustaoception) • Smell (Olfacoception) • Touch (Tactioception)
  • 70. Relative Importance of Each Sense • Percentage of neurons in brain devoted to each sense • Sight – 30% • Touch – 8% • Hearing – 2% • Smell - < 1% • Over 60% of brain involved with vision in some way
  • 71. Other Lesser-Known Senses.. • Proprioception = sense of body position • what is your body doing right now • Equilibrium = balance • Acceleration • Nociception = sense of pain • Temperature • Satiety = state of being fed or gratified to or beyond capacity • Thirst • Micturition = sense of bladder fullness (need to urinate) • Chemoreception = sense of the amount of CO2 and Na in blood
  • 72. Sight
  • 73. The Human Visual System • Purpose is to convert visual input to signals in the brain
  • 74. The Human Eye • Light passes through cornea and lens onto retina • Photoreceptors in retina convert light into electrochemical signals
  • 75. Photoreceptors – Rods and Cones • Retina photoreceptors come in two types, Rods and Cones • Rods – 125 million, periphery of retina, no colour detection, night vision • Cones – 4-6 million, center of retina, colour vision, day vision
  • 76. Human Horizontal and Vertical FOV • Humans can see ~135° vertically (60° above, 75° below) • See up to ~210° horizontal FOV, ~115° stereo overlap • Colour/stereo in centre, black and white/mono in periphery
  • 79. Vergence-Accommodation Conflict • Looking at real objects, vergence and focal distance match • In VR, vergence and accommodation can mismatch • Eyes accommodate (focus) on the HMD screen, but converge on the virtual object behind it
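The size of this conflict is commonly expressed in diopters (1/distance in metres). A minimal sketch, with illustrative distances that are my assumption rather than values from the slides:

```python
def va_conflict_diopters(screen_m: float, object_m: float) -> float:
    """Vergence-accommodation mismatch in diopters: the eyes focus
    (accommodate) at the HMD's fixed focal distance, but converge
    on the rendered depth of the virtual object."""
    return abs(1.0 / screen_m - 1.0 / object_m)

# HMD focal plane at 2 m, virtual object rendered at 0.5 m:
print(va_conflict_diopters(2.0, 0.5))  # 1.5 diopter mismatch
```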
  • 80. Visual Acuity Visual Acuity Test Targets • Ability to resolve details • Several types of visual acuity • detection, separation, etc • Normal eyesight can see a 50 cent coin at 80m • Corresponds to 1 arc min (1/60th of a degree) • Max acuity = 0.4 arc min
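The coin claim above can be sanity-checked by converting an object's angular size to arc minutes (a sketch; the ~32 mm coin diameter is my assumption):

```python
import math

def angular_size_arcmin(size_m: float, distance_m: float) -> float:
    """Visual angle subtended by an object, in arc minutes."""
    return math.degrees(2 * math.atan(size_m / (2 * distance_m))) * 60

# A ~32 mm coin at 80 m subtends roughly 1.4 arc min,
# close to the 1 arc min limit of normal (20/20) eyesight.
print(angular_size_arcmin(0.032, 80.0))
```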
  • 81. Stereo Perception/Stereopsis • Eyes separated by IPD • Inter-pupillary distance • 5 – 7.5cm (avg. 6.5cm) • Each eye sees a different image • Separated by image parallax • Images fused to create 3D stereo view
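The vergence angle implied by the IPD shrinks quickly with distance, which is why stereopsis is mainly a near-field cue. A sketch using the average 6.5 cm IPD from the slide:

```python
import math

def vergence_angle_deg(ipd_m: float, distance_m: float) -> float:
    """Angle between the two eyes' lines of sight when fixating
    a point straight ahead at the given distance."""
    return math.degrees(2 * math.atan(ipd_m / (2 * distance_m)))

for d in (0.5, 1.0, 2.0, 10.0):
    print(f"{d} m -> {vergence_angle_deg(0.065, d):.2f} deg")
```

By 10 m the angle has dropped below 0.4°, consistent with convergence/accommodation contributing to depth perception mainly at close range.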
  • 82.
  • 83. Depth Perception • The visual system uses a range of different Stereoscopic and Monocular cues for depth perception • Stereoscopic cues: eye convergence angle, disparity between left and right images, diplopia • Monocular cues: eye accommodation, perspective, atmospheric artifacts (fog), relative sizes, image blur, occlusion, motion parallax, shadows, texture • Parallax can be more important for depth perception! • Stereoscopy is important for size and distance evaluation
  • 85. Depth Perception Distances • e.g. convergence/accommodation only contribute to depth perception at distances < 10m
  • 86. Properties of the Human Visual System • visual acuity: 20/20 is ~1 arc min • field of view: ~200° horizontal (~120° binocular overlap), ~135° vertical • resolution of eye: ~576 megapixels • temporal resolution: ~60 Hz (depends on contrast, luminance) • dynamic range: instantaneous 6.5 f-stops, adapt to 46.5 f-stops • colour: everything in CIE xy diagram • depth cues in 3D displays: vergence, focus, (dis)comfort • accommodation range: ~8cm to ∞, degrades with age
  • 87. Creating the Perfect Illusion Cuervo, E., Chintalapudi, K., & Kotaru, M. (2018, February). Creating the perfect illusion: What will it take to create life-like virtual reality headsets? In Proceedings of the 19th International Workshop on Mobile Computing Systems & Applications (pp. 7-12). • Technology to create life-like VR HMDs • Compared to current HMDs • 6 − 10× higher pixel density • 20 − 30× higher frame rate
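The pixel-density gap can be made concrete: matching ~1 arc min acuity means about 60 pixels per degree across the whole field of view (a rough sketch; the 200° x 135° field is taken from the earlier FOV slide):

```python
def pixels_for_fov(h_fov_deg: int, v_fov_deg: int, ppd: int = 60):
    """Pixels needed to match ~1 arc min visual acuity (60 px/deg)."""
    return h_fov_deg * ppd, v_fov_deg * ppd

# Matching the eye over a 200 x 135 degree field:
w, h = pixels_for_fov(200, 135)
print(w, h, w * h / 1e6)  # 12000 8100 97.2 (megapixels per eye)
```

Current HMD panels are one to two orders of magnitude below this, hence the 6-10x pixel density figure above.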
  • 91. Auditory Thresholds • Humans hear frequencies from 20 – 22,000 Hz • Most everyday sounds from 80 – 90 dB
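Decibels are a logarithmic ratio against the 20 µPa threshold of hearing; this standard formula (not from the slides) shows where an everyday sound pressure lands:

```python
import math

def spl_db(pressure_pa: float, p_ref: float = 20e-6) -> float:
    """Sound pressure level in dB SPL relative to the 20 uPa
    threshold of human hearing."""
    return 20 * math.log10(pressure_pa / p_ref)

# A 0.5 Pa sound falls in the 80-90 dB band of everyday sounds.
print(spl_db(0.5))
```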
  • 92. Sound Localization • Humans have two ears, used to localize sound in space • Sound can be localized using 3 coordinates • Azimuth, elevation, distance
  • 94. Sound Localization (Azimuth Cues) Interaural Time Difference
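The interaural time difference can be approximated with Woodworth's spherical-head model, ITD = (r/c)(theta + sin theta); the ~8.75 cm head radius and the model itself are standard assumptions, not values from the slide:

```python
import math

def itd_seconds(azimuth_deg: float, head_radius_m: float = 0.0875,
                speed_of_sound_mps: float = 343.0) -> float:
    """Woodworth model: interaural time difference for a distant
    source (0 deg = straight ahead, 90 deg = directly to one side)."""
    theta = math.radians(azimuth_deg)
    return head_radius_m / speed_of_sound_mps * (theta + math.sin(theta))

# A source directly to the side gives the maximum ITD, ~0.66 ms;
# a source straight ahead gives 0.
print(itd_seconds(90) * 1000, itd_seconds(0))
```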
  • 95. HRTF (Elevation Cue) • Pinna and head shape affect frequency intensities • Sound intensities measured with microphones in ear and compared to intensities at sound source • Difference is HRTF, gives clue as to sound source location
  • 96. Accuracy of Sound Localization • People can locate sound • Most accurately in front of them • 2-3° error in front of head • Least accurately to sides and behind head • Up to 20° error to side of head • Largest errors occur above/below elevations and behind head • Front/back confusion is an issue • Up to 10% of sounds presented in the front are perceived coming from behind and vice versa (more in headphones) Butean, A., Bălan, O., Negoi, I., Moldoveanu, F., & Moldoveanu, A. (2015). Comparative research on sound localization accuracy in the free-field and virtual auditory displays. In Conference Proceedings of eLearning and Software for Education (eLSE) (No. 01, pp. 540-548). Universitatea Nationala de Aparare Carol I.
  • 97. Touch
  • 98. Haptic Sensation • Somatosensory System • complex system of nerve cells that responds to changes to the surface or internal state of the body • Skin is the largest organ • 1.3-1.7 square m in adults • Tactile: Surface properties • Receptors not evenly spread • Most densely populated area is the tongue • Kinesthetic: Muscles, Tendons, etc. • Also known as proprioception
  • 99. Cutaneous System • Skin – heaviest organ in the body • Epidermis outer layer, dead skin cells • Dermis inner layer, with four kinds of mechanoreceptors
  • 100. Mechanoreceptors • Cells that respond to pressure, stretching, and vibration • Slowly Adapting (SA), Rapidly Adapting (RA) • Type I at surface – light, discriminative touch • Type II deep in dermis – heavy and continuous touch • Merkel discs (SA-I): 0 – 10 Hz, small well-defined receptive field; detects edges, intensity • Ruffini corpuscles (SA-II): 0 – 10 Hz, large indistinct field; static force, skin stretch • Meissner corpuscles (RA-I): 20 – 50 Hz, small well-defined field; velocity, edges • Pacinian corpuscles (RA-II): 100 – 300 Hz, large indistinct field; acceleration, vibration
  • 101. Spatial Resolution • Sensitivity varies greatly across the body • Two-point discrimination thresholds: Finger 2-3mm; Cheek 6mm; Nose 7mm; Palm 10mm; Forehead 15mm; Foot 20mm; Belly 30mm; Forearm 35mm; Upper Arm 39mm; Back 39mm; Shoulder 41mm; Thigh 42mm; Calf 45mm http://faculty.washington.edu/chudler/chsense.html
  • 102. Proprioception/Kinaesthesia • Proprioception (joint position sense) • Awareness of movement and positions of body parts • Due to nerve endings and Pacinian and Ruffini corpuscles at joints • Enables us to touch our nose with eyes closed • Joints closer to the body are more accurately sensed • Users know hand position to within 8cm without looking • Kinaesthesia (joint movement sense) • Sensing muscle contraction or stretching • Cutaneous mechanoreceptors measuring skin stretching • Helps with force sensation
  • 104. Augmented Reality Definition •Combines Real and Virtual Images •Both can be seen at the same time •Interactive in real-time •The virtual content can be interacted with •Registered in 3D •Virtual objects appear fixed in space
  • 105. Augmented Reality technology •Combines Real and Virtual Images •Needs: Display technology •Interactive in real-time •Needs: Input and interaction technology •Registered in 3D •Needs: Viewpoint tracking technology
  • 106. Example: MagicLeap ML-1 AR Display •Display • Multi-layered Waveguide display •Tracking • Inside out SLAM tracking •Input • 6DOF wand, gesture input
  • 107. MagicLeap Display • Optical see-through AR display • Overlay graphics directly on real world • 40° x 30° FOV, 1280 x 960 pixels/eye • Waveguide based display • Holographic optical element • Very thin physical display • Two sets of waveguides • Different focal planes • Overcomes vergence/accommodation problem • Eye tracking for selecting focal plane • Separate CPU/GPU unit
  • 108. AR Vergence and Accommodation • Fixed focal distance for OST displays • Accommodation conflict between real and virtual object
  • 109.
  • 110. Tracking • Inside out tracking • Sensors on the user’s head • Using multiple sensors • Time of Flight Depth Sensor • IR dot projector • Wide angle cameras • Internal accelerometer (IMU) • Creates 3D model of real world • Tracks from model
  • 112. Input • Multiple input methods • Handheld Controller • Multiple buttons, trackpad input • 6 DOF magnetic tracking • Eye gaze • Integrated eye tracking • Hand tracking • Natural hand input
  • 116. AR Display Technologies • Classification (Bimber/Raskar 2005) • Head attached • Head mounted display/projector • Body attached • Handheld display/projector • Spatial • Spatially aligned projector/monitor
  • 117. Bimber, O., & Raskar, R. (2005). Spatial augmented reality: merging real and virtual worlds. CRC press. Display Taxonomy
  • 119. Types of Head Mounted Displays Occluded See-thru Multiplexed
  • 120. Optical see-through Head-Mounted Display Virtual images from monitors Real World Optical Combiners
  • 122. Optical Design - Birdbath ▪ Reflect off beam splitter
  • 123. Optical Design – Curved Mirror ▪ Reflect off free-space curved mirror
  • 126. Epson Moverio BT-300 ▪ Stereo see-through display ($700) ▪ 1280 RGB x 720 pixels, 23 degree FOV, 30Hz, 69g ▪ Android Powered, separate controller ▪ VGA camera, GPS, gyro, accelerometer
  • 127. Optical Design - Waveguide • Use prisms/grating elements
  • 130. Example: Sony Smart EyeGlasses https://www.youtube.com/watch?v=kYPWaMsarss
  • 132. AR HMDs • Microsoft HoloLens 2 - $3,500 USD • Wearable computer, 47 degree FOV • Waveguide displays, optical see-through • Vuzix Blade - $1000 USD • 30 degree FOV, optical see-through • Self-contained, Monocular, Android OS • Epson BT 30C - $499 USD • 25 degree FOV, see-through • Tethered display, USB-C connector
  • 133.
  • 134.
  • 135. Pros and Cons of Optical see-through AR • Pros • Simpler design (cheaper) • Direct view of real world • No eye displacement • Socially acceptable (glasses form factor) • Cons • Difficult to occlude real world • Image washout outdoors/bright lights • Wide field of view challenging • Can't delay the real world to match virtual content latency
  • 138. Example: Varjo XR-1 • Wide field of view • 87 degrees • High resolution • 1920 x 1080 pixel/eye • 1440 x 1600 pixel insert • Low latency stereo cameras • 2 x 12 megapixel • < 20 ms delay • Integrated Eye Tracking
  • 139. Varjo XR-1 Image Quality
  • 141. Handheld AR • Camera + display = handheld AR • Mobile phone/Tablet display
  • 142. Pros and Cons of Video See-Through AR • Pros • True occlusion • Digitized image of real world • Registration, calibration, matchable time delay • Wide FOV is easier to support • Cons • Larger, bulkier hardware • Can't see real world with natural eyes
  • 143. Multiplexed Display Virtual Image ‘inset’ into Real World
  • 146. See-Through Display Taxonomy (with example products) • Monocular See-Through Displays • Video See-Through: e.g. smartphone- or tablet-based hand-held AR; also Google Glass in VST mode • Optical See-Through: e.g. Microvision Nomad, DigiLens DL40, TacEye ST, Vuzix M2000AR • Binocular See-Through Displays • Optical See-Through, Monoscopic Overlays: e.g. Lumos DK-40 • Optical See-Through, Stereoscopic Overlays: e.g. Microsoft HoloLens, Epson Moverio BT-200, Vuzix STAR 1200XLD • Video See-Through, Single Camera, Monoscopic Overlays: e.g. Vuzix iWear VR920 with iWear CamAR • Video See-Through, Single Camera, Stereoscopic Overlays: possible, but no clear advantage • Video See-Through, Dual Camera, Monoscopic Overlays: e.g. Trivisio ARVision • Video See-Through, Dual Camera, Stereoscopic Overlays: e.g. Canon COASTAR, Vuzix Wrap 1200DXAR
  • 147. More on Head Mounted Displays • Karl Guttag Blog - https://kguttag.com/
  • 149. Handheld AR • Camera + display = handheld AR • Mobile phone/Tablet display • Video see-through AR
  • 151. User-Perspective Hand-Held Display Handheld display with device perspective Handheld display with user perspective Image: Domagoj Baričević
  • 152.
  • 155. Spatial Augmented Reality • Project onto irregular surfaces • Geometric Registration • Projector blending, High dynamic range • Book: Bimber, Raskar "Spatial Augmented Reality"
  • 156. Lightform • Depth sensor + projector • Create 3D model of space • Deform image mapping • Content creation tools
  • 158. Steerable Projector Image: Claudio Pinhanez, IBM Research Everywhere Projector Display A steerable, tracked projector can display images anywhere
  • 159. Head Mounted Projector • NVIS P-50 HMPD • 1280x1024/eye • Stereoscopic • 50 degree FOV • www.nvis.com
  • 160. HMD vs.HMPD Head Mounted Display Head Mounted Projected Display
  • 161. Tilt5 - https://www.tiltfive.com/ • Stereo head worn projectors • Interactive wand • Roll-able retro-reflective sheet • Designed for shared interaction
  • 162. • Retroreflective roll-able mat • A Lambertian reflector (e.g. unfinished wood) diffuses incident light, a mirror reflects it away, while a retro-reflector returns light back toward its source
  • 165. Video Monitor AR • Video cameras capture the real world • Graphics combiner merges video with virtual imagery • Combined view shown on a monitor, viewed with stereo glasses
  • 167. Magic Mirror AR Experience • See AR overlay of an image of yourself
  • 169. OtherTypes ofAR Display • Audio • spatial sound • ambient audio • Tactile • physical sensation • Haptic • virtual touch
  • 170. Haptic Input • AR Haptic Workbench • CSIRO 2003 – Adcock et al.
  • 172. AR Haptic Interface • Phantom, ARToolKit, Magellan
  • 173. Olfactory Display MetaCookie: An olfactory display is combined with visual augmentation of a plain cookie to provide the illusion of a flavored cookie (chocolate, in the inset). Image: Takuji Narumi