VR INPUT AND SYSTEMS
COMP 4010 Lecture Nine
Mark Billinghurst
October 5th 2021
mark.billinghurst@unisa.edu.au
LECTURE 8 REVIEW
Reality vs. Virtual Reality
• In a VR system there are input and output devices
between human perception and action
Using Technology to Stimulate Senses
• Simulate output
• E.g. simulate real scene
• Map output to devices
• Graphics to HMD
• Use devices to stimulate the senses
• HMD stimulates eyes
Example: Visual Simulation
[Diagram: 3D Graphics → HMD → Vision System → Brain — the human-machine interface]
Key Technologies for VR Systems
• Display (Immersion)
• Stimulate senses
• visual, auditory, tactile sense, etc..
• Tracking (Independence)
• Changing viewpoint
• independent movement
• Input Devices (Interaction)
• Supporting user interaction
• User input
DISPLAY TECHNOLOGY
Creating an Immersive Experience
•Head Mounted Display
• Immerse the eyes
•Projection/Large Screen
• Immerse the head/body
•Future Technologies
• Neural implants
• Contact lens displays, etc
VR Distorted Image
Typical VR HMD FOV
Foveated Displays
• Combine high resolution center with low resolution periphery
Computer Based vs. Mobile VR Displays
Projection/Large Display Technologies
• Room Scale Projection
• CAVE, multi-wall environment
• Dome projection
• Hemisphere/spherical display
• Head/body inside
• Vehicle Simulator
• Simulated visual display in windows
CAVE
• Developed in 1992, EVL University of Illinois Chicago
• Multi-walled stereo projection environment
• Head tracked active stereo
Cruz-Neira, C., Sandin, D. J., DeFanti, T. A., Kenyon, R. V., & Hart, J. C. (1992). The CAVE: audio visual experience automatic virtual environment. Communications of the ACM, 35(6), 64-73.
Vehicle Simulators
• Combine VR displays with vehicle
• Visual displays on windows
• Motion base for haptic feedback
• Audio feedback
• Physical vehicle controls
• Steering wheel, flight stick, etc
• Full vehicle simulation
• Emergencies, normal operation, etc
• Weapon operation
• Training scenarios
Audio Displays
• Spatialization vs. Localization
• Spatialization is the processing of sound signals to make them emanate from a point in space
• This is a technical topic
• Localization is the ability of people to identify the source position of a sound
• This is a human topic, i.e., some people are better at it than others.
Stereo Sound
• Seems to come from inside the user's head
• Follows head motion as user moves head
3D Spatial Sound
• Seems to be external to the head
• Fixed in space when user moves head
• Has reflected sound properties
Head-Related Transfer Functions (HRTFs)
• A set of functions that model how sound from a source at a known location reaches the eardrum
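As a rough illustration of what an HRTF renderer does, here is a minimal C++ sketch: a mono source is convolved with a left/right head-related impulse response (HRIR) pair for one direction. The HRIR arrays are assumed to come from a measured dataset; production systems interpolate between measured directions and usually convolve in the frequency domain for speed.

```cpp
#include <cstddef>
#include <vector>

// One stereo output sample pair.
struct BinauralFrame { float left = 0.0f; float right = 0.0f; };

// Direct-form FIR convolution of a mono signal with the left/right
// HRIRs for one source direction: each output sample is a weighted
// sum of past input samples.
std::vector<BinauralFrame> spatialize(const std::vector<float>& mono,
                                      const std::vector<float>& hrirL,
                                      const std::vector<float>& hrirR)
{
    std::vector<BinauralFrame> out(mono.size());
    for (std::size_t n = 0; n < mono.size(); ++n) {
        for (std::size_t k = 0; k < hrirL.size() && k <= n; ++k)
            out[n].left += mono[n - k] * hrirL[k];
        for (std::size_t k = 0; k < hrirR.size() && k <= n; ++k)
            out[n].right += mono[n - k] * hrirR[k];
    }
    return out;
}
```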
Haptic Feedback
• Greatly improves realism
• Hands and wrists are most important
• High density of touch receptors
• Two kinds of feedback:
• Touch Feedback
• information on texture, temperature, etc.
• Does not resist user contact
• Force Feedback
• information on weight, and inertia.
• Actively resists contact motion
Active Haptics
• Actively resists motion
• Key properties
• Force resistance
• Frequency Response
• Degrees of Freedom
• Latency
Passive Haptics
• Not controlled by system
• Use real props (Styrofoam for walls)
• Pros
• Cheap
• Large scale
• Accurate
• Cons
• Not dynamic
• Limited use
TRACKING TECHNOLOGY
Tracking and Rendering in VR
Tracking fits into the graphics pipeline for VR
Tracking Technologies
§ Active (device sends out signal)
• Mechanical, Magnetic, Ultrasonic
• GPS, Wifi, cell location
§ Passive (device senses world)
• Inertial sensors (compass, accelerometer, gyro)
• Computer Vision
• Marker based, Natural feature tracking
§ Hybrid Tracking
• Combined sensors (e.g. Vision + Inertial)
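A minimal sketch of the sensor-fusion idea behind hybrid tracking, using a complementary filter (an illustrative choice; commercial trackers more often use Kalman-style fusion) to combine a fast-but-drifting gyroscope with a noisy-but-drift-free accelerometer angle:

```cpp
// Complementary filter: integrate the fast gyro rate, then pull the
// result gently toward the accelerometer's gravity-derived angle so
// gyro drift is corrected without inheriting accelerometer noise.
// alpha close to 1 = trust the gyro short-term, the accel long-term.
float fusePitch(float prevPitchDeg, float gyroRateDegPerSec,
                float accelPitchDeg, float dtSec)
{
    const float alpha = 0.98f;
    float integrated = prevPitchDeg + gyroRateDegPerSec * dtSec;
    return alpha * integrated + (1.0f - alpha) * accelPitchDeg;
}
```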
Key Tracking Performance Criteria
• Static Accuracy
• Dynamic Accuracy
• Latency
• Update Rate
• Tracking Jitter
• Signal to Noise Ratio
• Tracking Drift
Mechanical Tracker (Active)
•Idea: mechanical arms with joint sensors
•++: high accuracy, haptic feedback
•-- : cumbersome, expensive
Microscribe Sutherland
Magnetic Tracker (Active)
• Idea: measure the magnetic field from a transmitter at a receiver
• ++: 6DOF, robust
• -- : wired, sensitive to metal, noisy, expensive
• -- : error increases with distance
Flock of Birds (Ascension)
Inertial Tracker (Passive)
• Idea: measure linear acceleration and angular rate (accelerometer/gyroscope)
• ++: no transmitter, cheap, small, high frequency, wireless
• -- : drift, hysteresis, only 3DOF
IS300 (Intersense)
Wii Remote
Optical Tracker (Passive)
• Idea: image processing and computer vision
• Specialized
• Infrared, retro-reflective, stereoscopic
• Monocular-based vision tracking
ART Hi-Ball
Outside-In vs. Inside-Out Tracking
Example: Oculus Quest
• Inside out tracking
• Four cameras on the corners of the display
• Searching for visual features
• On setup creates map of room
Example: Vive Lighthouse Tracking
• Outside-in tracking system
• 2 base stations
• Each with 2 laser scanners, LED array
• Headworn/handheld sensors
• 37 photo-sensors in HMD, 17 in hand
• Additional IMU sensors (500 Hz)
• Performance
• Tracking server fuses sensor samples
• Sampling rate 250 Hz, 4 ms latency
• See http://doc-ok.org/?p=1478
Lighthouse Setup
INPUT TECHNOLOGY
VR Input Devices
• Physical devices that convey information into the application and support interaction in the Virtual Environment
Mapping Between Input and Output
Input
Output
Motivation
• Mouse and keyboard are good for desktop UI tasks
• Text entry, selection, drag and drop, scrolling, rubber banding, …
• 2D mouse for 2D windows
• What devices are best for 3D input in VR?
• Use multiple 2D input devices?
• Use new types of devices?
Input Device Characteristics
• Size and shape, encumbrance
• Degrees of Freedom
• Integrated (mouse) vs. separable (Etch-a-sketch)
• Direct vs. indirect manipulation
• Relative vs. Absolute input
• Relative: measure difference between current and last input (mouse)
• Absolute: measure input relative to a constant point of reference (tablet)
• Rate control vs. position control
• Isometric vs. Isotonic
• Isometric: measure pressure or force with no actual movement
• Isotonic: measure deflection from a center point (e.g. mouse)
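The rate-control vs. position-control distinction can be captured in a few lines; the gains and deadzone below are illustrative assumptions, not values from any particular device:

```cpp
#include <cmath>

// Position control (isotonic device, e.g. mouse): displacement maps
// directly to output position through a control-display gain.
float positionControl(float displacement)
{
    const float cdGain = 2.0f;   // illustrative gain
    return cdGain * displacement;
}

// Rate control (isometric device, e.g. force-sensing joystick):
// applied force maps to output *velocity*, integrated over time.
// A small deadzone stops sensor noise from causing constant creep.
float rateControl(float currentPos, float force, float dtSec)
{
    const float gain = 0.5f;     // illustrative force-to-velocity gain
    const float deadzone = 0.05f;
    float velocity = (std::fabs(force) > deadzone) ? gain * force : 0.0f;
    return currentPos + velocity * dtSec;
}
```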
Hand Input Devices
• Devices that integrate hand input into VR
• World-Grounded input devices
• Devices fixed in real world (e.g. joystick)
• Non-Tracked handheld controllers
• Devices held in hand, but not tracked in 3D (e.g. xbox controller)
• Tracked handheld controllers
• Physical device with 6 DOF tracking inside (e.g. Vive controllers)
• Hand-Worn Devices
• Gloves, EMG bands, rings, or devices worn on hand/arm
• Bare Hand Input
• Using technology to recognize natural hand input
World Grounded Devices
• Devices constrained or fixed in real world
• Not ideal for VR
• Constrains user motion
• Good for VR vehicle metaphor
• Used in location based entertainment (e.g. Disney Aladdin ride)
Disney Aladdin Magic Carpet VR Ride
Eight360 - Nova
• World grounded interface
• Mounted in sphere
• Full 360-degree range of motion
• Vehicle motion simulator
• https://www.eight360.com/
https://www.youtube.com/watch?v=FhZD-X1LWhY
Non-Tracked Handheld Controllers
• Devices held in hand
• Buttons, joysticks, game controllers, etc.
• Traditional video game controllers
• Xbox controller
Tracked Handheld Controllers
• Handheld controller with 6 DOF tracking
• Combines button/joystick input plus tracking
• One of the best options for VR applications
• Physical prop enhancing VR presence
• Providing proprioceptive, passive haptic touch cues
• Direct mapping to real hand motion
HTC Vive Controllers Oculus Touch Controllers
Example: WMR Handheld Controllers
• Windows Mixed Reality Controllers
• Left and right hand
• Combine computer vision + IMU tracking
• Track both in and out of view
• Button input, Vibration feedback
https://www.youtube.com/watch?v=rkDpRllbLII
Cubic Mouse
• Plastic box
• Polhemus Fastrak inside (magnetic 6 DOF tracking)
• 3 translating rods, 6 buttons
• Two handed interface
• Supports object rotation, zooming, cutting plane, etc.
Fröhlich, B., & Plate, J. (2000). The cubic mouse: a new device for three-dimensional input. In Proceedings of the SIGCHI conference on Human Factors in Computing Systems (pp. 526-531). ACM.
Cubic Mouse Video
https://www.youtube.com/watch?v=1WuH7ezv_Gs
Hand Worn Devices
• Devices worn on hands/arms
• Glove, EMG sensors, rings, etc.
• Advantages
• Natural input with potentially rich gesture interaction
• Hands can be held in comfortable positions – no line of sight issues
• Hands and fingers can fully interact with real objects
Facebook EMG Band
https://www.youtube.com/watch?v=WmxLiXAo9ko
Data Gloves
• Bend sensing gloves
• Passive input device
• Detecting hand posture and gestures
• Continuous raw data from bend sensors
• Fibre optic, resistive ink, strain-gauge
• Large DOF output, natural hand output
• Pinch gloves
• Conductive material at fingertips
• Determine if fingertips touching
• Used for discrete input
• Object selection, mode switching, etc.
How Pinch Gloves Work
• Contact between conductive fabric completes a circuit
• Each finger receives voltage in turn (T3 – T7)
• Look for output voltage at different times (see the sketch below)
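A sketch of that time-multiplexed scan in C++; the two I/O functions are hypothetical stand-ins for the glove's actual hardware interface:

```cpp
#include <array>

constexpr int kContacts = 10; // fingertip contacts across both gloves

// Hypothetical hardware stubs; a real driver would write/read the
// glove's GPIO lines here.
void driveContact(int i, bool high) { (void)i; (void)high; }
bool senseContact(int i)            { (void)i; return false; }

// Put voltage on one fingertip at a time and see which other
// fingertips read it back: a completed circuit means those two
// contacts are pinched together.
std::array<std::array<bool, kContacts>, kContacts> scanPinches()
{
    std::array<std::array<bool, kContacts>, kContacts> touching{};
    for (int drive = 0; drive < kContacts; ++drive) {
        driveContact(drive, true);
        for (int sense = 0; sense < kContacts; ++sense)
            if (sense != drive && senseContact(sense))
                touching[drive][sense] = true;
        driveContact(drive, false);
    }
    return touching;
}
```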
StretchSense Gloves
• Wearable motion capture sensors
• Capacitive sensors
• Measure stretch, pressure, bend, shear
• Many applications
• Garments, gloves, etc.
• http://stretchsense.com/
StretchSense Glove Demo
https://www.youtube.com/watch?v=ZDq7fQguFPI
Bare Hands
• Using computer vision to track bare hand input
• Creates compelling sense of Presence, natural interaction
• Challenges need to be solved
• Not having sense of touch
• Line of sight required to sensor
• Fatigue from holding hands in front of sensor
Oculus Quest 2 – Hand Tracking
https://www.youtube.com/watch?v=uztFcEA6Rf0
Non-Hand Input Devices
• Capturing input from other parts of the body
• Head Tracking
• Use head motion for input
• Eye Tracking
• Largely unexplored for VR
• Face Tracking
• Use for lip syncing
• Microphones
• Audio input, speech
• Full-Body tracking
• Motion capture, body movement
Eye Tracking
• Technology
• Shine IR light into eye and look for reflections
• Advantages
• Provides natural hands-free input
• Gaze provides cues as to user attention
• Can be combined with other input technologies
HTC Vive Pro Eye
• HTC Vive Pro with integrated eye-tracking
• Tobii systems eye-tracker
• Easy calibration and set-up
• Auto-calibration software compensates for HMD motion
https://www.youtube.com/watch?v=y_jdjjNrJyk
Facial Tracking
• Mount cameras over mouth for face tracking
• HTC Vive facial tracker
• Stereo IR cameras
• Tracks 38 facial features
• < 10 ms delay
• 60Hz update rate
• Unity/Unreal SDK plugin
https://www.youtube.com/watch?v=fBLwshNHBRo
Full Body Tracking
• Adding full-body input into VR
• Creates illusion of self-embodiment
• Significantly enhances sense of Presence
• Technologies
• Motion capture suit, camera based systems
• Can track large number of significant feature points
Camera Based Motion Capture
• Use multiple cameras
• Reflective markers on body
• E.g. OptiTrack (www.optitrack.com)
• 120 – 360 fps, < 10ms latency, < 1mm accuracy
Optitrack Demo
https://www.youtube.com/watch?v=tBAvjU0ScuI
Wearable Motion Capture: PrioVR
• Wearable motion capture system
• 8 – 17 inertial sensors + wireless data transmission
• 30 – 40m range, 7.5 ms latency, 0.09° precision
• Supports full range of motion, no occlusion
• https://yostlabs.com/priovr/
PrioVR Demo
https://www.youtube.com/watch?v=q72iErtvhNc
Pedestrian Devices
• Pedestrian input in VR
• Walking/running in VR
• Virtuix Omni
• Special shoes
• http://www.virtuix.com
• Cyberith Virtualizer
• Socks + slippery surface
• http://cyberith.com
Virtuix Omni Demo
https://www.youtube.com/watch?v=aOYHg8qdxTE
Omnidirectional Treadmills
• Infinadeck
• 2-axis treadmill, flexible material
• Tracks user to keep them in centre
• Limitless walking input in VR
• www.infinadeck.com
Infinadeck Demo
https://www.youtube.com/watch?v=seML5CQBzP8
Virtusphere
• Fully immersive sphere
• Support walking, running in VR
• Person inside trackball
• http://www.virtusphere.com
Virtusphere Demo
https://www.youtube.com/watch?v=5PSFCnrk0GI
Comparison Between Devices
From Jerald (2015)
COMPLETE VR SYSTEMS
Creating a Good VR Experience
• Creating a good experience requires good system design
• Integrating multiple hardware, software, interaction, content elements
Example: Shard VR Slide
Ride down the Shard at 100 mph - Multi-sensory VR
https://www.youtube.com/watch?v=HNXYoEdBtoU
Key Components to Consider
• Five key components:
• Inputs
• Outputs
• Computation/Simulation
• Content/World database
• User interaction
From: Sherman, W. R., & Craig, A. B. (2018). Understanding virtual reality: Interface, application, and design. Morgan Kaufmann.
Typical VR System
• Combining multiple technology elements for good user experience
• Input devices, output modality, content databases, networking, etc.
From Content to User
[System diagram — Content: 3D models and textures from a modelling program, CAD data via translation; Software: application programming, dynamics generator, renderers (3D graphics, sound); User I/O: input devices (gloves, mic, trackers), output devices (HMD, audio, haptics), user actions (speak, grab)]
Types of VR Graphics Content
• Panoramas
• 360 images/video
• Captured 3D content
• Scanned objects/spaces
• Modelled Content
• Hand created 3D models
• Existing 3D assets
Capturing Panoramas
• Stitching individual photos together
• Image Composite Editor (Microsoft)
• AutoPano (Kolor)
• Using 360 camera
• Ricoh Theta-S
• Insta360
Consumer 360 Capture Devices
Kodak 360, 360fly, Gear 360, Theta S, Nikon, LG 360, Point Grey Ladybug, Panono 360, Bublcam
Example: Cardboard Camera
• Capture 360 panoramas
• Stitch together images on phone
• View in VR on Google Cardboard Viewer
Cardboard Camera
https://www.youtube.com/watch?v=d5lUXZhWaZY
Stereo Video Capture
• Use camera pairs to capture stereo 360 video
• Vuze+ VR camera
• 8 lenses, 4K stereoscopic 3D 360° video and photo, $999 USD
3D Scanning
• A range of products support 3D scanning
• Create point cloud or mesh model
• Typically combine RGB cameras with depth sensing
• Captures texture plus geometry
• Multi-scale
• Object Scanners
• Handheld, Desktop
• Body Scanners
• Rotating platform, multi-camera
• Room scale
• Mobile, tripod mounted
Example: Matterport
• Matterport Pro2 3D scanner
• Room scale scanner, panorama and 3D model
• 360° (left-right) x 300° (vertical) field of view
• Structured light (infrared) 3D sensor
• 15 ft (4.5 m) maximum range
• 4K HDR images
Matterport Pro2 Lite
https://www.youtube.com/watch?v=SjHk0Th-j1I
Handheld/Desktop Scanners
• Capture people/objects
• Sense 3D scanner
• accuracy of 0.90 mm, colour resolution of 1920×1080 pixels
• Occipital Structure sensor
• Add-on to iPad, mesh scanning, IR light projection, 60 Hz
Structure Sensor
https://www.youtube.com/watch?v=7j3HQxUGvq4
3D Modelling
• A variety of 3D modelling tools can be used
• Export in VR compatible file format (.obj, .fbx, etc)
• Especially useful for animation - difficult to create from scans
• Popular tools
• Blender (free), 3DS max, Maya, etc.
• Easy to Use
• Tinkercad, Sketchup Free, Meshmixer, Fusion 360, etc.
Modelling in VR
• Several tools for modelling in VR
• Natural interaction, low polygon count, 3D object viewing
• Low end
• Google Blocks
• High end
• Quill, Tilt Brush – 3D painting
• Gravity Sketch – 3D CAD
Example: Google Blocks
https://www.youtube.com/watch?v=1TX81cRqfUU
Download Existing VR Content
• Many locations for 3D objects, textures, etc.
• Sketchfab, Sketchup, Free3D (www.free3d.com), etc.
• Asset stores - Unity, Unreal
• Provide 3D models, materials, code, etc..
VR Graphics Architecture
• Application Layer
• User interface libraries
• Simulation/behaviour code
• User interaction specification
• Graphics Layer (CPU acceleration)
• Scene graph specification
• Object physics engine
• Specifying graphics objects
• Rendering Layer (GPU acceleration)
• Low level graphics code
• Rendering pixels/polygons
• Interface with graphics card/frame buffer
Traditional 3D Graphics Pipeline
• Low-level code for loading models and showing on screen
• Using shaders and low-level GPU programming to improve graphics
Graphics Challenges with VR
• Higher data throughput (> 7x desktop requirement)
• Lower latency requirements (from 150ms/frame to 20ms)
• HMD Lens distortion
Lens Distortion
• HMD may have cheap lenses
• Creates chromatic aberration and distorted image
• Warp graphics images to create an undistorted view (see the sketch below)
• Use low-level shader programming
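A minimal sketch of the pre-warp math, assuming the common radial polynomial distortion model (coefficients here are illustrative; real values come from the headset's lens calibration):

```cpp
// Radial (barrel) pre-distortion using r' = r(1 + k1*r^2 + k2*r^4).
// Rendering the image pre-warped with coefficients opposite to the
// lens cancels the pincushion distortion the lens introduces.
struct UV { float u, v; };

UV preDistort(UV p, float k1, float k2)
{
    // p is assumed centred on the lens axis, components in [-1, 1].
    float r2 = p.u * p.u + p.v * p.v;
    float s = 1.0f + k1 * r2 + k2 * r2 * r2;
    return { p.u * s, p.v * s };
}
// In practice this runs per-pixel in a fragment shader; chromatic
// aberration is handled by using slightly different coefficients
// for the red, green, and blue channels.
```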
VR System Pipeline
• Using time warping and lens distortion
Perception Based Graphics
• Eye Physiology
• Cones in eye centre = colour vision; rods in periphery = motion, B+W
• Foveated Rendering
• Use eye tracking to draw highest resolution where user looking
• Reduces graphics throughput
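A minimal sketch of the selection logic (thresholds are illustrative assumptions; real systems drive hardware variable-rate shading with similar tests):

```cpp
// Pick a shading/resolution level from the angular distance between
// a screen region and the tracked gaze point.
int shadingLevel(float eccentricityDeg)
{
    if (eccentricityDeg < 5.0f)  return 0; // foveal region: full resolution
    if (eccentricityDeg < 20.0f) return 1; // near periphery: half resolution
    return 2;                              // far periphery: quarter resolution
}
```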
Foveated Rendering
https://www.youtube.com/watch?v=lNX0wCdD2LA
Scene Graphs
• Tree-like structure for organising VR graphics
• e.g. VRML, OSG, X3D
• Hierarchy of nodes that define:
• Groups (and Switches, Sequences etc…)
• Transformations
• Projections
• Geometry
• …
• And states and attributes that define:
• Materials and textures
• Lighting and blending
• …
Example Scene Graph
• Car model with four wheels
• Only need one wheel geometry object in scene graph
More Complex
• Everything off root node
• Parent/child node relationships
• Can move car by transforming group node (see the sketch below)
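A minimal OpenSceneGraph-flavoured sketch of this car (OSG itself is covered later in this lecture); the wheel and body geometry builders are assumed to exist elsewhere:

```cpp
#include <osg/Geode>
#include <osg/MatrixTransform>

// One shared wheel geometry instanced under four transform nodes,
// all under a single group transform so the whole car moves with
// one matrix change.
osg::ref_ptr<osg::Node> buildCar(osg::Geode* wheel, osg::Geode* body)
{
    osg::ref_ptr<osg::MatrixTransform> car = new osg::MatrixTransform;
    car->addChild(body);

    const osg::Vec3 offsets[4] = {
        osg::Vec3( 1.0f,  1.0f, 0.0f), osg::Vec3( 1.0f, -1.0f, 0.0f),
        osg::Vec3(-1.0f,  1.0f, 0.0f), osg::Vec3(-1.0f, -1.0f, 0.0f) };
    for (const osg::Vec3& o : offsets) {
        osg::ref_ptr<osg::MatrixTransform> t = new osg::MatrixTransform;
        t->setMatrix(osg::Matrix::translate(o));
        t->addChild(wheel);        // same Geode shared four times
        car->addChild(t.get());
    }
    // Moving the car = changing one matrix on the group transform.
    car->setMatrix(osg::Matrix::translate(0.0f, 0.0f, 5.0f));
    return car;
}
```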
Adding Cameras and Lights
• Scene graph includes:
• Cameras
• Lighting
• Material properties
• Etc..
• All passed to renderer
Benefits of Using a Scene Graph
• Performance
• Structuring data facilitates optimization
• Culling, state management, etc…
• Hardware Abstraction
• Underlying graphics pipeline is hidden
• No Low-level programming
• Think about objects, not polygons
• Supports Behaviours
• Collision detection, animation, etc..
Scene Graph in the Rendering Pipeline
• Scene graph used to optimize scene creation in pipeline
Scene Graph Libraries
• VRML/X3D
• descriptive text format, ISO standard
• OpenInventor
• based on C++ and OpenGL
• originally Silicon Graphics, 1988
• now supported by VSG3d.com
• Java3D
• provides 3D data structures in Java
• not supported anymore
• Open Scene Graph (OSG)
• Various Game Engines
• e.g. JMonkey 3 (scene graph based game engine for Java)
OpenSceneGraph
• http://www.openscenegraph.org/
• Open-source scene graph implementation
• Based on OpenGL
• Object-oriented C++ following design pattern principles
• Used for simulation, games, research, and industrial projects
• Active development community
• mailing list, documentation (www.osgbooks.com)
• Uses the OSG Public License (similar to LGPL)
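A minimal "hello world" viewer along these lines (cow.osg is the stock OSG sample model; any file an installed plugin can read works the same way):

```cpp
#include <osgDB/ReadFile>
#include <osgViewer/Viewer>

// Load a model through the plugin system and run the default
// viewer loop.
int main()
{
    osg::ref_ptr<osg::Node> scene = osgDB::readNodeFile("cow.osg");
    if (!scene) return 1;          // no plugin could read the file

    osgViewer::Viewer viewer;
    viewer.setSceneData(scene.get());
    return viewer.run();           // creates a window and renders
}
```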
OpenSceneGraph Features
• Plugins for loading and saving
• 3D: 3D Studio (.3ds), OpenFlight (.flt), Wavefront (.obj)…
• 2D: .png, .jpg, .bmp, QuickTime movies
• NodeKits to extend functionality
• osgTerrain - terrain rendering
• osgAnimation - character animation
• osgShadow - shadow framework
• Multi-language support
• C++, Java, Lua and Python
• Cross-platform support:
• Windows, Linux, MacOS, iOS, Android, etc.
OpenSceneGraph Architecture
• Core: scene graph and rendering functionality
• Plugins: read and write 2D image and 3D model files
• NodeKits: extend core functionality, exposing higher-level node types
OpenSceneGraph and Virtual Reality
• Need to create VR wrapper on top of OSG
• Add support for HMDs, device interaction, etc..
• Several viewer nodes available with VR support
• OsgOpenVRViewer: viewing on VR devices compatible with openVR/steamVR
• OsgOculusViewer: OsgViewer with support for the Oculus Rift
Examples
Using OsgOculusViewer, Leap Motion and Oculus Rift HMD
https://www.youtube.com/watch?v=xZgyOF-oT0g
Typical VR Simulation Loop
• User moves head, scene updates, displayed graphics change
• Need to synchronize system to reduce delays
System Delays
[Figure: typical delay from tracking to rendering]
Typical System Delays
• Total Delay = 50 + 2 + 33 + 17 = 102 ms
• 1 ms delay = 1/3 mm error for an object drawn at arm's length
• So total of 33mm error from when user begins moving to when object drawn
[Pipeline diagram: Tracking (x,y,z + r,p,y at 20 Hz = 50 ms) → Calculate Viewpoint (500 Hz = 2 ms) → Render Scene (30 Hz = 33 ms) → Draw to Display (60 Hz = 17 ms), all running in the application loop with the simulation]
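A worked check of that delay budget (the 1 ms ≈ 1/3 mm rule is the slide's figure for an object at arm's length; 102 ms × 1/3 mm/ms ≈ 34 mm, which the slide rounds to ~33 mm):

```cpp
#include <cstdio>

int main()
{
    // track, viewpoint, render, display (ms)
    const double stagesMs[] = { 50.0, 2.0, 33.0, 17.0 };
    double totalMs = 0.0;
    for (double s : stagesMs) totalMs += s;

    const double mmPerMs = 1.0 / 3.0; // slide's rule of thumb
    std::printf("total delay %.0f ms -> ~%.0f mm error\n",
                totalMs, totalMs * mmPerMs); // 102 ms -> ~34 mm
    return 0;
}
```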
Living with High Latency (1/3 sec – 3 sec)
https://www.youtube.com/watch?v=_fNp37zFn9Q
Effects of System Latency
• Degraded Visual Acuity
• Scene still moving when head stops = motion blur
• Degraded Performance
• As latency increases it’s difficult to select objects etc.
• If latency > 120 ms, training doesn’t improve performance
• Breaks-in-Presence
• If system delay high user doesn’t believe they are in VR
• Negative Training Effects
• Users train to operate in a world with delay
• Simulator Sickness
• Latency is greatest cause of simulator sickness
Simulator Sickness
• Visual input conflicting with vestibular system
What Happens When Senses Don’t Match?
• 20-30% VR users experience motion sickness
• Sensory Conflict Theory
• Visual cues don’t match vestibular cues
• Eyes – “I’m moving!”, Vestibular – “No, you’re not!”
Avoiding Motion Sickness
• Better VR experience design
• More natural movements
• Improved VR system performance
• Less tracking latency, better graphics frame rate
• Provide a fixed frame of reference
• Ground plane, vehicle window
• Add a virtual nose
• Provides a peripheral cue
• Eat ginger
• Reduces upset stomach
Many Causes of Simulator Sickness
• 25-40% of VR users get Simulator Sickness, due to:
• Latency
• Major cause of simulator sickness
• Tracking accuracy/precision
• Seeing world from incorrect position, viewpoint drift
• Field of View
• Wide field of view creates more peripheral vection = sickness
• Refresh Rate/Flicker
• Flicker/low refresh rate creates eye fatigue
• Vergence/Accommodation Conflict
• Creates eye strain over time
• Eye separation
• If IPD doesn't match the inter-image distance, it causes discomfort
Motion Sickness
https://www.youtube.com/watch?v=BznbIlW8iqE
The VR Book
• The VR Book: Human-Centered
Design for Virtual Reality
• Jason Jerald
• https://thevrbook.net/
System Design Guidelines - I
• Hardware
• Choose HMDs with fast pixel response time, no flicker
• Choose trackers with high update rates, accurate, no drift
• Choose HMDs that are lightweight, comfortable to wear
• Use hand controllers with no line-of-sight requirements
• System Calibration
• Have virtual FOV match actual FOV of HMD
• Measure and set the user's IPD
• Latency Reduction
• Minimize overall end-to-end system delay
• Use displays with fast response time and low persistence
• Use latency compensation to reduce perceived latency
Jason Jerald, The VR Book, 2016
System Design Guidelines - II
• General Design
• Design for short user experiences
• Minimize visual stimuli closer to eye (vergence/accommodation)
• For binocular displays, do not use 2D overlays/HUDs
• Design for sitting, or provide physical barriers
• Show virtual warning when user reaches end of tracking area
• Motion Design
• Move virtual viewpoint with actual motion of the user
• If latency high, no tasks requiring fast head motion
• Interface Design
• Design input/interaction for user’s hands at their sides
• Design interactions to be non-repetitive to reduce strain injuries
Jason Jerald, The VR Book, 2016
VR PROTOTYPING
XR Prototyping Tools
Low Fidelity (Concept, visual design)
• Sketching
• Photoshop
• PowerPoint
• Video
High Fidelity (Interaction, experience design)
• Interactive sketching
• Desktop & on-device authoring
• Immersive authoring & visual scripting
• XR development toolkits
Sketching VR Interfaces
•Download 360 panorama template grid
•Draw interface ideas into grid
•Scan into 360 photo viewer for VR HMD
See https://virtualrealitypop.com/vr-sketches-56599f99b357
VR Sketch Sheets
Available from https://blog.prototypr.io/vr-paper-prototyping-9e1cab6a75f3
Concept Sheet
Top-down and first-person views
Suitable for rough sketches
Sketching on the Template
Bringing into VR
https://www.youtube.com/watch?v=jyHpqgk58dg
Example: VR Basketball
Final 360 Image
Viewing in VR
https://www.youtube.com/watch?v=_r3VD-IJbvk
VR Storyboarding - Describing the Experience
VR Storyboard Template
Example: VR Storyboard
https://medium.com/cinematicvr/a-storyboard-for-virtual-reality-fa000a9b4497
Sketching in VR
Using VR applications for rapid prototyping
- Intuitive sketching in immersive space
- Creating/testing at 1:1 scale
- Rapid UI design/layout
Examples
- Quill - https://quill.fb.com/
- Tilt Brush - https://www.tiltbrush.com/
Example: Google Tilt Brush
Gravity Sketch
•Intuitive immersive 3D design platform
•Move from sketch to final 3D model render
•Natural 3D UI manipulation
•Two handed input, 3D menus, etc
•Multi-platform
•HMD (Quest, Steam), tablet, etc
•Support for collaboration
Gravity Sketch Demo
https://www.youtube.com/watch?v=vte_2MxW3Js
Scene Assembly
•Assemble assets into 3D scene
•Create high-fidelity view
•Collect user feedback
•Immersive Scene Assembly
•Microsoft Maquette: https://www.maquette.ms/
•Sketchbox: https://www.sketchbox3d.com/
Example: Mocking up a Scene - Twitch VR
https://www.notion.so/Twitch-VR-Prototype-Workflow-5cf65c7bfcd84226a87b1c0db07e46f2
Prototyping Process
1. Collect tools needed
2. Know/imagine the user
3. Storyboard/sketch the experience
4. Create assets needed
5. Create the interface scenes
6. View in VR/record video
2D Asset Creation
•Many possible tools
•Sketch
•MacOS
•https://www.sketch.com/
•Figma
•Chrome plugin
•https://www.figma.com/
Figma Asset Layout
Google Blocks
3D Scene Layout
•Scene layout in VR
•Sketchbox
•https://www.sketchbox3d.com/
•Fast and simple VR prototyping tool
•Collaborative VR design tool
•Export to SketchFab/fbx
Sketchbox Demo
https://www.youtube.com/watch?v=gWfgewGzaEI
Final Scene
https://www.youtube.com/watch?v=qwbFAqpINuY
XR Prototyping Tools
Low Fidelity (Concept, visual design)
• Sketching
• Photoshop
• PowerPoint
• Video
High Fidelity (Interaction, experience design)
• Interactive sketching
• Desktop & on-device authoring
• Immersive authoring & visual scripting
• XR development toolkits
Digital Authoring Tools for VR
• Support visual authoring of 3D scene graphs with VR previews
• Basic interactions can be implemented without coding
• Advanced interactions require JavaScript, C#, or C++
Amazon Sumerian
Unity Editor
Immersive Authoring Tools for VR
• Enable visual authoring of 3D content in VR
• Make it possible to edit while previewing VR experience
• Focus on 3D modeling rather than animation & scripting
• Typically support export to common 3D model formats and asset sharing platforms like Google Poly, Sketchfab, or 3D Warehouse
Google Blocks
Oculus Quill
Interactive 360 Prototyping for VR
•Create 360 images and add interactive elements
•Many possible tools
•InstaVR
•http://www.instavr.co/
•Free, fast panorama VR
•Drag and drop web interface
•Deploy to multi platforms (Quest, Vive, phone, etc)
•VR Direct
•https://www.vrdirect.com/
•Connect multiple 360 scenes
•Instant content update
•EasyVR
•https://www.360easyvr.com/
Using InstaVR
https://www.youtube.com/watch?v=ce4Pww3Up10
VR Visual Programming
• Drag and drop VR development
• Visual Programming for Unity
• VR Easy - http://blog.avrworks.com/
• Key VR functionality (navigation, etc)
• HMD and VR controller support
• Bolt
• Rich visual flow
• Integrated with Unity
• Playmaker - https://hutonggames.com/
• Popular game authoring tool
• Can be combined with VR toolkits
Video Demo - VR Easy
https://www.youtube.com/watch?v=L49ENduYgac
Visual Scripting with Bolt
Example: Changing Material on Collision
High Level Graphics Tools
• Game Engines
• Powerful, need scripting ability
• Unity, Unreal, Cry Engine, etc..
• Combine with VR plugins
• HMDs, input devices, interaction, assets, etc..
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au
