DESIGNING AR SYSTEMS
COMP 4010 Lecture Six
Mark Billinghurst
September 1st 2022
mark.billinghurst@unisa.edu.au
REVIEW
Typical Interaction Design Cycle
Develop alternative prototypes/concepts and compare them, and iterate, iterate, iterate...
Tom Chi’s Prototyping Rules
1. Find the quickest path to experience
2. Doing is the best kind of thinking
3. Use materials that move at the speed of thought to maximize your rate of learning
From Idea to Product (CityViewAR example)
Define Requirements, then Develop and Iterate through the following stages:
1. Sketch Interface
2. Rough Wireframes
3. Interactive Prototype
4. High Fidelity Prototype
5. Developer Coding
6. User Testing
7. Deploy App
XR Prototyping Techniques
Ranging from Lo-Fi/Easy to Hi-Fi/Hard:
• Sketching
• Paper Prototyping
• Video Prototyping
• Wireframing
• Bodystorming
• Wizard of Oz
• Digital Authoring
• Immersive Authoring
• Web-Based Development*
• Cross-Platform Development*
• Native Development*
* requires scripting and 3D programming skills
XR Prototyping Tools
Low Fidelity (Concept, visual design)
• Sketching
• Photoshop
• PowerPoint
• Video
High Fidelity (Interaction, experience design)
• Interactive sketching
• Desktop & on-device authoring
• Immersive authoring & visual scripting
• XR development toolkits
From Sketch to Prototype
Sketch → Storyboard → Wireframe → Mock-up → Interactive Prototype
Buxton’s Key Attributes of Sketching
• Quick: work at the speed of thought
• Timely: always available
• Disposable: inexpensive, little investment
• Plentiful: easy to iterate
• A catalyst: evokes conversations
Storyboarding - Describing the Experience
http://monamishra.com/projects/Argo.html
Key Elements
1. Scenario: Storyboards are based on a scenario or a user story. The
persona or role that corresponds to that scenario is clearly specified
2. Visuals: Each step in the scenario is represented visually in a sequence.
The steps can be sketches, illustrations, or photos.
3. Captions: Each visual has a corresponding caption. The caption
describes the user’s actions, environment, emotional state, device, etc.
Wireframes
It’s about
- Functional specs
- Navigation and interaction
- Functionality and layout
- How interface elements work together
- Defining the interaction flow/experience
Leaving room for the design to be created
Example Wireframe
Mockup
It’s about
- Look and feel
- Building on wireframe
- High fidelity visuals
- Putting together final assets
- Getting feedback on design
Designing AR in VR
https://www.youtube.com/watch?v=TfQJhSJQiaU
Vuforia Studio
• Author animated AR experiences
• Drag and drop content
• Add animations
• Import CAD models
• Combine with IoT sensors
• https://www.ptc.com/en/products/vuforia/vuforia-studio
Mock-up Guidelines
1. Generate final 2D/3D interface elements
2. Replace wireframe UI elements with high quality visuals
3. Use standard AR/VR UI elements
4. Simulate AR/VR views
5. Focus on visual/audio design
6. Collect feedback from target end-users
Sketch vs. Wireframe vs. Mock-up
• Sketch: Low Fidelity (IDEATE)
• Wireframe: Low to Medium Fidelity (FLOW)
• Mock-up: Medium to High Fidelity (VISUALIZE)
Adding Transitions
ShapesXR - www.shapesxr.com/
Immersive VR tool for:
● Design
● Prototyping
● Communication
Key Features
● Assets Library and Importing
● Storyboarding
● Multi-user collaboration
● Export and Sharing
ShapesXR Workflow
XR Prototyping Tools
Low Fidelity (Concept, visual design)
• Sketching
• Photoshop
• PowerPoint
• Video
High Fidelity (Interaction, experience design)
• Interactive sketching
• Desktop & on-device authoring
• Immersive authoring & visual scripting
• XR development toolkits
Digital Authoring Tools for AR
Examples: Vuforia Studio, Lens Studio
• Support visual authoring of marker-based and/or marker-less AR apps
• Provide default markers and support for custom markers
• Typically enable AR previews through an emulator, but need to deploy to an AR device for testing
Immersive Authoring Tools for AR
Examples: Apple Reality Composer, Adobe Aero
• Enable visual authoring of 3D content in AR
• Make it possible to edit while previewing the AR experience in the environment
• Provide basic support for interactive behaviors
• Sometimes support export to WebXR
Interactive Sketching
•Pop App
● Pop - https://marvelapp.com/pop
● Combining sketching and interactivity on mobiles
● Take pictures of sketches, link pictures together
Proto.io
• Web based prototyping tool
• Visual drag and drop interface
• Rich transitions
• Scroll, swipe, buttons, etc
• Deploy on device
• mobile, PC, browser
• Ideal for mobile interfaces
• iOS, Android template
• For low and high fidelity prototypes
AR Visual Programming
• Rapid prototype on desktop
• Deliver on mobile
• Simple interactivity
• Examples
• Zapworks Studio
• https://zap.works/studio/
• Snap Lens Studio
• https://lensstudio.snapchat.com/
• Facebook Spark AR Studio
• https://sparkar.facebook.com/ar-studio/
Creating On Device
•Adobe Aero
•Create AR on mobile devices
•Touch based interaction and authoring
•Only iOS support for now
•https://www.adobe.com/nz/products/aero.html
Apple Reality Composer
• Rapidly create 3D scenes and AR experiences
• Creation on device (iPhone, iPad)
• Drag and drop interface
• Loading 2D/3D content
• Adding simple interactivity
• Anchor content in real world (AR view)
• Planes (vertical, horizontal), faces, images
Development & Testing
• WebXR: THREE.js, AR.js, A-Frame, ...
• Unity / Unreal: SteamVR, AR Foundation, MRTK, ...
• Native SDKs: Oculus, ARKit/ARCore, VIVE, ...
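The web-based branch above ultimately sits on the WebXR Device API. As a rough illustration only (TypeScript; the `any` cast is an assumption because standard DOM typings may not include WebXR), a page can check for and request an immersive AR session like this; frameworks such as A-Frame and AR.js handle this plumbing for you:

```typescript
// Minimal sketch only: assumes a WebXR-capable browser (e.g. Chrome on an
// ARCore phone). 'navigator.xr' is cast to 'any' because standard DOM typings
// may not include the WebXR Device API.
const xr = (navigator as any).xr;

async function startAR(): Promise<void> {
  if (!xr || !(await xr.isSessionSupported('immersive-ar'))) {
    console.log('immersive-ar is not supported on this device/browser');
    return;
  }
  // Request an AR session; frameworks like A-Frame, AR.js, or Three.js then
  // drive rendering from this session's animation frame loop.
  const session = await xr.requestSession('immersive-ar');
  session.addEventListener('end', () => console.log('AR session ended'));
}

startAR();
```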
A-Frame
• Based on Three.js and WebGL
• New HTML tags for 3D scenes
• A-Frame Inspector (not editor)
• Asset management (img, video, audio, & 3D models)
• ECS architecture with many open-source components
• Cross-platform XR
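To make the ECS point concrete, here is a small hedged sketch of a custom A-Frame component; the component name 'spin' and its 'speed' property are made up for illustration, while registerComponent, schema, and tick are real A-Frame APIs:

```typescript
// Hedged sketch: assumes A-Frame is loaded globally via its <script> tag.
declare const AFRAME: any;

AFRAME.registerComponent('spin', {
  schema: { speed: { type: 'number', default: 45 } }, // degrees per second
  tick: function (this: any, time: number, timeDelta: number) {
    // ECS in action: the component adds behaviour to whatever entity it is on.
    this.el.object3D.rotation.y +=
      (this.data.speed * Math.PI / 180) * (timeDelta / 1000);
  }
});

// Used declaratively in the HTML scene, e.g.:
// <a-scene><a-box spin="speed: 90" position="0 1.5 -3" color="tomato"></a-box></a-scene>
```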
Unity
• Started out as game engine
• Has integrated support for many
types of XR apps
• Powerful scene editor
• Asset management & store
• Basically all XR device vendors
provide Unity SDKs
Unity vs. A-Frame
Unity is a game engine and XR dev platform
● De facto standard for XR apps
● Increasing built-in XR support
● Most “XR people” will ask you about your Unity skills :-)
● Support for all XR devices: basically all AR and VR device vendors provide Unity SDKs
A-Frame is a declarative WebXR framework
● Emerging XR app development framework on top of THREE.js
● Good for novice XR designers with a web dev background
● Support for most XR devices: full WebXR support in Firefox, Chrome, & Oculus Browser
XR Toolkits
(Matrix of toolkit support across platforms: Cardboard, ARKit, ARCore, Oculus, VIVE, HoloLens, WMR, WebCam)
Toolkits: A-Frame, AR.js, SteamVR, MRTK, Vuforia, AR Foundation, XR Interaction Toolkit, WebXR
DESIGNING AR SYSTEMS
Design in Interaction Design
Key Design and Prototyping Steps
Good vs. Bad AR Design
https://www.youtube.com/watch?v=YJg02ivYzSs
AR Design Considerations
• 1. Design for Humans
• Use Human Information Processing model
• 2. Design for Different User Groups
• Different users may have unique needs
• 3. Design for the Whole User
• Social, cultural, emotional, physical, cognitive
• 4. Use UI Best Practices
• Adapt known UI guidelines to AR/VR
• 5. Use of Interface Metaphors/Affordances
• Decide best metaphor for AR/VR application
1. Design for Human Information Processing
• High level staged model from Wickens and Carswell (1997)
• Relates perception, cognition, and physical ergonomics
(Stages: Perception → Cognition → Ergonomics)
Design for Perception
• Need to understand perception to design AR
• Visual perception
• Many types of visual cues (stereo, oculomotor, etc.)
• Auditory system
• Binaural cues, vestibular cues
• Somatosensory
• Haptic, tactile, kinesthetic, proprioceptive cues
• Chemical Sensing System
• Taste and smell
Depth Perception Problems
• Without proper depth cues AR interfaces look unreal
Which of these POI are near or far?
Types of Depth Cues
Improving Depth Perception
Cutaways
Occlusion
Shadows
Cutaway Example
• Providing depth perception cues for AR
https://www.youtube.com/watch?v=2mXRO48w_E4
Design for Cognition
• Design for Working and Long-term memory
• Working memory
• Short term storage, Limited storage (~5-9 items)
• Long term memory
• Memory recall triggered by associative cues
• Situational Awareness
• Model of current state of user’s environment
• Used for wayfinding, object interaction, spatial awareness, etc..
• Provide cognitive cues to help with situational awareness
• Landmarks, procedural cues, map knowledge
• Support both ego-centric and exo-centric views
Micro-Interactions
▪ Using mobile phones, people split their attention between the display and the real world
Time Looking at Screen
Oulasvirta, A. (2005). The fragmentation of attention in mobile
interaction, and what to do with it. interactions, 12(6), 16-18.
Dividing Attention to World
• Number of times looking away from mobile screen
Design for Micro Interactions
▪ Design interaction for less than a few seconds
• Tiny bursts of interaction
• One task per interaction
• One input per interaction
▪ Benefits
• Use limited input
• Minimize interruptions
• Reduce attention fragmentation
NHTSA Guidelines - www.nhtsa.gov
For technology in cars:
• Any task by a driver should be interruptible at any time.
• The driver should control the pace of task interactions.
• Tasks should be completable with glances away from the road of less than 2 seconds each
• Cumulative time glancing away from the road should be no more than 12 seconds
Make it Glanceable
• Seek to rigorously reduce information density. Successful designs afford recognition, not reading.
(Figure: bad vs. good glanceable layouts)
Reduce Information Chunks
You are designing for recognition, not reading. Reducing the total number of information chunks will greatly increase the glanceability of your design.
• Design with 3 information chunks: about 4 eye movements at ~230 ms each, ~920 ms total
• Design with 5-6 information chunks: about 8 eye movements, ~1,840 ms total
Ego-centric and Exo-centric views
• Combining ego-centric and exo-centric cues for better situational awareness
Cognitive Issues in Mobile AR
• Information Presentation
• Amount, Representation, Placement, View combination
• Physical Interaction
• Navigation, Direct manipulation, Content creation
• Shared Experience
• Social context, Bodily Configuration, Artifact manipulation, Display space
Li, N., & Duh, H. B. L. (2013). Cognitive issues in mobile augmented reality: an embodied perspective.
In Human factors in augmented reality environments (pp. 109-135). Springer, New York, NY.
Information Presentation
• Consider
• The amount of information
• Clutter, complexity
• The representation of information
• Navigation cues, POI representation
• The placement of information
• Head, body, world stabilized
• Using view combinations
• Multiple views
Example: Twitter 360
• iPhone application
• See geo-located tweets in real world
• Twitter.com supports geo tagging
But: Information Clutter from Many Tweets
(Example screenshot: the view fills with dozens of overlapping “Blah” tweet labels, illustrating clutter)
Solution: Information Filtering
Information Filtering (before vs. after filtering)
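As a hedged illustration of the idea (not the actual Twitter 360 implementation), a simple filter might keep only nearby, high-relevance points of interest and cap the number of labels drawn; all types and thresholds below are assumptions:

```typescript
// Illustrative sketch of one simple information-filtering strategy: keep only
// nearby, relevant points of interest and cap how many labels are drawn.
interface POI { id: string; distanceM: number; relevance: number } // relevance 0..1

function filterPOIs(pois: POI[], maxDistanceM = 500, maxLabels = 8): POI[] {
  return pois
    .filter(p => p.distanceM <= maxDistanceM)  // drop far-away points
    .sort((a, b) => b.relevance - a.relevance) // most relevant first
    .slice(0, maxLabels);                      // hard cap on visible labels
}

// e.g. 200 geo-located tweets in view -> at most 8 labels within 500 m shown.
```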
Outdoor AR: Limited FOV
• Show POI outside FOV
• Zooms between map and panorama views
Zooming Views
https://www.youtube.com/watch?v=JLxLH9Cya20
Design for Physical Ergonomics
• Design for the human motion range
• Consider human comfort and natural posture
• Design for hand input
• Coarse and fine scale motions, gripping and grasping
• Avoid “Gorilla arm syndrome” from holding arm pose
Gorilla Arm in AR
• Design interface to reduce mid-air gestures
XRgonomics
• Uses physiological model to calculate ergonomic interaction cost
• Difficulty of reaching points around the user
• Customizable for different users
• Programmable API, Hololens demonstrator
• GitHub Repository
• https://github.com/joaobelo92/xrgonomics
Evangelista Belo, J. M., Feit, A. M., Feuchtner, T., & Grønbæk, K. (2021, May). XRgonomics: Facilitating the Creation of
Ergonomic 3D Interfaces. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-11).
XRgonomics
https://www.youtube.com/watch?v=cQW9jfVXf4g
2. Designing for Different User Groups
• Design for Different Ages
• Children require different interface design than adults
• Older users have different needs than younger users
• Prior Experience with AR systems
• Familiar with HMDs, AR input devices
• People with Different Physical Characteristics
• Height and arm reach, handedness
• Perceptual, Cognitive and Motor Abilities
• Colour perception varies between people
• Spatial ability, cognitive or motor disabilities
Designing for Children
• HMDs
• interpupillary distance, head fit, size and weight
• Tablets
• Poor dexterity, need to hold large tablet
• Content
• Reading ability, spatial perception
3. Design for the Whole User
Consider Your User
• Consider context of user
• Physical, social, emotional, cognitive, etc.
• Mobile Phone AR User
• Probably Mobile
• One hand interaction
• Short application use
• Need to be able to multitask
• Use in outdoor or indoor environment
• Want to enhance interaction with real world
Would you wear this HMD?
Whole User Needs
• Social
• Don’t make your user look stupid
• Cultural
• Follow local cultural norms
• Physical
• Can the user physically use the interface?
• Cognitive
• Can the user understand how the interface works?
• Emotional
• Make the user feel good and in control
Example: Social Acceptance
• People don’t want to look silly
• Only 12% of 4,600 adults would be willing to wear AR glasses
• 20% of mobile AR browser users experience social issues
• Acceptance more due to Social than Technical issues
• Needs further study (ethnographic, field tests, longitudinal)
TAT AugmentedID
4. Use UI Best Practices
• General UI design principles can be applied to AR
• E.g. Shneiderman’s UI guidelines from 1998
• Providing interface feedback
• Mixture of reactive, instrumental and operational feedback
• Maintain spatial and temporal correspondence
• Use constraints
• Specify relations between variables that must be satisfied
• E.g. physical constraints reduce freedom of movement
• Support Two-Handed control
• Use Guiard’s framework of bimanual manipulation
• Dominant vs. non-dominant hands
Follow Good HCI Principles
• Provide good conceptual model/Metaphor
• customers want to understand how UI works
• Make things visible
• if object has function, interface should show it
• Map interface controls to customer's model
• infix -vs- postfix calculator -- whose model?
• Provide feedback
• what you see is what you get!
Example: Guiard’s model of bimanual manipulation
Guiard, Y. (1987). Asymmetric division of labor in human skilled bimanual action: The kinematic chain as a model. Journal of Motor Behavior, 19, 486-517.
• Non-dominant hand: leads, sets the spatial reference frame, performs coarse motions
• Dominant hand: follows, works within that reference frame, performs fine motions
Adapting Existing Guidelines
• Mobile Phone AR
• Phone HCI Guidelines
• Mobile HCI Guidelines
• HMD Based AR
• 3D User Interface Guidelines
• VR Interface Guidelines
• Desktop AR
• Desktop UI Guidelines
Example: Apple iOS Interface Guidelines
• Make it obvious how to use your content.
• Avoid clutter, unused blank space, and busy backgrounds.
• Minimize required user input.
• Express essential information succinctly.
• Provide a fingertip-sized target for all controls.
• Avoid unnecessary interactivity.
• Provide feedback when necessary
From: https://developer.apple.com/ios/human-interface-guidelines/
Applying Principles to Mobile AR
• Clean
• Large Video View
• Large Icons
• Text Overlay
• Feedback
•Interface Components
• Physical components
• Display elements
• Visual/audio
• Interaction metaphors
(AR interface components: Input → Physical Elements → Interaction Metaphor → Display Elements → Output)
5. Use Interface Metaphors
AR Interfaces (ordered by increasing expressiveness and intuitiveness)
• Browsing: simple input, viewpoint control
• 3D AR: 3D UI, dedicated controllers, custom devices
• Tangible UI: augmented surfaces, object interaction, familiar controllers, indirect interaction
• Tangible AR: tangible input, AR overlay, direct interaction
• Natural AR: freehand gesture, speech, gaze
AR Interfaces (recap): Browsing, 3D AR, Tangible UI, Tangible AR, Natural AR
Design for Layers
Information Layers
• Head-stabilized
• Heads-up display
• Body-stabilized
• E.g., virtual tool-belt
• World-stabilized
• E.g., billboard or signpost
Head stabilized
• Information attached to view – always visible
Body Stabilized
• Information moves with person
Body Stabilized Interface
• Elements you want always available, but not always visible
World Stabilized
• Information fixed in world
• Elements you want fixed relative to real world objects
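As a hedged sketch of how these layers can be expressed in code (using A-Frame from the earlier toolkit discussion; the element IDs, offsets, and content are assumptions), head-stabilized content is parented to the camera while world-stabilized content sits at fixed scene coordinates:

```typescript
// Hedged sketch: assumes a scene containing <a-camera id="camera">.
const camera = document.querySelector('#camera')!;
const scene = document.querySelector('a-scene')!;

// Head-stabilized: HUD text that stays in view wherever the user looks.
const hud = document.createElement('a-text');
hud.setAttribute('value', 'battery 80%');
hud.setAttribute('position', '0 -0.4 -1'); // fixed offset in front of the eyes
camera.appendChild(hud);

// World-stabilized: a signpost fixed at a location in the environment.
const sign = document.createElement('a-box');
sign.setAttribute('position', '2 1 -5');   // fixed world coordinates
scene.appendChild(sign);

// Body-stabilized content (e.g. a virtual tool-belt) would follow the user's
// body but not their head rotation; without a body tracker it is often
// approximated by following only the camera's position each frame.
```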
“Diegetic UI”
• Integrated with world
Example: Fragments
• UI Elements embedded
in real world
• Real world occlusion
Fragments Demo
https://www.youtube.com/watch?v=kBGWZztPZ4A
Design to Device Constraints
• Understand the platform and design for limitations
• Hardware, software platforms
• E.g. Handheld AR game with visual tracking
• Use large screen icons
• Consider screen reflectivity
• Support one-hand interaction
• Consider the natural viewing angle
• Do not tire users out physically
• Do not encourage fast actions
• Keep at least one tracking surface in view
Art of Defense Game
Handheld AR Constraints/Affordances
• Camera and screen are linked
• Fast motions a problem when looking at screen
• Intuitive “navigation”
• Phone in hand
• Two handed activities: awkward or intuitive
• Extended periods of holding phone tiring
• Awareness of surrounding environment
• Small screen
• Extended periods of looking at screen tiring
• In general, small awkward platform
• Vibration, sound
• Can provide feedback when looking elsewhere
Common Mobile AR Metaphors
• Tangible AR Lens Viewing
• Look through screen into AR scene
• Interact with screen to interact with AR content
• Touch screen input
• E.g. Invisible Train
• Metaphor – holding a window into the AR world
The Invisible Train
https://www.youtube.com/watch?v=6LE98k0YMLM
Common Mobile AR Metaphors
• Tangible AR Lens Manipulation
• Select AR object and attach to device
• Physically move phone to move AR object
• Use motion of device as input
• E.g. AR Lego
• Metaphor – phone as a physical handle for the AR object
• https://www.youtube.com/watch?v=icmqv32HEPU
AR Interfaces (recap): Browsing, 3D AR, Tangible UI, Tangible AR, Natural AR
Design for Affordances
Tangible AR Metaphor
• AR overcomes limitations of TUIs
• enhance display possibilities
• merge task/display space
• provide public and private views
• TUI + AR = Tangible AR
• Apply TUI methods to AR interface design
Tangible AR Design Principles
• Tangible AR Interfaces use TUI principles
• Physical controllers for moving virtual content
• Support for spatial 3D interaction techniques
• Time and space multiplexed interaction
• Support for multi-handed interaction
• Match object affordances to task requirements
• Support parallel activity with multiple objects
• Allow collaboration between multiple users
AR Design Space
Reality ↔ Augmented Reality ↔ Virtual Reality
Physical Design ↔ Virtual Design
Affordances
“… the perceived and actual properties of the thing, primarily those fundamental properties that determine just how the thing could possibly be used. […] Affordances provide strong clues to the operations of things.”
(Norman, The Psychology of Everyday Things, 1988, p. 9)
Affordances
Affordance Matrix
(Examples: real door, fake door, hidden door, no door)
Physical vs. Virtual Affordances
• Physical Affordance
• Look and feel of real objects
• Shape, texture, colour, weight, etc.
• Industrial Design
• Virtual Affordance
• Look of virtual objects
• Copy real objects
• Interface Design
• AR design is a mixture of physical and virtual affordances
• Physical: tangible controllers and objects
• Virtual: virtual graphics and audio
Affordances in AR
• Design AR interface objects to show how they are used
• Use visual and physical cues to show possible affordances
• Perceived affordances should match actual affordances
• Physical and virtual affordances should match
Examples: Merge Cube, tangible molecules
Case Study 1: 3D AR Lens
Goal: Develop a lens based AR interface
• MagicLenses
• Developed at Xerox PARC in 1993
• View a region of the workspace differently to the rest
• Overlap MagicLenses to create composite effects
3D MagicLenses
• MagicLenses extended to 3D (Viega et al., 1996)
• Volumetric and flat lenses
AR Lens Design Principles
• Physical Components
• Lens handle
• Virtual lens attached to real object
• Display Elements
• Lens view
• Reveal layers in dataset
• Interaction Metaphor
• Physically holding lens
3D AR Lenses: Model Viewer
• Displays models made up of multiple parts
• Each part can be shown or hidden through the lens
• Allows the user to peer inside the model
• Maintains focus + context
AR Lens Demo
AR Lens Demo
AR Lens Implementation
• Uses the stencil buffer to mask the lens region
• Different content is rendered inside vs. outside the lens (virtual magnifying glass)
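A hedged sketch of the same stencil-buffer idea in modern Three.js (not the original implementation, which was built on a marker-tracking AR toolkit); material names, sizes, and colours are assumptions:

```typescript
import * as THREE from 'three';

// The lens quad first marks pixels in the stencil buffer; the "inside the
// lens" content is then drawn only where that mask is set.
const renderer = new THREE.WebGLRenderer({ stencil: true });

// 1. Lens mask: an invisible quad that writes 1 into the stencil buffer.
const maskMaterial = new THREE.MeshBasicMaterial({ colorWrite: false, depthWrite: false });
maskMaterial.stencilWrite = true;
maskMaterial.stencilRef = 1;
maskMaterial.stencilFunc = THREE.AlwaysStencilFunc;
maskMaterial.stencilZPass = THREE.ReplaceStencilOp;
const lensQuad = new THREE.Mesh(new THREE.PlaneGeometry(0.2, 0.2), maskMaterial);
lensQuad.renderOrder = 1; // draw the mask before the inside-lens content

// 2. Inside-the-lens layer: only rendered where the stencil value equals 1.
const insideMaterial = new THREE.MeshStandardMaterial({ color: 0x44aaff });
insideMaterial.stencilWrite = true;
insideMaterial.stencilRef = 1;
insideMaterial.stencilFunc = THREE.EqualStencilFunc;
const hiddenLayer = new THREE.Mesh(new THREE.BoxGeometry(0.1, 0.1, 0.1), insideMaterial);
hiddenLayer.renderOrder = 2;

// Everything else is rendered with stencilWrite left off, so it is unaffected;
// moving the tracked lensQuad reveals hiddenLayer only through the lens,
// like a virtual magnifying glass.
```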
Case Study 2: LevelHead
• Block based game
Case Study 2: LevelHead
• Physical Components
• Real blocks
• Display Elements
• Virtual person and rooms
• Interaction Metaphor
• Blocks are rooms
Level Head Demo
Case Study 3: AR Chemistry (Fjeld 2002)
• Tangible AR chemistry education
Goal: An AR application to teach
molecular structure in chemistry
•Physical Components
• Real book, rotation cube, scoop, tracking
markers
•Display Elements
• AR atoms and molecules
• Interaction Metaphor
• Build your own molecule
AR Chemistry Input Devices
AR Chemistry Demo
Case Study 4: Transitional Interfaces
Goal: An AR interface supporting
transitions from reality to virtual reality
•Physical Components
• Real book
•Display Elements
• AR and VR content
• Interaction Metaphor
• Book pages hold virtual scenes
Milgram’s Continuum (1994)
Reality (Tangible Interfaces) → Augmented Reality (AR) → Augmented Virtuality (AV) → Virtuality (Virtual Reality)
Mixed Reality (MR) spans everything between the two extremes.
Central Hypothesis
• The next generation of interfaces will support
transitions along the Reality-Virtuality continuum
Transitions
• Interfaces of the future will need to support
transitions along the RV continuum
• Augmented Reality is preferred for:
• co-located collaboration
• Immersive Virtual Reality is preferred for:
• experiencing world immersively (egocentric)
• sharing views
• remote collaboration
The MagicBook
•Design Goals:
•Allows user to move smoothly
between reality and virtual reality
•Support collaboration
MagicBook Metaphor
MagicBook Demo
Features
• Seamless transition from Reality to Virtuality
• Reliance on real decreases as virtual increases
• Supports egocentric and exocentric views
• User can pick appropriate view
• Computer becomes invisible
• Consistent interface metaphors
• Virtual content seems real
• Supports collaboration
Collaboration in MagicBook
• Collaboration on multiple levels:
• Physical Object
• AR Object
• Immersive Virtual Space
• Egocentric + exocentric collaboration
• multiple multi-scale users
• Independent Views
• Privacy, role division, scalability
Technology
• Reality
• No technology
• Augmented Reality
• Camera – tracking
• Switch – fly in
• Virtual Reality
• Compass – tracking
• Press pad – move
• Switch – fly out
Summary
•When designing AR interfaces, think of:
• Physical Components
• Physical affordances
• Virtual Components
• Virtual affordances
• Interface Metaphors
• Tangible AR or similar
Mobile AR Game Design
Demo: Roku’s Reward
AR INTERFACE DESIGN GUIDELINES
Design Guidelines
• By Vendors: platform driven
• By Designers: user oriented
• By Practitioners: experience based
• By Researchers: empirically derived
Design Patterns
“Each pattern describes a problem which occurs
over and over again in our environment, and then
describes the core of the solution to that problem in
such a way that you can use this solution a million
times over, without ever doing it the same way twice.”
– Christopher Alexander et al.
Use Design Patterns to Address Reoccurring Problems
C.A. Alexander, A Pattern Language, Oxford Univ. Press, New York, 1977.
Example UI Design Patterns
• http://ui-patterns.com/patterns
Design Patterns for Handheld AR
• Set of design patterns for Handheld AR
• Title: a short phrase that is memorable.
• Definition: what experiences the prepattern supports
• Description: how and why the prepattern works,
what aspects of game design it is based on.
• Examples: Illustrate the meaning of the pre-pattern.
• Using the pre-patterns: reveal the challenges and
context of applying the pre-patterns.
Xu, Y., Barba, E., Radu, I., Gandy, M., Shemaka, R., Schrank, B., ... & Tseng, T.
(2011, October). Pre-patterns for designing embodied interactions in handheld
augmented reality games. In 2011 IEEE International Symposium on Mixed and
Augmented Reality-Arts, Media, and Humanities (pp. 19-28). IEEE.
Handheld AR Design Patterns (Title: Meaning (Embodied Skills*))
• Device Metaphors: using metaphor to suggest available player actions (Body A&S, naïve physics)
• Control Mapping: intuitive mapping between physical and digital objects (Body A&S, naïve physics)
• Seamful Design: making sense of and integrating the technological seams through game design (Body A&S)
• World Consistency: whether the laws and rules in the physical world hold in the digital world (naïve physics, Environmental A&S)
• Landmarks: reinforcing the connection between digital-physical space through landmarks (Environmental A&S)
• Personal Presence: the way that a player is represented in the game decides how much they feel like living in the digital game world (Environmental A&S, naïve physics)
• Living Creatures: game characters that are responsive to physical and social events, mimicking behaviours of living beings (Social A&S, Body A&S)
• Body Constraints: movement of one's body position constrains another player's action (Body A&S, Social A&S)
• Hidden Information: information that can be hidden and revealed can foster emergent social play (Social A&S, Body A&S)
*A&S = awareness and skills
Example: Seamless Design
• Design to reduce seams in the user experience
• E.g., AR tracking failure, change in interaction mode
• Paparazzi Game
• Change between AR tracking to accelerometer input
Xu, Y., et al. (2011). Pre-patterns for designing embodied interactions in handheld augmented reality games. In Proceedings of the 2011 IEEE International Symposium on Mixed and Augmented Reality - Arts, Media, and Humanities (pp. 19-28).
Demo: Paparazzi Game
• https://www.youtube.com/watch?v=MIGH5WGMnbs
Example: Living Creatures
• Virtual creatures should respond to real world events
• e.g., player motion, wind, light, etc.
• Creates illusion creatures are alive in the real world
• Sony EyePet
• Responds to player blowing on creature
Google ARCore Interface Guidelines
https://developers.google.com/ar/design
ARCore Elements App
• Mobile AR app demonstrating
interface guidelines
• Multiple Interface Guidelines
• User interface
• User environment
• Object manipulation
• Off-screen markers
• Etc..
• Test on Device
• https://play.google.com/store/apps/details?id=com.google.ar.unity.ddelements
ARCore Elements
• https://www.youtube.com/watch?v=pRHmLuXIs0s
ARKit Interface Guidelines
• developer.apple.com/design/human-interface-guidelines/ios/system-capabilities/augmented-reality/
Microsoft Mixed Reality Design Guidelines
• https://docs.microsoft.com/en-us/windows/mixed-reality/design/design
MRTK Interface Examples
• Examples of UX Building Blocks
• http://aka.ms/MRTK
The Trouble with AR Design Guidelines
1) Rapidly evolving best practices
Still a moving target, lots to learn about AR design
Slowly emerging design patterns, but often change with OS updates
Already major differences between device platforms
2) Challenges with scoping guidelines
Often too high level, like “keep the user safe and comfortable”
Or, too application/device/vendor-specific
3) Best guidelines come from learning by doing
Test your designs early and often, learn from your own “mistakes”
Mind differences between VR and AR, but less so between devices
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au
More Related Content

What's hot

Empathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseEmpathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseMark Billinghurst
 
Comp4010 Lecture5 Interaction and Prototyping
Comp4010 Lecture5 Interaction and PrototypingComp4010 Lecture5 Interaction and Prototyping
Comp4010 Lecture5 Interaction and PrototypingMark Billinghurst
 
COMP 4010 Lecture 3 VR Input and Systems
COMP 4010 Lecture 3 VR Input and SystemsCOMP 4010 Lecture 3 VR Input and Systems
COMP 4010 Lecture 3 VR Input and SystemsMark Billinghurst
 
2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VRMark Billinghurst
 
Comp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface DesignComp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface DesignMark Billinghurst
 
Comp4010 Lecture8 Introduction to VR
Comp4010 Lecture8 Introduction to VRComp4010 Lecture8 Introduction to VR
Comp4010 Lecture8 Introduction to VRMark Billinghurst
 
Comp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsComp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsMark Billinghurst
 
Comp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and SystemsComp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and SystemsMark Billinghurst
 
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationTalk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationMark Billinghurst
 
Empathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseEmpathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseMark Billinghurst
 
Empathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseEmpathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseMark Billinghurst
 
Comp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsComp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsMark Billinghurst
 
Comp4010 lecture3-AR Technology
Comp4010 lecture3-AR TechnologyComp4010 lecture3-AR Technology
Comp4010 lecture3-AR TechnologyMark Billinghurst
 
COMP 4010 - Lecture4 VR Technology - Visual and Haptic Displays
COMP 4010 - Lecture4 VR Technology - Visual and Haptic DisplaysCOMP 4010 - Lecture4 VR Technology - Visual and Haptic Displays
COMP 4010 - Lecture4 VR Technology - Visual and Haptic DisplaysMark Billinghurst
 
Novel Interfaces for AR Systems
Novel Interfaces for AR SystemsNovel Interfaces for AR Systems
Novel Interfaces for AR SystemsMark Billinghurst
 
Application in Augmented and Virtual Reality
Application in Augmented and Virtual RealityApplication in Augmented and Virtual Reality
Application in Augmented and Virtual RealityMark Billinghurst
 
Comp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research DirectionsComp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research DirectionsMark Billinghurst
 
COMP 4010 Lecture9 AR Interaction
COMP 4010 Lecture9 AR InteractionCOMP 4010 Lecture9 AR Interaction
COMP 4010 Lecture9 AR InteractionMark Billinghurst
 
Comp4010 2021 Lecture2-Perception
Comp4010 2021 Lecture2-PerceptionComp4010 2021 Lecture2-Perception
Comp4010 2021 Lecture2-PerceptionMark Billinghurst
 

What's hot (20)

Empathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseEmpathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the Metaverse
 
Comp4010 Lecture5 Interaction and Prototyping
Comp4010 Lecture5 Interaction and PrototypingComp4010 Lecture5 Interaction and Prototyping
Comp4010 Lecture5 Interaction and Prototyping
 
COMP 4010 Lecture 3 VR Input and Systems
COMP 4010 Lecture 3 VR Input and SystemsCOMP 4010 Lecture 3 VR Input and Systems
COMP 4010 Lecture 3 VR Input and Systems
 
2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR
 
Comp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface DesignComp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface Design
 
Comp4010 Lecture8 Introduction to VR
Comp4010 Lecture8 Introduction to VRComp4010 Lecture8 Introduction to VR
Comp4010 Lecture8 Introduction to VR
 
Comp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsComp4010 lecture11 VR Applications
Comp4010 lecture11 VR Applications
 
Comp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and SystemsComp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and Systems
 
Designing Usable Interface
Designing Usable InterfaceDesigning Usable Interface
Designing Usable Interface
 
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationTalk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
 
Empathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseEmpathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole Metaverse
 
Empathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseEmpathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader Metaverse
 
Comp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsComp4010 lecture11 VR Applications
Comp4010 lecture11 VR Applications
 
Comp4010 lecture3-AR Technology
Comp4010 lecture3-AR TechnologyComp4010 lecture3-AR Technology
Comp4010 lecture3-AR Technology
 
COMP 4010 - Lecture4 VR Technology - Visual and Haptic Displays
COMP 4010 - Lecture4 VR Technology - Visual and Haptic DisplaysCOMP 4010 - Lecture4 VR Technology - Visual and Haptic Displays
COMP 4010 - Lecture4 VR Technology - Visual and Haptic Displays
 
Novel Interfaces for AR Systems
Novel Interfaces for AR SystemsNovel Interfaces for AR Systems
Novel Interfaces for AR Systems
 
Application in Augmented and Virtual Reality
Application in Augmented and Virtual RealityApplication in Augmented and Virtual Reality
Application in Augmented and Virtual Reality
 
Comp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research DirectionsComp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research Directions
 
COMP 4010 Lecture9 AR Interaction
COMP 4010 Lecture9 AR InteractionCOMP 4010 Lecture9 AR Interaction
COMP 4010 Lecture9 AR Interaction
 
Comp4010 2021 Lecture2-Perception
Comp4010 2021 Lecture2-PerceptionComp4010 2021 Lecture2-Perception
Comp4010 2021 Lecture2-Perception
 

Similar to 2022 COMP4010 Lecture 6: Designing AR Systems

Mobile AR Lecture 3 - Prototyping
Mobile AR Lecture 3 - PrototypingMobile AR Lecture 3 - Prototyping
Mobile AR Lecture 3 - PrototypingMark Billinghurst
 
Storytelling using Immersive Technologies
Storytelling using Immersive TechnologiesStorytelling using Immersive Technologies
Storytelling using Immersive TechnologiesKumar Ahir
 
Workshop: AR Glasses and their Peculiarities
Workshop: AR Glasses and their PeculiaritiesWorkshop: AR Glasses and their Peculiarities
Workshop: AR Glasses and their PeculiaritiesMartin Lechner
 
The Glass Class: Rapid Prototyping for Wearable Computers
The Glass Class: Rapid Prototyping for Wearable ComputersThe Glass Class: Rapid Prototyping for Wearable Computers
The Glass Class: Rapid Prototyping for Wearable ComputersMark Billinghurst
 
SEARIS 2014 Keynote - MiddleVR - Philosophy and architecture
SEARIS 2014 Keynote - MiddleVR - Philosophy and architectureSEARIS 2014 Keynote - MiddleVR - Philosophy and architecture
SEARIS 2014 Keynote - MiddleVR - Philosophy and architectureSebastien Kuntz
 
MHIT 603: Lecture 3 - Prototyping Tools
MHIT 603: Lecture 3 - Prototyping ToolsMHIT 603: Lecture 3 - Prototyping Tools
MHIT 603: Lecture 3 - Prototyping ToolsMark Billinghurst
 
2016 AR Summer School - Lecture4
2016 AR Summer School - Lecture42016 AR Summer School - Lecture4
2016 AR Summer School - Lecture4Mark Billinghurst
 
Impact of Adobe Edge Tools and Services in Higher Education
Impact of Adobe Edge Tools and Services in Higher EducationImpact of Adobe Edge Tools and Services in Higher Education
Impact of Adobe Edge Tools and Services in Higher EducationJoseph Labrecque
 
Intro to User Centered Design Workshop
Intro to User Centered Design WorkshopIntro to User Centered Design Workshop
Intro to User Centered Design WorkshopPatrick McNeil
 
Adapting Expectations to Fit a Mobile Workflow
Adapting Expectations to Fit a Mobile WorkflowAdapting Expectations to Fit a Mobile Workflow
Adapting Expectations to Fit a Mobile WorkflowJoseph Labrecque
 
How ABB shapes the future of industry with Microsoft HoloLens and Unity - Uni...
How ABB shapes the future of industry with Microsoft HoloLens and Unity - Uni...How ABB shapes the future of industry with Microsoft HoloLens and Unity - Uni...
How ABB shapes the future of industry with Microsoft HoloLens and Unity - Uni...Unity Technologies
 
The Wikitude SDK and the Wikitude Studio
The Wikitude SDK and the Wikitude StudioThe Wikitude SDK and the Wikitude Studio
The Wikitude SDK and the Wikitude StudioMartin Lechner
 
When the Developer Must Design
When the Developer Must DesignWhen the Developer Must Design
When the Developer Must DesignAndrew Malek
 
Prototyping is the panacea
Prototyping is the panaceaPrototyping is the panacea
Prototyping is the panaceaMichael Meikson
 
Beginners guide to creating mobile apps
Beginners guide to creating mobile appsBeginners guide to creating mobile apps
Beginners guide to creating mobile appsJames Quick
 
COMP 4026 Lecture3 Prototyping and Evaluation
COMP 4026 Lecture3 Prototyping and EvaluationCOMP 4026 Lecture3 Prototyping and Evaluation
COMP 4026 Lecture3 Prototyping and EvaluationMark Billinghurst
 
Mobile Design at Gilt
Mobile Design at GiltMobile Design at Gilt
Mobile Design at GiltDavid Park
 
[I3 d]04 interactivity
[I3 d]04 interactivity[I3 d]04 interactivity
[I3 d]04 interactivityjylee_kgit
 

Similar to 2022 COMP4010 Lecture 6: Designing AR Systems (20)

Mobile AR Lecture 3 - Prototyping
Mobile AR Lecture 3 - PrototypingMobile AR Lecture 3 - Prototyping
Mobile AR Lecture 3 - Prototyping
 
Storytelling using Immersive Technologies
Storytelling using Immersive TechnologiesStorytelling using Immersive Technologies
Storytelling using Immersive Technologies
 
Workshop: AR Glasses and their Peculiarities
Workshop: AR Glasses and their PeculiaritiesWorkshop: AR Glasses and their Peculiarities
Workshop: AR Glasses and their Peculiarities
 
Xamarin tools
Xamarin toolsXamarin tools
Xamarin tools
 
The Glass Class: Rapid Prototyping for Wearable Computers
The Glass Class: Rapid Prototyping for Wearable ComputersThe Glass Class: Rapid Prototyping for Wearable Computers
The Glass Class: Rapid Prototyping for Wearable Computers
 
SEARIS 2014 Keynote - MiddleVR - Philosophy and architecture
SEARIS 2014 Keynote - MiddleVR - Philosophy and architectureSEARIS 2014 Keynote - MiddleVR - Philosophy and architecture
SEARIS 2014 Keynote - MiddleVR - Philosophy and architecture
 
MHIT 603: Lecture 3 - Prototyping Tools
MHIT 603: Lecture 3 - Prototyping ToolsMHIT 603: Lecture 3 - Prototyping Tools
MHIT 603: Lecture 3 - Prototyping Tools
 
2016 AR Summer School - Lecture4
2016 AR Summer School - Lecture42016 AR Summer School - Lecture4
2016 AR Summer School - Lecture4
 
Impact of Adobe Edge Tools and Services in Higher Education
Impact of Adobe Edge Tools and Services in Higher EducationImpact of Adobe Edge Tools and Services in Higher Education
Impact of Adobe Edge Tools and Services in Higher Education
 
Intro to User Centered Design Workshop
Intro to User Centered Design WorkshopIntro to User Centered Design Workshop
Intro to User Centered Design Workshop
 
Adapting Expectations to Fit a Mobile Workflow
Adapting Expectations to Fit a Mobile WorkflowAdapting Expectations to Fit a Mobile Workflow
Adapting Expectations to Fit a Mobile Workflow
 
The Glass Class at AWE 2015
The Glass Class at AWE 2015The Glass Class at AWE 2015
The Glass Class at AWE 2015
 
How ABB shapes the future of industry with Microsoft HoloLens and Unity - Uni...
How ABB shapes the future of industry with Microsoft HoloLens and Unity - Uni...How ABB shapes the future of industry with Microsoft HoloLens and Unity - Uni...
How ABB shapes the future of industry with Microsoft HoloLens and Unity - Uni...
 
The Wikitude SDK and the Wikitude Studio
The Wikitude SDK and the Wikitude StudioThe Wikitude SDK and the Wikitude Studio
The Wikitude SDK and the Wikitude Studio
 
When the Developer Must Design
When the Developer Must DesignWhen the Developer Must Design
When the Developer Must Design
 
Prototyping is the panacea
Prototyping is the panaceaPrototyping is the panacea
Prototyping is the panacea
 
Beginners guide to creating mobile apps
Beginners guide to creating mobile appsBeginners guide to creating mobile apps
Beginners guide to creating mobile apps
 
COMP 4026 Lecture3 Prototyping and Evaluation
COMP 4026 Lecture3 Prototyping and EvaluationCOMP 4026 Lecture3 Prototyping and Evaluation
COMP 4026 Lecture3 Prototyping and Evaluation
 
Mobile Design at Gilt
Mobile Design at GiltMobile Design at Gilt
Mobile Design at Gilt
 
[I3 d]04 interactivity
[I3 d]04 interactivity[I3 d]04 interactivity
[I3 d]04 interactivity
 

More from Mark Billinghurst

Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsMark Billinghurst
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024Mark Billinghurst
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented RealityMark Billinghurst
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesMark Billinghurst
 
Empathic Computing: Delivering the Potential of the Metaverse
Empathic Computing: Delivering  the Potential of the MetaverseEmpathic Computing: Delivering  the Potential of the Metaverse
Empathic Computing: Delivering the Potential of the MetaverseMark Billinghurst
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsMark Billinghurst
 
Research Directions in Transitional Interfaces
Research Directions in Transitional InterfacesResearch Directions in Transitional Interfaces
Research Directions in Transitional InterfacesMark Billinghurst
 
Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality Mark Billinghurst
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARMark Billinghurst
 

More from Mark Billinghurst (10)

Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented Reality
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR Experiences
 
Empathic Computing: Delivering the Potential of the Metaverse
Empathic Computing: Delivering  the Potential of the MetaverseEmpathic Computing: Delivering  the Potential of the Metaverse
Empathic Computing: Delivering the Potential of the Metaverse
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive Analytics
 
Metaverse Learning
Metaverse LearningMetaverse Learning
Metaverse Learning
 
Research Directions in Transitional Interfaces
Research Directions in Transitional InterfacesResearch Directions in Transitional Interfaces
Research Directions in Transitional Interfaces
 
Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise AR
 

Recently uploaded

Pigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions
 
Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Enterprise Knowledge
 
Install Stable Diffusion in windows machine
Install Stable Diffusion in windows machineInstall Stable Diffusion in windows machine
Install Stable Diffusion in windows machinePadma Pradeep
 
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks..."LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...Fwdays
 
Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationSlibray Presentation
 
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticscarlostorres15106
 
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)Wonjun Hwang
 
Artificial intelligence in cctv survelliance.pptx
Artificial intelligence in cctv survelliance.pptxArtificial intelligence in cctv survelliance.pptx
Artificial intelligence in cctv survelliance.pptxhariprasad279825
 
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 3652toLead Limited
 
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek SchlawackFwdays
 
SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024Scott Keck-Warren
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):comworks
 
WordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your BrandWordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your Brandgvaughan
 
My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024The Digital Insurer
 
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)Mark Simos
 
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationSafe Software
 
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024BookNet Canada
 
SAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxSAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxNavinnSomaal
 
Commit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyCommit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyAlfredo García Lavilla
 

Recently uploaded (20)

Pigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food Manufacturing
 
Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024
 
Install Stable Diffusion in windows machine
Install Stable Diffusion in windows machineInstall Stable Diffusion in windows machine
Install Stable Diffusion in windows machine
 
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks..."LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
 
Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck Presentation
 
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
 
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
 
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptxE-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
 
Artificial intelligence in cctv survelliance.pptx
Artificial intelligence in cctv survelliance.pptxArtificial intelligence in cctv survelliance.pptx
Artificial intelligence in cctv survelliance.pptx
 
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
 
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
 
SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):
 
WordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your BrandWordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your Brand
 
My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024
 
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
 
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
 
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
 
SAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxSAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptx
 
Commit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyCommit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easy
 

2022 COMP4010 Lecture 6: Designing AR Systems

  • 1. DESIGNING AR SYSTEMS COMP 4010 Lecture Six Mark Billinghurst September 1st 2022 mark.billinghurst@unisa.edu.au
  • 3. Typical Interaction Design Cycle Develop alternative prototypes/concepts and compare them, And iterate, iterate, iterate....
  • 4. Tom Chi’s Prototyping Rules 1. Find the quickest path to experience 2. Doing is the best kind of thinking 3. Use materials that move at the speed of thought to maximize your rate of learning
  • 5. From Idea to Product Define Requirements CityViewAR 1 2 3 4 5 6 1 Sketch Interface 2 Rough Wireframes 3 Interactive Prototype 4 High Fidelity Prototype 5 Developer Coding 6 User Testing 7 Deploy App 8 Develop Iterate
  • 6. XR Prototyping Techniques Lo- Fi Hi- Fi Easy Hard Digital Authoring Immersive Authoring Web-Based Development* Cross-Platform Development* Native Development* * requires scripting and 3D programming skills Sketching Paper Prototyping Video Prototyping Wireframing Bodystorming Wizard of Oz
  • 7. XR Prototyping Tools Low Fidelity (Concept, visual design) • Sketching • Photoshop • PowerPoint • Video High Fidelity (Interaction, experience design) • Interactive sketching • Desktop & on-device authoring • Immersive authoring & visual scripting • XR development toolkits
  • 8. From Sketch to Prototype Storyboard Wireframe Mock-up Sketch Interactive Prototype
  • 9. Buxton’s Key Attributes of Sketching • Quick • Work at speed of thought • Timely • Always available • Disposable • Inexpensive, little investment • Plentiful • Easy to iterate • A catalyst • Evokes conversations
  • 11. Storyboarding - Describing the Experience http://monamishra.com/projects/Argo .html
  • 12. Key Elements 1. Scenario: Storyboards are based on a scenario or a user story. The persona or role that corresponds to that scenario is clearly specified 2. Visuals: Each step in the scenario is represented visually in a sequence. The steps can be sketches, illustrations, or photos. 3. Captions: Each visual has a corresponding caption. The caption describes the user’s actions, environment, emotional state, device, etc.
  • 13. Wireframes It’s about - Functional specs - Navigation and interaction - Functionality and layout - How interface elements work together - Defining the interaction flow/experience Leaving room for the design to be created
  • 15. Mockup It’s about - Look and feel - Building on wireframe - High fidelity visuals - Putting together final assets - Getting feedback on design
  • 16. Designing AR in VR https://www.youtube.com/watch?v=TfQJhSJQiaU
  • 17. Vuforia Studio • Author animated AR experiences • Drag and drop content • Add animations • Import CAD models • Combine with IOT sensors • https://www.ptc.com/en/products/vuforia/vuforia-studio
  • 18. Mock-up Guidelines 1. Generate final 2D/3D interface elements 2. Replace wireframe UI elements with high quality visuals 3. Use standard AR/VR UI elements 4. Simulate AR/VR views 5. Focus on visual/audio design 6. Collect feedback from target end-users
  • 19. Sketch vs. Wireframe vs. Mock-up Low Fidelity Low to Medium Fidelity Medium to High Fidelity IDEATE FLOW VISUALIZE
  • 21. ShapesXR - www.shapesxr.com/ Immersive VR tool for: ● Design ● Prototyping ● Communication Key Features ● Assets Library and Importing ● Storyboarding ● Multi-user collaboration ● Export and Sharing
  • 23. XR Prototyping Tools Low Fidelity (Concept, visual design) • Sketching • Photoshop • PowerPoint • Video High Fidelity (Interaction, experience design) • Interactive sketching • Desktop & on-device authoring • Immersive authoring & visual scripting • XR development toolkits
  • 24. Digital Authoring Tools for AR (e.g. Vuforia Studio, Lens Studio) • Support visual authoring of marker-based and/or marker-less AR apps • Provide default markers and support for custom markers • Typically enable AR previews through an emulator, but need to deploy to an AR device for testing
  • 25. Immersive Authoring Tools for AR • Enable visual authoring of 3D content in AR • Make it possible to edit while previewing AR experience in the environment • Provide basic support for interactive behaviors • Sometimes support export to WebXR Apple Reality Composer Adobe Aero
  • 26. Interactive Sketching •Pop App ● Pop - https://marvelapp.com/pop ● Combining sketching and interactivity on mobiles ● Take pictures of sketches, link pictures together
  • 27. Proto.io • Web based prototyping tool • Visual drag and drop interface • Rich transitions • Scroll, swipe, buttons, etc • Deploy on device • mobile, PC, browser • Ideal for mobile interfaces • iOS, Android template • For low and high fidelity prototypes
  • 28. AR Visual Programming • Rapid prototype on desktop • Deliver on mobile • Simple interactivity • Examples • Zapworks Studio • https://zap.works/studio/ • Snap Lens Studio • https://lensstudio.snapchat.com/ • Facebook Spark AR Studio • https://sparkar.facebook.com/ar-studio/
  • 29. Creating On Device •Adobe Aero •Create AR on mobile devices •Touch based interaction and authoring •Only iOS support for now •https://www.adobe.com/nz/products/aero.html
  • 30. Apple Reality Composer • Rapidly create 3D scenes and AR experiences • Creation on device (iPhone, iPad) • Drag and drop interface • Loading 2D/3D content • Adding simple interactivity • Anchor content in real world (AR view) • Planes (vertical, horizontal), faces, images
  • 31. Development & Testing • WebXR: THREE.js, AR.js, A-Frame, ... • Game engines: Unity / Unreal, SteamVR, AR Foundation, MRTK, ... • Native SDKs: Oculus, ARKit / ARCore, VIVE, ...
  • 32. A-Frame • Based on Three.js and WebGL • New HTML tags for 3D scenes • A-Frame Inspector (not an editor) • Asset management (img, video, audio, & 3D models) • ECS architecture with many open-source components • Cross-platform XR
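To make the ECS idea concrete, here is a minimal sketch of an A-Frame scene with one custom component, written in TypeScript. The component name `slow-spin` and the markup are illustrative only, not part of the lecture material.

```typescript
// Minimal A-Frame sketch: a custom "slow-spin" component plus a scene built
// from A-Frame's HTML tags. In ECS style, behaviour lives in small components
// that are attached to entities via attributes.
import 'aframe';                                   // registers <a-scene> etc. and the AFRAME global

const AFRAME = (window as any).AFRAME;             // loose typing to keep the sketch short

AFRAME.registerComponent('slow-spin', {
  schema: { speed: { type: 'number', default: 10 } },          // degrees per second
  tick(this: any, _time: number, deltaMs: number) {
    this.el.object3D.rotation.y += (this.data.speed * (Math.PI / 180) * deltaMs) / 1000;
  },
});

// The scene itself is plain markup; in a real page this would live in index.html.
document.body.innerHTML += `
  <a-scene>
    <a-box slow-spin="speed: 20" position="0 1.5 -3" color="#4CC3D9"></a-box>
    <a-sky color="#ECECEC"></a-sky>
  </a-scene>`;
```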
  • 33. Unity • Started out as game engine • Has integrated support for many types of XR apps • Powerful scene editor • Asset management & store • Basically all XR device vendors provide Unity SDKs
  • 34. Unity vs. A-Frame Unity is a game engine and XR dev platform ● De facto standard for XR apps ● Increasingly built-in support ● Most “XR people” will ask you about your Unity skills :-) Support for all XR devices ● Basically all AR and VR device vendors provide Unity SDKs A-Frame is a declarative WebXR framework ● Emerging XR app development framework on top of THREE.js ● Good for novice XR designers with web dev background Support for most XR devices ● Full WebXR support in Firefox, Chrome, & Oculus Browser
  • 37. Design in Interaction Design Key Design and Prototyping Steps
  • 38. Good vs. Bad AR Design
  • 40. AR Design Considerations • 1. Design for Humans • Use the Human Information Processing model • 2. Design for Different User Groups • Different users may have unique needs • 3. Design for the Whole User • Social, cultural, emotional, physical, cognitive • 4. Use UI Best Practices • Adapt known UI guidelines to AR/VR • 5. Use Interface Metaphors/Affordances • Decide the best metaphor for the AR/VR application
  • 41. 1. Design for Human Information Processing • High level staged model from Wickens and Carswell (1997) • Relates perception, cognition, and physical ergonomics Perception Cognition Ergonomics
  • 42. Design for Perception • Need to understand perception to design AR • Visual perception • Many types of visual cues (stereo, oculomotor, etc.) • Auditory system • Binaural cues, vestibular cues • Somatosensory • Haptic, tactile, kinesthetic, proprioceptive cues • Chemical Sensing System • Taste and smell
  • 43. Depth Perception Problems • Without proper depth cues AR interfaces look unreal
  • 44. Which of these POI are near or far?
  • 47. Cutaway Example • Providing depth perception cues for AR https://www.youtube.com/watch?v=2mXRO48w_E4
  • 48. Design for Cognition • Design for working and long-term memory • Working memory • Short-term storage, limited capacity (~5-9 items) • Long-term memory • Memory recall triggered by associative cues • Situational Awareness • Model of the current state of the user's environment • Used for wayfinding, object interaction, spatial awareness, etc. • Provide cognitive cues to help with situational awareness • Landmarks, procedural cues, map knowledge • Support both ego-centric and exo-centric views
  • 49. Micro-Interactions ▪ Using mobile phones, people split their attention between the display and the real world
  • 50. Time Looking at Screen Oulasvirta, A. (2005). The fragmentation of attention in mobile interaction, and what to do with it. interactions, 12(6), 16-18.
  • 51. Dividing Attention to World • Number of times looking away from mobile screen
  • 52. Design for Micro Interactions ▪ Design interaction for less than a few seconds • Tiny bursts of interaction • One task per interaction • One input per interaction ▪ Benefits • Use limited input • Minimize interruptions • Reduce attention fragmentation
  • 53. NHTSA Guidelines - www.nhtsa.gov For technology in cars: • Any task performed by a driver should be interruptible at any time. • The driver should control the pace of task interactions. • Tasks should be completed with individual glances away from the road of less than 2 seconds. • Cumulative time glancing away from the road should not exceed 12 seconds.
  • 54. Make it Glanceable • Seek to rigorously reduce information density. Successful designs afford recognition, not reading.
  • 55. Reduce Information Chunks • You are designing for recognition, not reading. Reducing the total number of information chunks greatly increases the glanceability of your design. • Slide example: a layout with 3 chunks takes roughly 4 eye movements (~920 ms at ~230 ms per movement), while a layout with 5-6 chunks takes roughly 8 eye movements (~1,840 ms).
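To make the arithmetic concrete, here is a small sketch that estimates total glance time from the number of information chunks. The ~230 ms per eye movement comes from the slide's example; the average number of eye movements per chunk is an assumed value for illustration.

```typescript
// Rough glanceability estimate: total fixation time grows with the number of
// information chunks. 230 ms per eye movement is taken from the slide example;
// movementsPerChunk is an illustrative assumption, not an empirical constant.
const MS_PER_EYE_MOVEMENT = 230;

function estimateGlanceTimeMs(chunks: number, movementsPerChunk = 1.4): number {
  return Math.round(chunks * movementsPerChunk * MS_PER_EYE_MOVEMENT);
}

// Mirrors the slide's comparison: ~3 chunks stay near the 1 s mark,
// 5-6 chunks push the design towards ~2 s of looking away from the world.
console.log(estimateGlanceTimeMs(3));  // ≈ 966 ms
console.log(estimateGlanceTimeMs(6));  // ≈ 1932 ms
```

Keeping the estimate under the ~2-second single-glance budget from the NHTSA guidelines above is a reasonable sanity check for a glanceable AR layout.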
  • 56. Ego-centric and Exo-centric views • Combining ego-centric and exo-centric cue for better situational awareness
  • 57. Cognitive Issues in Mobile AR • Information Presentation • Amount, Representation, Placement, View combination • Physical Interaction • Navigation, Direct manipulation, Content creation • Shared Experience • Social context, Bodily Configuration, Artifact manipulation, Display space Li, N., & Duh, H. B. L. (2013). Cognitive issues in mobile augmented reality: an embodied perspective. In Human factors in augmented reality environments (pp. 109-135). Springer, New York, NY.
  • 58. Information Presentation • Consider • The amount of information • Clutter, complexity • The representation of information • Navigation cues, POI representation • The placement of information • Head, body, world stabilized • Using view combinations • Multiple views
  • 59. Example: Twitter 360 • iPhone application • See geo-located tweets in real world • Twitter.com supports geo tagging
  • 60. But: Information Clutter from Many Tweets
  • 64. Zooming Views • Show POI outside the FOV • Zooms between map and panorama views
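A common way to show POI outside the field of view is an edge indicator that points toward the off-screen target. The sketch below is plain vector math (no particular AR SDK): it computes the signed bearing from the camera's forward direction to a POI so an arrow at the screen edge can be rotated accordingly. The FOV value and coordinates are assumptions for the example.

```typescript
type Vec3 = { x: number; y: number; z: number };

// Signed bearing (degrees) from the camera's horizontal forward direction to a
// POI: positive = to the right, negative = to the left. A bearing larger than
// about half the horizontal FOV means the POI is off screen, so an edge arrow
// rotated by this angle is shown instead of the POI label itself.
function bearingToPoiDeg(cameraPos: Vec3, cameraForward: Vec3, poi: Vec3): number {
  const toPoi = { x: poi.x - cameraPos.x, z: poi.z - cameraPos.z };   // horizontal plane only
  const angleForward = Math.atan2(cameraForward.x, cameraForward.z);
  const angleToPoi = Math.atan2(toPoi.x, toPoi.z);
  let delta = (angleForward - angleToPoi) * (180 / Math.PI);
  if (delta > 180) delta -= 360;                                      // wrap into (-180, 180]
  if (delta <= -180) delta += 360;
  return delta;
}

const horizontalFovDeg = 60;                                          // assumed camera FOV
const bearing = bearingToPoiDeg({ x: 0, y: 0, z: 0 }, { x: 0, y: 0, z: -1 }, { x: 5, y: 0, z: -2 });
console.log(Math.abs(bearing) > horizontalFovDeg / 2 ? 'off screen: show edge arrow' : 'on screen: show label');
```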
  • 66. Design for Physical Ergonomics • Design for the human motion range • Consider human comfort and natural posture • Design for hand input • Coarse and fine scale motions, gripping and grasping • Avoid “Gorilla arm syndrome” from holding arm pose
  • 67. Gorilla Arm in AR • Design interface to reduce mid-air gestures
  • 68. XRgonomics • Uses a physiological model to calculate ergonomic interaction cost • The difficulty of reaching points around the user • Customizable for different users • Programmable API, HoloLens demonstrator • GitHub repository: https://github.com/joaobelo92/xrgonomics • Evangelista Belo, J. M., Feit, A. M., Feuchtner, T., & Grønbæk, K. (2021, May). XRgonomics: Facilitating the Creation of Ergonomic 3D Interfaces. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-11).
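For illustration only (this is not the XRgonomics API), here is a toy cost function in the same spirit: reaching a point is assumed to cost more the further the arm is extended and the higher it is raised above the shoulder. All weights and poses below are made up.

```typescript
// Purely illustrative ergonomic-cost sketch (NOT the XRgonomics API):
// cost grows with how far the arm is extended and how high it is raised
// relative to the shoulder. Weights are arbitrary example values.
type Point3 = { x: number; y: number; z: number };

function reachCost(shoulder: Point3, target: Point3, armLengthM = 0.7): number {
  const dx = target.x - shoulder.x;
  const dy = target.y - shoulder.y;
  const dz = target.z - shoulder.z;
  const dist = Math.hypot(dx, dy, dz);
  const extension = Math.min(dist / armLengthM, 1);        // 0 = at the shoulder, 1 = full reach
  const elevation = Math.max(dy, 0) / armLengthM;          // raising the arm costs extra
  return 0.6 * extension + 0.4 * Math.min(elevation, 1);   // arbitrary illustrative weights
}

// A UI panel slightly below and in front of the shoulder scores better than
// one at eye level and at arm's length.
const shoulder = { x: 0, y: 1.4, z: 0 };
console.log(reachCost(shoulder, { x: 0, y: 1.2, z: -0.4 }).toFixed(2)); // lower cost
console.log(reachCost(shoulder, { x: 0, y: 1.7, z: -0.7 }).toFixed(2)); // higher cost
```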
  • 70. 2. Designing for Different User Groups • Design for Different Ages • Children require different interface designs than adults • Older users have different needs than younger users • Prior Experience with AR Systems • Familiarity with HMDs, AR input devices • People with Different Physical Characteristics • Height and arm reach, handedness • Perceptual, Cognitive and Motor Abilities • Colour perception varies between people • Spatial ability, cognitive or motor disabilities
  • 71. Designing for Children • HMDs • Interpupillary distance, head fit, size and weight • Tablets • Poor dexterity, need to hold a large tablet • Content • Reading ability, spatial perception
  • 72. 3. Design for the Whole User
  • 73. Consider Your User • Consider context of user • Physical, social, emotional, cognitive, etc. • Mobile Phone AR User • Probably Mobile • One hand interaction • Short application use • Need to be able to multitask • Use in outdoor or indoor environment • Want to enhance interaction with real world
  • 74. Would you wear this HMD?
  • 75. Whole User Needs • Social • Don’t make your user look stupid • Cultural • Follow local cultural norms • Physical • Can the user physically use the interface? • Cognitive • Can the user understand how the interface works? • Emotional • Make the user feel good and in control
  • 76. Example: Social Acceptance • People don’t want to look silly • Only 12% of 4,600 adults would be willing to wear AR glasses • 20% of mobile AR browser users experience social issues • Acceptance more due to Social than Technical issues • Needs further study (ethnographic, field tests, longitudinal)
  • 80. 4. Use UI Best Practices • General UI design principles can be applied to AR • E.g. Shneiderman’s UI guidelines from 1998 • Providing interface feedback • Mixture of reactive, instrumental and operational feedback • Maintain spatial and temporal correspondence • Use constraints • Specify relations between variables that must be satisfied • E.g. physical constraints reduce freedom of movement • Support Two-Handed control • Use Guiard’s framework of bimanual manipulation • Dominant vs. non-dominant hands
  • 81. Follow Good HCI Principles • Provide a good conceptual model/metaphor • Customers want to understand how the UI works • Make things visible • If an object has a function, the interface should show it • Map interface controls to the customer's model • Infix vs. postfix calculator: whose model? • Provide feedback • What you see is what you get!
  • 82. Example: Guiard's model of bimanual manipulation • Non-dominant hand: leads, sets the spatial reference frame, performs coarse motions • Dominant hand: follows, works within that reference frame, performs fine motions • Guiard, Y. (1987). Asymmetric division of labor in human skilled bimanual action: The kinematic chain as a model. Journal of Motor Behavior, 19, 486-517.
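A minimal sketch of how Guiard's asymmetry can be applied in an AR scene graph: the non-dominant hand defines a reference frame, and the dominant hand's input is interpreted relative to it. Written here with three.js (which the WebXR stack above already uses); all tracked poses are assumed values.

```typescript
// Guiard-style bimanual input sketch: the non-dominant hand sets a coarse
// reference frame; the dominant hand does fine positioning *within* it.
import * as THREE from 'three';

const scene = new THREE.Scene();

// Non-dominant hand: coarse placement of the working reference frame
// (e.g. the hand holding a tracked tile or clipboard). Pose values are assumed.
const referenceFrame = new THREE.Object3D();
referenceFrame.position.set(0.3, 1.1, -0.4);
referenceFrame.rotation.y = Math.PI / 6;
scene.add(referenceFrame);
scene.updateMatrixWorld(true);                 // make world matrices valid before converting points

// Dominant hand: fine positioning, interpreted relative to that frame.
const stylusTipWorld = new THREE.Vector3(0.35, 1.15, -0.45);      // tracked tip pose (assumed)
const tipInFrame = referenceFrame.worldToLocal(stylusTipWorld.clone());

// Content authored in the frame follows the non-dominant hand afterwards,
// because only the parent transform changes.
const annotation = new THREE.Mesh(
  new THREE.SphereGeometry(0.01),
  new THREE.MeshBasicMaterial({ color: 0xff5533 })
);
annotation.position.copy(tipInFrame);
referenceFrame.add(annotation);
```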
  • 83. Adapting Existing Guidelines • Mobile Phone AR • Phone HCI Guidelines • Mobile HCI Guidelines • HMD Based AR • 3D User Interface Guidelines • VR Interface Guidelines • Desktop AR • Desktop UI Guidelines
  • 84. Example: Apple iOS Interface Guidelines • Make it obvious how to use your content. • Avoid clutter, unused blank space, and busy backgrounds. • Minimize required user input. • Express essential information succinctly. • Provide a fingertip-sized target for all controls. • Avoid unnecessary interactivity. • Provide feedback when necessary From: https://developer.apple.com/ios/human-interface-guidelines/
  • 85. Applying Principles to Mobile AR • Clean • Large Video View • Large Icons • Text Overlay • Feedback
  • 86. 5. Use Interface Metaphors • Interface components: • Physical components (input) • Display elements – visual/audio (output) • Interaction metaphors linking input and output
  • 87. AR Interfaces – a spectrum of increasing expressiveness and intuitiveness: • Browsing: simple input, viewpoint control • 3D AR: 3D UI, dedicated controllers, custom devices • Tangible UI: augmented surfaces, object interaction, familiar controllers, indirect interaction • Tangible AR: tangible input, AR overlay, direct interaction • Natural AR: freehand gesture, speech, gaze
  • 88. AR Interfaces (same spectrum as above) – Design for Layers
  • 89. Information Layers • Head-stabilized • Heads-up display • Body-stabilized • E.g., virtual tool-belt • World-stabilized • E.g., billboard or signpost
  • 90. Head stabilized • Information attached to view – always visible
  • 91. Body Stabilized • Information moves with person
  • 92. Body Stabilized Interface • Elements you want always available, but not always visible
  • 94. World Stabilized • Elements you want fixed relative to real-world objects
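In scene-graph terms the three layers differ only in what the content is parented to. A three.js sketch, assuming a tracked camera and a hypothetical body anchor that follows the user's position but not their head rotation:

```typescript
// Information layers as scene-graph parenting (three.js sketch).
import * as THREE from 'three';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(60, 16 / 9, 0.01, 100);
scene.add(camera);

// Head-stabilized: parented to the camera, so it stays in view like a HUD.
const hud = new THREE.Sprite(new THREE.SpriteMaterial({ color: 0xffffff }));
hud.position.set(0, -0.15, -0.5);                // just below the line of sight
camera.add(hud);

// Body-stabilized: parented to an anchor that follows the user's position
// but not their head rotation (e.g. a virtual tool belt at waist height).
const bodyAnchor = new THREE.Object3D();
scene.add(bodyAnchor);
const toolBelt = new THREE.Object3D();
toolBelt.position.set(0, -0.6, -0.3);
bodyAnchor.add(toolBelt);

// World-stabilized: parented to the scene at a fixed real-world pose
// (e.g. a signpost registered to a tracked plane or image target).
const signpost = new THREE.Object3D();
signpost.position.set(2.0, 0.0, -5.0);
scene.add(signpost);

// Call from the render loop: only the body anchor needs explicit updating.
function updateBodyAnchor(): void {
  bodyAnchor.position.copy(camera.position);     // follow position, keep rotation identity
}
```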
  • 96. Example: Fragments • UI Elements embedded in real world • Real world occlusion
  • 98. Design to Device Constraints • Understand the platform and design for limitations • Hardware, software platforms • E.g. Handheld AR game with visual tracking • Use large screen icons • Consider screen reflectivity • Support one-hand interaction • Consider the natural viewing angle • Do not tire users out physically • Do not encourage fast actions • Keep at least one tracking surface in view Art of Defense Game
  • 99. Handheld AR Constraints/Affordances • Camera and screen are linked • Fast motions a problem when looking at screen • Intuitive “navigation” • Phone in hand • Two handed activities: awkward or intuitive • Extended periods of holding phone tiring • Awareness of surrounding environment • Small screen • Extended periods of looking at screen tiring • In general, small awkward platform • Vibration, sound • Can provide feedback when looking elsewhere
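For the last point, the Web Vibration API is one cheap way to give feedback while the user's eyes are on the world rather than the screen. A minimal sketch; the tracking-lost trigger is a hypothetical hook, not tied to any particular SDK.

```typescript
// Give haptic feedback when something happens while the user may be looking
// at the world rather than the screen. navigator.vibrate() is a standard Web
// API and is simply ignored on platforms that do not support it.
function notifyWhileLookingAway(pattern: number | number[] = [80, 40, 80]): void {
  if ('vibrate' in navigator) {
    navigator.vibrate(pattern);
  }
}

// Hypothetical trigger: the app detects that the tracked surface left the camera view.
function onTrackingLost(): void {
  notifyWhileLookingAway();    // short double buzz: "look back at the screen"
}
```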
  • 100. Common Mobile AR Metaphors • Tangible AR Lens Viewing • Look through screen into AR scene • Interact with screen to interact with AR content • Touch screen input • E.g. Invisible Train • Metaphor – holding a window into the AR world
  • 102. Common Mobile AR Metaphors • Tangible AR Lens Manipulation • Select AR object and attach to device • Physically move phone to move AR object • Use motion of device as input • E.g. AR Lego • Metaphor – the phone as a physical handle for the virtual object
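The "physical handle" metaphor maps neatly onto re-parenting in a scene graph: on selection the object is attached to the camera (which follows the tracked phone pose), and on release it is attached back to the world. A three.js sketch; the selection logic is assumed to exist elsewhere.

```typescript
// Tangible lens manipulation sketch: attach the selected object to the camera
// so moving the phone moves the object; detach to drop it in place.
import * as THREE from 'three';

function pickUp(object: THREE.Object3D, camera: THREE.Camera): void {
  // Object3D.attach() re-parents while keeping the current world transform,
  // so the object does not jump when it is grabbed.
  camera.attach(object);
}

function putDown(object: THREE.Object3D, scene: THREE.Scene): void {
  scene.attach(object);   // back into world space at its current pose
}
```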
  • 104. AR Interfaces (same spectrum as above) – Design for Affordances
  • 105. Tangible AR Metaphor • AR overcomes limitations of TUIs • Enhances display possibilities • Merges task and display space • Provides public and private views • TUI + AR = Tangible AR • Apply TUI methods to AR interface design
  • 106. Tangible AR Design Principles • Tangible AR Interfaces use TUI principles • Physical controllers for moving virtual content • Support for spatial 3D interaction techniques • Time and space multiplexed interaction • Support for multi-handed interaction • Match object affordances to task requirements • Support parallel activity with multiple objects • Allow collaboration between multiple users
  • 107. AR Design Space – a continuum from Reality (physical design) to Virtual Reality (virtual design), with Augmented Reality in between drawing on both
  • 108. Affordances ”… the perceived and actual properties of the thing, primarily those fundamental properties that determine just how the thing could possibly be used. [...] Affordances provide strong clues to the operations of things.” (Norman, The Psychology of Everyday Things 1988, p.9)
  • 111. Affordance Matrix (perceived vs. actual affordance) • Real door: affordance exists and is perceived • Fake door: perceived but does not actually exist • Hidden door: exists but is not perceived • No door: neither perceived nor actual
  • 112. Physical vs. Virtual Affordances • Physical Affordance • Look and feel of real objects • Shape, texture, colour, weight, etc. • Industrial Design • Virtual Affordance • Look of virtual objects • Copy real objects • Interface Design
  • 113. AR design is a mixture of physical and virtual affordances • Physical • Tangible controllers and objects • Virtual • Virtual graphics and audio
  • 114. Affordances in AR • Design AR interface objects to show how they are used • Use visual and physical cues to show possible affordances • Perceived affordances should match actual affordances • Physical and virtual affordances should match Merge Cube Tangible Molecules
  • 115. Case Study 1: 3D AR Lens Goal: Develop a lens-based AR interface • MagicLenses • Developed at Xerox PARC in 1993 • View a region of the workspace differently from the rest • Overlap MagicLenses to create composite effects
  • 116. 3D MagicLenses § MagicLenses extended to 3D (Viega et al., 1996) § Volumetric and flat lenses
  • 117. AR Lens Design Principles • Physical Components • Lens handle • Virtual lens attached to real object • Display Elements • Lens view • Reveal layers in dataset • Interaction Metaphor • Physically holding lens
  • 118. 3D AR Lenses: Model Viewer § Displays models made up of multiple parts § Each part can be shown or hidden through the lens § Allows the user to peer inside the model § Maintains focus + context
  • 121. AR Lens Implementation • Uses the stencil buffer to render different content outside and inside the lens (virtual magnifying glass)
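The same stencil-buffer idea carries over to WebGL-based toolkits. A rough three.js sketch: the lens shape writes a stencil value, the "inside the lens" layer only draws where that value is set, and the "outside" layer only where it is not. The geometry and scene setup are placeholders for illustration.

```typescript
// Stencil-based magic lens sketch (three.js): content layers are masked by
// whether the lens shape has written to the stencil buffer.
import * as THREE from 'three';

// The default framebuffer needs a stencil attachment for this to work.
const renderer = new THREE.WebGLRenderer({ stencil: true });
const scene = new THREE.Scene();

// 1. Lens shape: writes 1 into the stencil buffer, draws nothing visible.
const lensMaterial = new THREE.MeshBasicMaterial({ colorWrite: false, depthWrite: false });
lensMaterial.stencilWrite = true;
lensMaterial.stencilRef = 1;
lensMaterial.stencilFunc = THREE.AlwaysStencilFunc;
lensMaterial.stencilZPass = THREE.ReplaceStencilOp;
const lens = new THREE.Mesh(new THREE.CircleGeometry(0.1, 48), lensMaterial);
lens.renderOrder = 0;                            // draw the mask before the content layers

// 2. "Inside the lens" content: only drawn where the stencil value equals 1.
const insideMaterial = new THREE.MeshStandardMaterial({ color: 0x2266ff });
insideMaterial.stencilWrite = true;
insideMaterial.stencilRef = 1;
insideMaterial.stencilFunc = THREE.EqualStencilFunc;
const hiddenDetail = new THREE.Mesh(new THREE.TorusKnotGeometry(0.05, 0.015), insideMaterial);
hiddenDetail.renderOrder = 1;

// 3. "Outside the lens" content: only drawn where the stencil value is not 1.
const outsideMaterial = new THREE.MeshStandardMaterial({ color: 0xcccccc });
outsideMaterial.stencilWrite = true;
outsideMaterial.stencilRef = 1;
outsideMaterial.stencilFunc = THREE.NotEqualStencilFunc;
const casing = new THREE.Mesh(new THREE.BoxGeometry(0.15, 0.15, 0.15), outsideMaterial);
casing.renderOrder = 1;

scene.add(lens, hiddenDetail, casing);
```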
  • 122. Case Study 2 : LevelHead • Block based game
  • 123. Case Study 2: LevelHead • Physical Components • Real blocks • Display Elements • Virtual person and rooms • Interaction Metaphor • Blocks are rooms
  • 126. Case Study 3: AR Chemistry (Fjeld 2002) • Tangible AR chemistry education
  • 127. Goal: An AR application to teach molecular structure in chemistry •Physical Components • Real book, rotation cube, scoop, tracking markers •Display Elements • AR atoms and molecules • Interaction Metaphor • Build your own molecule
  • 128. AR Chemistry Input Devices
  • 131. Case Study 4: Transitional Interfaces Goal: An AR interface supporting transitions from reality to virtual reality •Physical Components • Real book •Display Elements • AR and VR content • Interaction Metaphor • Book pages hold virtual scenes
  • 132. Milgram's Continuum (1994) • A spectrum from Reality (Tangible Interfaces) through Augmented Reality (AR) and Augmented Virtuality (AV) to Virtuality (Virtual Reality); AR and AV together make up Mixed Reality (MR) • Central Hypothesis: The next generation of interfaces will support transitions along the Reality-Virtuality continuum
  • 133. Transitions • Interfaces of the future will need to support transitions along the RV continuum • Augmented Reality is preferred for: • co-located collaboration • Immersive Virtual Reality is preferred for: • experiencing world immersively (egocentric) • sharing views • remote collaboration
  • 134. The MagicBook •Design Goals: •Allows user to move smoothly between reality and virtual reality •Support collaboration
  • 137. Features • Seamless transition from Reality to Virtuality • Reliance on real decreases as virtual increases • Supports egocentric and exocentric views • User can pick appropriate view • Computer becomes invisible • Consistent interface metaphors • Virtual content seems real • Supports collaboration
  • 138. Collaboration in MagicBook • Collaboration on multiple levels: • Physical Object • AR Object • Immersive Virtual Space • Egocentric + exocentric collaboration • multiple multi-scale users • Independent Views • Privacy, role division, scalability
  • 139. Technology • Reality • No technology • Augmented Reality • Camera – tracking • Switch – fly in • Virtual Reality • Compass – tracking • Press pad – move • Switch – fly out
  • 140. Summary •When designing AR interfaces, think of: • Physical Components • Physical affordances • Virtual Components • Virtual affordances • Interface Metaphors • Tangible AR or similar
  • 141. Mobile AR Game Design
  • 150. Design Guidelines • By vendors: platform driven • By designers: user oriented • By practitioners: experience based • By researchers: empirically derived
  • 151. Design Patterns “Each pattern describes a problem which occurs over and over again in our environment, and then describes the core of the solution to that problem in such a way that you can use this solution a million times over, without ever doing it the same way twice.” – Christopher Alexander et al. Use Design Patterns to Address Reoccurring Problems C.A. Alexander, A Pattern Language, Oxford Univ. Press, New York, 1977.
  • 152. Example UI Design Patterns • http://ui-patterns.com/patterns
  • 154. Design Patterns for Handheld AR • A set of design pre-patterns for handheld AR • Title: a short, memorable phrase. • Definition: what experiences the pre-pattern supports. • Description: how and why the pre-pattern works, and what aspects of game design it is based on. • Examples: illustrate the meaning of the pre-pattern. • Using the pre-patterns: reveal the challenges and context of applying the pre-patterns. • Xu, Y., Barba, E., Radu, I., Gandy, M., Shemaka, R., Schrank, B., ... & Tseng, T. (2011, October). Pre-patterns for designing embodied interactions in handheld augmented reality games. In 2011 IEEE International Symposium on Mixed and Augmented Reality - Arts, Media, and Humanities (pp. 19-28). IEEE.
  • 155. Handheld AR Design Patterns (Title – Meaning – Embodied skills; A&S = awareness and skills)
    • Device Metaphors – Using metaphor to suggest available player actions – Body A&S, naïve physics
    • Control Mapping – Intuitive mapping between physical and digital objects – Body A&S, naïve physics
    • Seamful Design – Making sense of and integrating the technological seams through game design – Body A&S
    • World Consistency – Whether the laws and rules of the physical world hold in the digital world – Naïve physics, environmental A&S
    • Landmarks – Reinforcing the connection between digital and physical space through landmarks – Environmental A&S
    • Personal Presence – How a player is represented in the game decides how much they feel they are living in the digital game world – Environmental A&S, naïve physics
    • Living Creatures – Game characters that respond to physical and social events, mimicking the behaviour of living beings – Social A&S, body A&S
    • Body Constraints – Movement of one player's body position constrains another player's actions – Body A&S, social A&S
    • Hidden Information – Information that can be hidden and revealed can foster emergent social play – Social A&S, body A&S
  • 157. Example: Seamless Design • Design to reduce seams in the user experience • E.g. AR tracking failure, change in interaction mode • Paparazzi Game • Changes between AR tracking and accelerometer input • Xu, Y., et al., Pre-patterns for designing embodied interactions in handheld augmented reality games, Proceedings of the 2011 IEEE International Symposium on Mixed and Augmented Reality - Arts, Media, and Humanities, pp. 19-28, October 26-29, 2011
  • 158. Demo: Paparazzi Game • https://www.youtube.com/watch?v=MIGH5WGMnbs
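One way to handle this particular seam in a web AR prototype is to listen for tracking lost/found events and fall back to device-orientation input until tracking returns. AR.js emits markerFound / markerLost events on the marker entity; the fallback behaviour sketched here is an assumption for illustration, not the Paparazzi implementation.

```typescript
// Seamful-design sketch: switch input modes when AR tracking drops out,
// instead of letting the experience silently break.
let usingDeviceOrientation = false;

function onOrientation(event: DeviceOrientationEvent): void {
  // Hypothetical fallback: steer the camera/game view from the gyro angles.
  console.log('fallback view control', event.alpha, event.beta, event.gamma);
}

const marker = document.querySelector('a-marker');   // AR.js / A-Frame marker entity
marker?.addEventListener('markerLost', () => {
  usingDeviceOrientation = true;
  window.addEventListener('deviceorientation', onOrientation);
  // Tell the player what happened rather than hiding the seam completely.
  console.log('Tracking lost - switching to motion control');
});
marker?.addEventListener('markerFound', () => {
  if (usingDeviceOrientation) {
    window.removeEventListener('deviceorientation', onOrientation);
    usingDeviceOrientation = false;
  }
});
```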
  • 159. Example: Living Creatures • Virtual creatures should respond to real-world events • E.g. player motion, wind, light, etc. • Creates the illusion that creatures are alive in the real world • Sony EyePet • Responds to the player blowing on the creature
  • 160. Google ARCore Interface Guidelines https://developers.google.com/ar/design
  • 161. ARCore Elements App • Mobile AR app demonstrating interface guidelines • Multiple Interface Guidelines • User interface • User environment • Object manipulation • Off-screen markers • Etc.. • Test on Device • https://play.google.com/store/apps/details?id=com.google.ar.unity.ddelements
  • 163. ARKit Interface Guidelines • developer.apple.com/design/human-interface-guidelines/ios/system-capabilities/augmented-reality/
  • 164. Microsoft Mixed Reality Design Guidelines • https://docs.microsoft.com/en-us/windows/mixed-reality/design/design
  • 165. MRTK Interface Examples • Examples of UX Building Blocks • http://aka.ms/MRTK
  • 175. The Trouble with AR Design Guidelines 1) Rapidly evolving best practices Still a moving target, lots to learn about AR design Slowly emerging design patterns, but often change with OS updates Already major differences between device platforms 2) Challenges with scoping guidelines Often too high level, like “keep the user safe and comfortable” Or, too application/device/vendor-specific 3) Best guidelines come from learning by doing Test your designs early and often, learn from your own “mistakes” Mind differences between VR and AR, but less so between devices