Designing Augmented Reality Experiences
Talk on Designing Augmented Reality Experiences given by Mark Billinghurst at the AWE 2013 conference on June 5th 2013

1. Designing Augmented Reality Experiences
   Mark Billinghurst – mark.billinghurst@hitlabnz.org
   The HIT Lab NZ, University of Canterbury
   June 5th 2013

2. How Would You Design This?
   • Put nice AR picture here – and video

3. Or This?

4. DARE 101
   1. Know the Technology
   2. Design for User Experience – all aspects of user experience
   3. Follow Good Interaction Design Principles – discover, design, evaluate
   4. Consider All the Design Elements – physical, virtual and metaphorical
   5. Know Future Research Directions

5. Know the Technology

6. What is Augmented Reality?
   Defining characteristics (Azuma 97):
   • Combines real and virtual images – both can be seen at the same time
   • Interactive in real time – the virtual content can be interacted with
   • Registered in 3D – virtual objects appear fixed in space
   Azuma, R., A Survey of Augmented Reality, Presence, Vol. 6, No. 4, August 1997, pp. 355-385.

7. AR: From Science Fiction to Fact
   • 1977 – Star Wars
   • 2008 – CNN

8. AR Part of the MR Continuum
   Reality–Virtuality (RV) continuum: Real Environment → Augmented Reality (AR) → Augmented Virtuality (AV) → Virtual Environment
   Mixed Reality is "...anywhere between the extrema of the virtuality continuum."
   P. Milgram and A. F. Kishino, Taxonomy of Mixed Reality Visual Displays, IEICE Transactions on Information and Systems, E77-D(12), pp. 1321-1329, 1994.

9. Core Technologies
   • Combining real and virtual images – display technologies
   • Interactive in real time – input and interaction technologies
   • Registered in 3D – viewpoint tracking technologies
   (Display, Processing, Input, Tracking)
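Taken together, these three requirements describe the basic per-frame loop of an AR application. The sketch below is a minimal, hypothetical Python outline of that loop; `camera`, `tracker`, `input_devices`, `renderer` and `scene` are placeholder objects standing in for whatever SDK is used, not a specific API.

    # Minimal AR frame loop (hypothetical API): capture, track, update, render.
    def ar_frame_loop(camera, tracker, input_devices, renderer, scene):
        while True:
            frame = camera.capture()                  # real-world image
            pose = tracker.estimate_pose(frame)       # "registered in 3D": camera pose, or None if tracking is lost
            events = [device.poll() for device in input_devices]
            scene.update(events)                      # "interactive in real time"
            if pose is not None:
                renderer.draw(frame, scene, pose)     # "combines real and virtual images"
            else:
                renderer.draw(frame, None, None)      # tracking lost: show live video only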
10. Display Technologies
    Types (Bimber/Raskar 2003):
    • Head attached – head mounted display/projector
    • Body attached – handheld display/projector
    • Spatial – spatially aligned projector/monitor
    HMD: optical vs. video see-through
    • Optical: direct view of real world → safer, simpler
    • Video: video overlay → more image registration options

11. Display Taxonomy

12. AR Input Technologies
    • Tangible objects – tracked items
    • Touch (HHD) – glove, touch
    • Gesture – glove, free-hand
    • Speech/multimodal
    • Device motion – HHD + sensors

13. Tracking Technologies
    • Active – mechanical, magnetic, ultrasonic; GPS, WiFi, cell location
    • Passive – inertial sensors (compass, accelerometer, gyro); computer vision (marker based, natural feature tracking, model based)
    • Hybrid tracking – combined sensors (e.g. vision + inertial); a simple fusion sketch follows
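Hybrid tracking typically fuses a fast but drifting inertial estimate with slower, absolute vision measurements. As a rough illustration only (a single yaw angle, made-up variable names), a complementary filter is one simple way to combine the two:

    # Complementary filter sketch: gyro integration (fast, drifts) corrected by vision (slow, absolute).
    def fuse_yaw(yaw, gyro_rate, dt, vision_yaw=None, alpha=0.98):
        yaw = yaw + gyro_rate * dt                           # integrate the inertial rate every frame
        if vision_yaw is not None:                           # vision pose available this frame?
            yaw = alpha * yaw + (1.0 - alpha) * vision_yaw   # pull the estimate toward the absolute measurement
        return yaw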
14. Design for User Experience

15. "The product is no longer the basis of value. The experience is."
    – Venkat Ramaswamy, The Future of Competition

16. Gilmore + Pine: The Experience Economy
    Value increases up the ladder: components → products → services → experiences (from function to emotion)

17. Designing AR Experiences
    • Components – tracking, display, input
    • Tools – authoring
    • Applications – interaction
    • Experiences – usability

18. The Value of Good User Experience
    • Kenya: 20c
    • My house: 50c
    • Starbucks: $3.50

19. Good Experience Design – Reactrix
    • Top down projection, camera based input
    • Reactive graphics
    • No instructions, no training

20. Would You Wear This?

21. User Experience is All About You
    • Designing good user experience involves many aspects
    • Consider all the needs of the user, especially the context of use

22. Typical AR Experiences
    • Web based AR – Flash, HTML5 based AR; marketing, education
    • Outdoor mobile AR – GPS, compass tracking; viewing points of interest in the real world
    • Handheld AR – vision based tracking; marketing, gaming
    • Location based experiences – HMD, fixed screens; museums, point of sale, advertising

23. What Makes a Good AR Experience?
    • Compelling – engaging, a 'magic' moment
    • Intuitive, easy to use – uses existing skills
    • Anchored in the physical world – seamless combination of real and digital
24. Demo: colAR
    • Turn colouring book pages into AR scenes
    • Markerless tracking, use your own colours
    • Try it yourself: http://www.colARapp.com/

25. Follow Good Interaction Design Principles

26. Interaction Design – the design of user experience with technology
    "Designing interactive products to support people in their everyday and working lives" – Preece, J. (2002), Interaction Design
    Answering three questions:
    • What do you do? – How do you affect the world?
    • What do you feel? – What do you sense of the world?
    • What do you know? – What do you learn?

27. Interaction Design Process
28. AR UI Design
    • Consider your user
    • Follow good HCI principles; adapt HCI guidelines for AR
    • Design to device constraints
    • Use design patterns to inform design
    • Design for your interface metaphor
    • Design for evaluation

29. Consider Your User
    • Consider the context of the user – physical, social, emotional, cognitive, etc.
    • Mobile phone AR user:
      - Probably mobile, one-hand interaction
      - Short application use, needs to be able to multitask
      - Used in outdoor or indoor environments
      - Wants to enhance interaction with the real world

30. AR vs. Non-AR Design
    Design guidelines:
    • Design for 3D graphics + interaction
    • Consider elements of the physical world
    • Support implicit interaction

    Characteristics      Non-AR Interfaces        AR Interfaces
    Object graphics      Mainly 2D                Mainly 3D
    Object types         Mainly virtual objects   Both virtual and physical objects
    Object behaviours    Mainly passive objects   Both passive and active objects
    Communication        Mainly simple            Mainly complex
    HCI methods          Mainly explicit          Both explicit and implicit
31. Maps vs. Junaio
    • Google Maps – 2D, mouse driven, text/image heavy, exocentric
    • Junaio – 3D, location driven, simple graphics, egocentric

32. Design to Device Constraints
    • Understand the platform and design for its limitations (hardware, software platforms)
    • E.g. handheld AR game with visual tracking (Art of Defense game):
      - Use large screen icons; consider screen reflectivity
      - Support one-hand interaction; consider the natural viewing angle
      - Do not tire users out physically; do not encourage fast actions
      - Keep at least one tracking surface in view

33. Design Patterns
    "Each pattern describes a problem which occurs over and over again in our environment, and then describes the core of the solution to that problem in such a way that you can use this solution a million times over, without ever doing it the same way twice." – Christopher Alexander et al.
    Use design patterns to address recurring problems.
    C. A. Alexander, A Pattern Language, Oxford Univ. Press, New York, 1977.

34. Handheld AR Patterns (Title – Meaning – Embodied skills; A&S = awareness and skills)
    • Device Metaphors – using metaphor to suggest available player actions – body A&S, naïve physics
    • Control Mapping – intuitive mapping between physical and digital objects – body A&S, naïve physics
    • Seamful Design – making sense of and integrating the technological seams through game design – body A&S
    • World Consistency – whether the laws and rules of the physical world hold in the digital world – naïve physics, environmental A&S
    • Landmarks – reinforcing the connection between digital and physical space through landmarks – environmental A&S
    • Personal Presence – the way a player is represented in the game decides how much they feel like living in the digital game world – environmental A&S, naïve physics
    • Living Creatures – game characters that respond to physical and social events, mimicking the behaviours of living beings – social A&S, body A&S
    • Body Constraints – movement of one player's body position constrains another player's action – body A&S, social A&S
    • Hidden Information – information that can be hidden and revealed can foster emergent social play – social A&S, body A&S

35. Example: Seamless Design
    • Design to reduce seams in the user experience – e.g. AR tracking failure, change in interaction mode
    • Paparazzi game: switches between AR tracking and accelerometer input (see the sketch below)
    Yan Xu, et al., Pre-patterns for designing embodied interactions in handheld augmented reality games, Proceedings of the 2011 IEEE International Symposium on Mixed and Augmented Reality – Arts, Media, and Humanities, pp. 19-28, October 26-29, 2011.
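One way to realise this pattern in code is an explicit fallback between tracking modes, so the experience degrades gracefully rather than freezing when the marker is lost. This is a generic sketch, not the Paparazzi implementation; `vision_pose` and `accelerometer_pose` are assumed callables that return a pose estimate or None when unavailable.

    # Seamful design sketch: fall back from vision tracking to accelerometer-based input.
    def select_pose(vision_pose, accelerometer_pose, lost_frames, max_lost=10):
        pose = vision_pose()
        if pose is not None:
            return pose, 0                            # vision tracking OK: reset the dropout counter
        lost_frames += 1
        if lost_frames > max_lost:
            return accelerometer_pose(), lost_frames  # sustained loss: switch interaction mode explicitly
        return None, lost_frames                      # brief dropout: caller keeps the last good pose

The key design choice is making the mode switch visible and deliberate (the counter) instead of flickering between modes on every dropped frame.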
36. Example: Living Creatures
    • Virtual creatures respond to real world events – e.g. player motion, wind, light, etc.
    • Creates the illusion that the creatures are alive in the real world
    • Sony EyePet – responds to the player blowing on the creature

37. Rapid Hardware Prototyping
    • Speed up development time by using quick hardware mockups
    • Handheld connected to PC, LCD screen, USB phone keypad, camera
    • Can use PC tools for rapid application development – Flash, Visual Basic, etc.

38. Build Your Own Google Glass
    • Rapid prototype of a Glass-like HMD
    • Myvu HMD + headphone + iOS device + basic glue skills
    • $300 + less than 3 hours of construction
    • http://www.instructables.com/id/DIY-Google-Glasses-AKA-the-Beady-i/

39. Why Evaluate AR Applications?
    • To test and compare interfaces, new technologies, interaction techniques
    • To validate the efficiency and effectiveness of the AR interface and system
    • To test usability (learnability, efficiency, satisfaction, ...)
    • To get user feedback and better understand your end users
    • To refine the interface design
    • ...
40. HIT Lab NZ Usability Survey
    • A Survey of Evaluation Techniques Used in Augmented Reality Studies – Andreas Dünser, Raphaël Grasset, Mark Billinghurst
    • Reviewed publications from 1993 to 2007
    • Extracted 6071 papers which mentioned "Augmented Reality"
    • Searched to find 165 AR papers with user studies

41. Types of Experiments and Topics
    • Sensation, perception & cognition
      - How is virtual content perceived?
      - What perceptual cues are most important?
      - How to visualize augmented/overlaid information on the real environment?
      - Visual search/attention/salience issues of human performance
    • Interaction
      - How can users interact with virtual content?
      - Which interaction techniques are most efficient in a certain context?
    • Collaboration & social issues
      - How is collaboration in an AR interface different?
      - Which collaborative cues can be conveyed best?
      - Privacy and security issues of AR interfaces

42. Gabbard Model for AR Design
    1. User task analysis
    2. Expert guidelines-based evaluation
    3. Formative user-centered evaluation
    4. Summative comparative evaluations
    Gabbard, J. L.; Swan, J. E., "Usability Engineering for Augmented Reality: Employing User-Based Studies to Inform Design," IEEE Transactions on Visualization and Computer Graphics, vol. 14, no. 3, pp. 513-525, May-June 2008.

43. Gabbard Model in Context

44. Consider All Design Elements

45. AR Design Elements
    Interface components:
    • Physical components
    • Display elements – visual/audio
    • Interaction metaphors
    (Diagram: Input → Physical Elements → Interaction Metaphor → Virtual Elements → Output)
46. AR Design Space
    Reality ↔ Augmented Reality ↔ Virtual Reality
    Physical design ↔ Virtual design

47. Design of Objects
    • Objects:
      - Purposely built – affordances
      - "Found" – repurposed
      - Existing – already in use in the marketplace
    • Affordance:
      - The quality of an object allowing an action-relationship with an actor
      - An attribute of an object that allows people to know how to use it – e.g. a door handle affords pulling

48. Affordance Led Design
    • Make affordances perceivable – provide visual, haptic, tactile, auditory cues
    • Affordance led usability:
      - Give feedback
      - Provide constraints
      - Use natural mapping
      - Use a good cognitive model

49. Example: AR Chemistry
    • Tangible AR chemistry education (Fjeld)
    Fjeld, M., Juchli, P., and Voegtli, B. M. 2003. Chemistry education: A tangible interaction approach. Proceedings of INTERACT 2003, September 1st-5th 2003, Zurich, Switzerland.
50. Input Devices
    • Form informs function and use

51. AR Interaction Metaphors
    • Information browsing – view AR content
    • 3D AR interfaces – 3D UI interaction techniques
    • Augmented surfaces – tangible UI techniques
    • Tangible AR – tangible UI input + AR output

52. 1. Information Browsing
    • Information is registered to a real-world context
    • Handheld AR displays
    • Interaction – manipulation of a window into the information space
    • Applications – context-aware information displays
    (Rekimoto, et al. 1997)

53. 2. 3D AR Interfaces
    • Virtual objects displayed in 3D physical space and manipulated
      - HMDs and 6DOF head-tracking
      - 6DOF hand trackers for input
    • Interaction – viewpoint control; traditional 3D user interface interaction: manipulation, selection, etc.
    (Kiyokawa, et al. 2000)

54. 3. Augmented Surfaces
    • Basic principles:
      - Virtual objects are projected on a surface
      - Physical objects are used as controls for virtual objects
      - Support for collaboration
    • Rekimoto, et al. 1998 – front projection, marker-based tracking, multiple projection surfaces

55. Lessons from Tangible Interfaces
    • Physical objects make us smart (Norman's "Things that Make Us Smart") – encode affordances, constraints
    • Objects aid collaboration – establish shared meaning
    • Objects increase understanding – serve as cognitive artifacts
56. TUI Limitations
    • Difficult to change object properties – can't tell the state of digital data
    • Limited display capabilities – projection screen = 2D, dependent on a physical display surface
    • Separation between object and display – e.g. augmented surfaces

57. 4. Tangible AR Metaphor
    • AR overcomes limitations of TUIs:
      - Enhances display possibilities
      - Merges task/display space
      - Provides public and private views
    • TUI + AR = Tangible AR – apply TUI methods to AR interface design
58. Tangible AR Demo
    • Use of natural physical object manipulations to control virtual objects
    • VOMAR demo:
      - Catalog book – turn over the page
      - Paddle operation – push, shake, incline, hit, scoop

59. Object Based Interaction: MagicCup
    • Intuitive virtual object manipulation on a table-top workspace
    • Time multiplexed
    • Multiple markers – robust tracking
    • Tangible user interface – intuitive manipulation
    • Stereo display – good presence

60. Tangible AR Design Principles
    Tangible AR interfaces use TUI principles:
    • Physical controllers for moving virtual content
    • Support for spatial 3D interaction techniques
    • Time and space multiplexed interaction
    • Support for multi-handed interaction
    • Match object affordances to task requirements
    • Support parallel activity with multiple objects
    • Allow collaboration between multiple users

61. Example 1: AR Lens
    • Physical components – lens handle; virtual lens attached to a real object
    • Display elements – lens view; reveals layers in the dataset
    • Interaction metaphor – physically holding a lens

62. Example 2: LevelHead
    • Physical components – real blocks
    • Display elements – virtual person and rooms
    • Interaction metaphor – blocks are rooms

63. Know Future Research Directions

64. The Vision of AR

65. To Make the Vision Real...
    • Hardware/software requirements:
      - Contact lens displays
      - Free space hand/body tracking
      - Speech/gesture recognition
      - Etc.
    • Most importantly: usability/user experience

66. Natural Interaction
    • Automatically detecting the real environment – environmental awareness, physically based interaction
    • Gesture input – free-hand interaction
    • Multimodal input – speech and gesture interaction
    • Implicit rather than explicit interaction

67. AR MicroMachines
    • AR experience with environment awareness and physically-based interaction
    • Based on the MS Kinect RGB-D sensor
    • Augmented environment supports occlusion, shadows, and physically-based interaction between real and virtual objects
68. Physics Simulation
    • Create a virtual mesh over the real world
    • Update at 10 fps – so real objects can be moved
    • Used by the physics engine for collision detection (virtual/real)
    • Used by OpenSceneGraph for occlusion and shadows
    (A code sketch of this loop follows.)
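As a rough sketch of how those pieces fit together, the loop below uses placeholder objects for the depth-to-mesh step, the physics engine and the scene graph; it is an illustration of the idea, not the actual AR MicroMachines code.

    # Environment-aware AR update sketch: one mesh of the real world feeds both physics and rendering.
    MESH_UPDATE_HZ = 10

    def update_world(depth_frame, build_mesh, physics, scenegraph, virtual_objects, now, last_mesh_time):
        if now - last_mesh_time >= 1.0 / MESH_UPDATE_HZ:       # mesh refreshed at ~10 fps, so real objects can move
            mesh = build_mesh(depth_frame)                     # virtual mesh over the real world
            physics.set_static_collision_mesh(mesh)            # collisions between virtual objects and real geometry
            scenegraph.set_occluder_and_shadow_receiver(mesh)  # same mesh drives occlusion and shadows
            last_mesh_time = now
        physics.step()                                         # virtual objects respond to the real surfaces
        for obj in virtual_objects:
            scenegraph.set_transform(obj.node, physics.get_pose(obj.body))
        return last_mesh_time

Reusing a single real-world mesh for both collision detection and rendering is what keeps the physical and visual behaviour consistent.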
69. Rendering
    • Occlusion
    • Shadows

70. Gesture Input Architecture
    1. Hardware interface
    2. Segmentation
    3. Classification/tracking
    4. Modeling – hand recognition/modeling, rigid-body modeling
    5. Gesture – static gestures, dynamic gestures, context based gestures
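Read from the hardware interface upward, the architecture is a linear pipeline from raw sensor data to recognized gestures. The sketch below wires the stages together as plain Python callables; the stage implementations themselves are placeholders for whatever segmentation, tracking and modeling methods are actually used.

    # Gesture input pipeline sketch: each stage consumes the previous stage's output.
    def run_gesture_pipeline(depth_frame, segment, classify_and_track, build_hand_model, recognize):
        hand_regions = segment(depth_frame)               # 2. segmentation of hand pixels from the depth image
        tracked_hands = classify_and_track(hand_regions)  # 3. classification/tracking across frames
        hand_model = build_hand_model(tracked_hands)      # 4. hand recognition / rigid-body modeling
        return recognize(hand_model)                      # 5. static, dynamic or context based gestures

Because each stage is an independent callable, any one of them can be swapped out without touching the rest, which is the main benefit of the layered design.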
71. Results

72. Free Hand Multimodal Input
    • Use free hands to interact with AR content
    • Recognize simple gestures – point, move, pick/drop
    • No marker tracking

73. Multimodal Architecture

74. Multimodal Fusion
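The slides do not spell out the fusion method, but a common approach for combining speech and pointing gestures of the kind listed above is time-window fusion: pair each recognized speech command with the gesture event closest to it in time. The sketch below is a generic, hypothetical illustration of that idea, not the system shown in the talk.

    # Time-window multimodal fusion sketch: attach the nearest gesture target to each speech command.
    def fuse(speech_events, gesture_events, window=1.5):
        """Each event is (timestamp_seconds, payload). Returns fused (command, target) pairs."""
        fused = []
        for t_speech, command in speech_events:                  # e.g. "put that there"
            candidates = [(abs(t_gesture - t_speech), target)
                          for t_gesture, target in gesture_events
                          if abs(t_gesture - t_speech) <= window]
            if candidates:                                       # a pointing gesture close enough in time?
                best = min(candidates, key=lambda c: c[0])
                fused.append((command, best[1]))
        return fused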
75. Hand Occlusion

76. Conclusion

77. Conclusion
    • There is a need for better designed AR experiences, through:
      - Use of interaction design principles
      - Understanding of the technology
      - Use of rapid prototyping tools
      - Rigorous user evaluation
    • There are a number of important areas for future research – natural interaction, multimodal interfaces, intelligent agents, ...
78. More Information
    • Mark Billinghurst – mark.billinghurst@hitlabnz.org
    • Websites – www.hitlabnz.org

79. Resources