CHI 2013 DARE Course
CHI 2013 Course on Designing Augmented Reality Experiences. Taught by Mark Billinghurst and Henry Duh at the CHI 2013 Conference - April, 2013

CHI 2013 DARE Course – Presentation Transcript

  • Billinghurst and Duh 1 – Designing Augmented Reality Experiences. Mark Billinghurst, University of Canterbury, Christchurch, New Zealand; Henry B.L. Duh, National University of Singapore, Singapore. courses@chi2013.com, http://chi2013.acm.org/. Copyright is held by Billinghurst & Duh. CHI 2013, April 27–May 2, 2013, Paris, France. ACM 13/04
  • Billinghurst and Duh 2 – Introduction
  • Billinghurst and Duh 3 – Instructors. Mark Billinghurst: Director of HIT Lab NZ, University of Canterbury; degrees in Electrical Engineering, Applied Mathematics; research on collaborative AR, mobile AR, AR usability; more than 250 papers in AR, VR, interface design. Henry Duh: Co-director of the Keio-NUS Joint International Research (CUTE) Center; degrees in Psychology, Industrial Design and Engineering; research on interaction design and AR applications; more than 80 papers in HCI, AR and design.
  • Billinghurst and Duh 4 – How Would You Design This? [Put nice AR picture here – and video]
  • Billinghurst and Duh 5 – Or This?
  • Billinghurst and Duh 6 – What You Will Learn: how to design effective AR experiences; understanding AR interaction design possibilities; hardware and software tools for rapid prototyping of AR applications; effective evaluation methods for AR applications; current areas of AR research that will contribute to future AR experiences; hands-on experience with AR applications; resources for your own research.
  • Billinghurst and Duh 7 – Course Agenda: Introduction [Mark]; AR and the Interaction Design Process [Mark]; Design Guidelines and Interaction Metaphors for AR [Mark]; AR Development/Prototyping Tools [Mark]; Afternoon Tea – Demos [Mark and Henry]; AR Evaluation Methods [Henry]; AR Design Case Studies [Henry]; AR Research Directions [Mark].
  • Billinghurst and Duh 8 – Course Demos. AR authoring: BuildAR, Metaio Creator. AR browsers: Junaio, Layar, Wikitude. AR gaming: Elite CommandAR, Transformers, etc. Marker-based handheld AR: NASA and CCDU. Outdoor AR: CityViewAR. Displays: Vuzix, Google Glass.
  • Billinghurst and Duh 9 – Course Motivation: AR needs good interaction design. AR is increasingly popular, but ergonomics, design and social issues need to be addressed. There is a need for a deeper understanding of how to uncover, design, build and evaluate effective AR experiences. AR authoring tools are making it easier than ever before to build an AR experience, but there are few design guidelines. Many AR applications are being developed, but there is little formal evaluation being conducted. AR experiences are being delivered without an understanding of the interaction design/experience design process.
  • Billinghurst and Duh 10 – What is Augmented Reality? Defining characteristics (Azuma 97): combines real and virtual images (both can be seen at the same time); interactive in real time (the virtual content can be interacted with); registered in 3D (virtual objects appear fixed in space). Azuma, R., A Survey of Augmented Reality, Presence, Vol. 6, No. 4, August 1997, pp. 355-385.
  • Billinghurst and Duh 11 – From Science Fiction to Fact: 1977 – Star Wars; 2008 – CNN.
  • Billinghurst and Duh 12 – AR is Part of the MR Continuum. Reality–Virtuality (RV) continuum: real environment – augmented reality (AR) – augmented virtuality (AV) – virtual environment; mixed reality is "...anywhere between the extrema of the virtuality continuum." P. Milgram and A. F. Kishino, A Taxonomy of Mixed Reality Visual Displays, IEICE Transactions on Information and Systems, E77-D(12), pp. 1321-1329, 1994.
  • Billinghurst and Duh 13 – AR History. 1960s–80s: early experimentation (military, academic labs). 1980s–90s: basic research (tracking, displays). 1995–2005: tools/applications (interaction, usability, theory). 2005– : commercial applications (games, medical, industry, mobile).
  • Billinghurst and Duh 14 – Core Technologies. Combining real and virtual images: display technologies. Interactive in real time: input and interaction technologies. Registered in 3D: viewpoint tracking technologies. (Display, Processing, Input, Tracking)
  • Billinghurst and Duh 15 – Display Technologies. Types (Bimber/Raskar 2003): head attached (head mounted display/projector); body attached (handheld display/projector); spatial (spatially aligned projector/monitor). HMDs: optical vs. video see-through. Optical: direct view of the real world -> safer, simpler. Video: video overlay -> more image registration options.
  • Billinghurst and Duh 16 – Display Taxonomy
  • Billinghurst and Duh 17 – Input Technologies: tangible objects (tracked items); touch (HHD) – glove, touch; gesture – glove, free-hand; speech/multimodal; device motion – HHD + sensors.
  • Billinghurst and Duh 18 – Tracking Technologies. Active: mechanical, magnetic, ultrasonic; GPS, WiFi, cell location. Passive: inertial sensors (compass, accelerometer, gyro); computer vision – marker based, natural feature tracking. Hybrid tracking: combined sensors (e.g. vision + inertial).
  • Billinghurst and Duh 19 – Typical AR Experiences. Web-based AR: Flash, HTML5-based AR; marketing, education. Outdoor mobile AR: GPS, compass tracking; viewing points of interest in the real world; e.g. Junaio, Layar, Wikitude. Handheld AR: vision-based tracking; marketing, gaming. Location-based experiences: HMDs, fixed screens; museums, point of sale, advertising.
  • Billinghurst and Duh 20 – AR Becoming Big Business. Marketing: web-based, mobile. Mobile AR: geo-located information and services; driving demand for high-end phones. Gaming: mobile, physical input (Kinect). Upcoming areas: manufacturing, medical, military. Rapid growth: market projected to grow 53% from 2012 to 2016; over $5 billion USD in mobile AR alone by 2017.
  • Billinghurst and Duh 21 – Mobile AR Market Size
  • Billinghurst and Duh 22 – Commercial AR Companies. ARToolworks (http://www.artoolworks.com/): ARToolKit, FLARToolKit, SDKs. Metaio (http://www.metaio.com/): marketing, industry, SDKs. Total Immersion (http://www.t-immersion.com/): marketing, theme parks, AR experiences. Qualcomm (http://developer.qualcomm.com/dev/augmented-reality): mobile AR, Vuforia SDK. Many small start-ups (String, Ogmento, etc.).
  • Billinghurst and Duh 23 – The Interaction Design Process
  • Billinghurst and Duh 24 – "The product is no longer the basis of value. The experience is." Venkat Ramaswamy, The Future of Competition.
  • Billinghurst and Duh 25 – Gilmore + Pine: Experience Economy. Value ladder from function to emotion: components, products, services, experiences.
  • Billinghurst and Duh 26 – Designing AR Experiences: components (tracking, display), tools (authoring), applications (interaction), experiences (usability).
  • Billinghurst and Duh 27 – The Value of Good User Experience: 20c, 50c, $3.50.
  • Billinghurst and Duh 28 – Good Experience Design: Reactrix. Top-down projection; camera-based input; reactive graphics; no instructions; no training.
  • Billinghurst and Duh 29 – Apple: The Value of Good Design. Good experience design dominates markets. iPod sales 2002-2007.
  • Billinghurst and Duh 30 – Nokia N-Gage. Great idea – bad experience design. Good: handheld gaming + phone. Bad: you look like a dork using it. See http://www.sidetalkin.com
  • Billinghurst and Duh 31 – Interaction Design: the design of user experience with technology. "Designing interactive products to support people in their everyday and working lives" (Preece, J., 2002, Interaction Design). Answering three questions: What do you do? – How do you affect the world? What do you feel? – What do you sense of the world? What do you know? – What do you learn?
  • Billinghurst and Duh 32 – Interaction Design is All About You. Users should be involved throughout the design process. Consider all the needs of the user, especially the context of use.
  • Billinghurst and Duh 33 – Interaction Design Process
  • Billinghurst and Duh 34 – Gabbard Model for AR Design: 1. user task analysis; 2. expert guidelines-based evaluation; 3. formative user-centered evaluation; 4. summative comparative evaluations. Gabbard, J.L.; Swan, J.E., "Usability Engineering for Augmented Reality: Employing User-Based Studies to Inform Design," IEEE Transactions on Visualization and Computer Graphics, vol. 14, no. 3, pp. 513-525, May-June 2008.
  • Billinghurst and Duh 35 – Gabbard Model in Context
  • Billinghurst and Duh 36 – Design Guidelines for AR
  • Billinghurst and Duh 37 – The Interaction Design Process
  • Billinghurst and Duh 38 – AR Interaction Design. Designing an AR system = interface design, using different input and output technologies. The objective is a high quality of user experience: ease of use and learning; performance and satisfaction.
  • Billinghurst and Duh 39 – Design Considerations. Combining real and virtual images: perceptual issues. Interactive in real time: interaction issues. Registered in 3D: technology issues.
  • Billinghurst and Duh 40 – AR Design Elements. Interface components: physical components; display elements (visual/audio); interaction metaphors. (Physical elements as input, display elements as output, linked by the interaction metaphor.)
  • Billinghurst and Duh 41 – AR UI Design: consider your user; follow good HCI principles; adapt HCI guidelines for AR; design to device constraints; use design patterns to inform design; design for your interface metaphor; design for evaluation.
  • Billinghurst and Duh 42 – Consider Your User. Consider the context of the user: physical, social, emotional, cognitive, etc. A mobile phone AR user is probably mobile; uses one-hand interaction; uses the application for short periods; needs to be able to multitask; uses it in outdoor or indoor environments; wants to enhance interaction with the real world.
  • Billinghurst and Duh 43 – Good HCI Principles: affordance; reducing cognitive overload; low physical effort; learnability; user satisfaction; flexibility in use; responsiveness and feedback; error tolerance.
  • Billinghurst and Duh 44 – Norman's Principles of Good Practice. Ensure a high degree of visibility: allow the user to work out the current state of the system and the range of actions possible. Provide feedback: continuous, clear information about the results of actions. Present a good conceptual model: allow the user to build up a picture of the way the system holds together, the relationships between its different parts and how to move from one state to the next. Offer good mappings: aim for clear, natural relationships between the actions the user performs and the results they achieve.
  • Billinghurst and Duh 45 – Adapting Existing Guidelines. Mobile phone AR: phone HCI guidelines; mobile HCI guidelines. HMD-based AR: 3D user interface guidelines; VR interface guidelines. Desktop AR: desktop UI guidelines.
  • Billinghurst and Duh 46 – iPhone Guidelines: make it obvious how to use your content; avoid clutter, unused blank space, and busy backgrounds; minimize required user input; express essential information succinctly; provide a fingertip-sized target area for all links and controls; avoid unnecessary interactivity; provide feedback when necessary.
  • Billinghurst and Duh 47 – Applying Principles to Mobile AR: clean large video view; large icons; text overlay; feedback.
  • Billinghurst and Duh 48 – AR vs. Non-AR Design. Design guidelines: design for 3D graphics + interaction; consider elements of the physical world; support implicit interaction. Comparison of characteristics (non-AR interfaces vs. AR interfaces): object graphics – mainly 2D vs. mainly 3D; object types – mainly virtual objects vs. both virtual and physical objects; object behaviors – mainly passive objects vs. both passive and active objects; communication – mainly simple vs. mainly complex; HCI methods – mainly explicit vs. both explicit and implicit.
  • Billinghurst and Duh 49 – Maps vs. Junaio. Google Maps: 2D, mouse driven, text/image heavy, exocentric. Junaio: 3D, location driven, simple graphics, egocentric.
  • Billinghurst and Duh 50 – Design to Device Constraints. Understand the platforms used and design for their limitations (hardware and software platforms). E.g. a handheld AR game with visual tracking (Art of Defense game): use large screen icons; consider screen reflectivity; support one-hand interaction; consider the natural viewing angle; do not tire users out physically; do not encourage fast actions; keep at least one tracking surface in view.
  • Billinghurst and Duh 51 – Handheld AR Constraints/Affordances. Camera and screen are linked: fast motions are a problem when looking at the screen; intuitive "navigation". Phone in hand: two-handed activities can be awkward or intuitive; extended periods of holding the phone are tiring; awareness of the surrounding environment. Small screen: extended periods of looking at the screen are tiring; in general a small, awkward platform. Vibration and sound can provide feedback when looking elsewhere. Networking (Bluetooth, 802.11): collaboration possible. Guaranteed minimum collection of buttons. Sensors often available: GPS, camera, accelerometer, compass, etc.
  • Billinghurst and Duh 52 – Design Patterns. "Each pattern describes a problem which occurs over and over again in our environment, and then describes the core of the solution to that problem in such a way that you can use this solution a million times over, without ever doing it the same way twice." – Christopher Alexander et al. Use design patterns to address recurring problems. C.A. Alexander, A Pattern Language, Oxford Univ. Press, New York, 1977.
  • Billinghurst and Duh 53 – Handheld AR Design Patterns (title – meaning – embodied skills): Device Metaphors – using metaphor to suggest available player actions – body A&S, naive physics. Control Mapping – intuitive mapping between physical and digital objects – body A&S, naive physics. Seamful Design – making sense of and integrating the technological seams through game design – body A&S. World Consistency – whether the laws and rules of the physical world hold in the digital world – naive physics, environmental A&S. Landmarks – reinforcing the connection between digital and physical space through landmarks – environmental A&S. Personal Presence – the way a player is represented in the game decides how much they feel like they are living in the digital game world – environmental A&S, naive physics. Living Creatures – game characters that are responsive to physical and social events and mimic the behaviours of living beings – social A&S, body A&S. Body Constraints – movement of one's body position constrains another player's action – body A&S, social A&S. Hidden Information – information that can be hidden and revealed can foster emergent social play – social A&S, body A&S.
  • Billinghurst and Duh 54 – Example: Seamless Design. Design to reduce seams in the user experience, e.g. AR tracking failure or a change in interaction mode. Paparazzi game: changes between AR tracking and accelerometer input. Yan Xu et al., Pre-patterns for designing embodied interactions in handheld augmented reality games, Proceedings of the 2011 IEEE International Symposium on Mixed and Augmented Reality – Arts, Media, and Humanities, pp. 19-28, October 26-29, 2011.
  • Billinghurst and Duh 55 – Example: Living Creatures. Virtual creatures should respond to real-world events (e.g. player motion, wind, light), creating the illusion that the creatures are alive in the real world. Sony EyePet: responds to the player blowing on the creature.
  • Billinghurst and Duh 56 – Physical Elements
  • Billinghurst and Duh 57 – AR Design Space: reality – augmented reality – virtual reality; physical design – virtual design.
  • Billinghurst and Duh 58 – Design of Objects. Objects: purposely built – affordances; "found" – repurposed; existing – already in use in the marketplace. Affordance: the quality of an object allowing an action relationship with an actor; an attribute of an object that allows people to know how to use it, e.g. a door handle affords pulling.
  • Billinghurst and Duh 59 – Norman on Affordances: "...the term affordance refers to the perceived and actual properties of the thing, primarily those fundamental properties that determine just how the thing could possibly be used. [...] Affordances provide strong clues to the operations of things. Plates are for pushing. Knobs are for turning. Slots are for inserting things into. Balls are for throwing..." (Norman, The Psychology of Everyday Things, 1988, p. 9)
  • Billinghurst and Duh 60 – Physical vs. Virtual Affordances. Physical affordances: physical and material aspects of a real object. Virtual affordances: visual and perceived aspects of digital objects. AR is a mixture of physical and virtual affordances: physical – tangible controllers and objects; virtual – virtual graphics and audio.
  • Billinghurst and Duh 61 – Affordance Framework. William W. Gaver. 1991. Technology affordances. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 91), Scott P. Robertson, Gary M. Olson, and Judith S. Olson (Eds.). ACM, New York, NY, USA, 79-84.
  • Billinghurst and Duh 62 – Affordance-Led Design. Make affordances perceivable: provide visual, haptic, tactile and auditory cues. Affordance-led usability: give feedback; provide constraints; use natural mapping; use a good cognitive model.
  • Billinghurst and Duh 63 – Example: AR Chemistry. Tangible AR chemistry education (Fjeld). Fjeld, M., Juchli, P., and Voegtli, B. M. 2003. Chemistry education: A tangible interaction approach. Proceedings of INTERACT 2003, September 1-5, 2003, Zurich, Switzerland.
  • Billinghurst and Duh 64 – Input Devices. Form informs function and use.
  • Billinghurst and Duh 65 – Picking up an Atom
  • Billinghurst and Duh 66 – AR Interaction Metaphors
  • Billinghurst and Duh 67 Interface Components•  Physical components•  Display elements– Visual/audio•  Interaction metaphorsPhysicalElementsDisplayElementsInteractionMetaphorInput OutputAR Design Principles
  • Billinghurst and Duh 68Interaction Tasks 2D (from [Foley]):•  Selection, Text Entry, Quantify, Position 3D (from [Bowman]):•  Navigation (Travel/Wayfinding)•  Selection•  Manipulation•  System Control/Data Input AR: 2D + 3D Tasks and.. more specific tasks?[Foley] The Human Factors of Computer Graphics InteractionTechniques Foley, J. D.,V.Wallace & P. Chan. IEEEComputer Graphics and Applications (Nov.): 13-48. 1984.[Bowman]: 3D User Interfaces:Theory and Practice D. Bowman, E. Kruijff, J. Laviola, I. Poupyrev Addison Wesley 2005
  • Billinghurst and Duh 69AR Interaction Metaphors Viewpoint Control Information Browsing•  establish shared meaning 3D AR Interfaces•  establish shared meaning Augmented Surfaces•  serve as cognitive artifacts Tangible AR•  serve as cognitive artifacts
  • Billinghurst and Duh 701. Viewpoint Control 2D/3D virtual objects areregistered in 3D•  “VR in Real World” Interaction•  2D/3D virtual viewpoint control Applications•  Visualization, training
  • Billinghurst and Duh 712. Information Browsering Information is registered toreal-world context•  Hand held AR displays Interaction•  Manipulation of a windowinto information space Applications•  Context-aware informationdisplaysRekimoto, et al. 1997
  • Billinghurst and Duh 723. 3D AR Interfaces Virtual objects displayed in 3Dphysical space and manipulated•  HMDs and 6DOF head-tracking•  6DOF hand trackers for input Interaction•  Viewpoint control•  Traditional 3D user interfaceinteraction: manipulation,selection, etc.Kiyokawa, et al. 2000
  • Billinghurst and Duh 734. Augmented Surfaces Basic principles•  Virtual objects are projected on a surface•  Physical objects are used as controls forvirtual objects•  Support for collaboration Rekimoto, et al. 1998•  Front projection•  Marker-based tracking•  Multiple projection surfaces
  • Billinghurst and Duh 745. Tangible User Interfaces Create digital shadows forphysical objects Foreground•  graspable UI Background•  ambient interfaces
  • Billinghurst and Duh 75Lessons from TangibleInterfaces Physical objects make us smart•  Norman’s “Things that Make Us Smart”•  encode affordances, constraints Objects aid collaboration•  establish shared meaning Objects increase understanding•  serve as cognitive artifacts
  • Billinghurst and Duh 76TUI Limitations Difficult to change object properties•  Can’t tell state of digital data Limited display capabilities•  projection screen = 2D•  dependent on physical display surface Separation between object and display•  Augmented Surfaces
  • Billinghurst and Duh 77Tangible AR Metaphor AR overcomes limitation of TUIs•  enhance display possibilities•  merge task/display space•  provide public and private views TUI + AR = Tangible AR•  Apply TUI methods to AR interface design
  • Billinghurst and Duh 78 Space-multiplexed•  Many devices each with one function–  Quicker to use, more intuitive, clutter–  Real Toolbox Time-multiplexed•  One device with many functions–  Space efficient–  mouse
  • Billinghurst and Duh 79Tangible AR: Tiles (SpaceMultiplexed) Tiles semantics•  data tiles•  operation tiles Operation on tiles•  proximity•  spatial arrangements•  space-multiplexed
  • Billinghurst and Duh 80Tangible AR: Time-multiplexed Interaction Use of natural physical object manipulations to controlvirtual objects VOMAR Demo•  Catalog book:–  Turn over the page•  Paddle operation:–  Push, shake, incline, hit, scoop
  • Billinghurst and Duh 81Object Based Interaction:MagicCup Intuitive Virtual Object Manipulationon a Table-Top Workspace•  Time multiplexed•  Multiple Markers–  Robust Tracking•  Tangible User Interface–  Intuitive Manipulation•  Stereo Display–  Good Presence
  • Billinghurst and Duh 82
  • Billinghurst and Duh 83 – Tangible AR Design Principles. Tangible AR interfaces use TUI principles: physical controllers for moving virtual content; support for spatial 3D interaction techniques; time- and space-multiplexed interaction; support for multi-handed interaction; match object affordances to task requirements; support parallel activity with multiple objects; allow collaboration between multiple users.
  • Billinghurst and Duh 84 – Interaction with Handheld AR. Embodied interaction: focuses on the device itself; touch, gesture, orientation, etc. Tangible interaction: direct manipulation of known objects; tracking objects. Egocentric vs. exocentric interaction: egocentric – inside out (e.g. outdoor AR browsing); exocentric – outside in (e.g. marker-based AR).
  • Billinghurst and Duh 85 – Handheld AR Metaphors (diagram: handheld AR, wearable AR; output: display; input; input & output).
  • Billinghurst and Duh 86 – Handheld Interface Metaphors. Tangible AR lens viewing: look through the screen into the AR scene; interact with the screen to interact with the AR content (e.g. Invisible Train). Tangible AR lens manipulation: select an AR object and attach it to the device; use the motion of the device as input (e.g. AR Lego).
  • Billinghurst and Duh 87 – Case Study 1: 3D AR Lens. Goal: develop a lens-based AR interface. MagicLenses: developed at Xerox PARC in 1993; view a region of the workspace differently from the rest; overlap MagicLenses to create composite effects.
  • Billinghurst and Duh 88 – 3D MagicLenses. MagicLenses extended to 3D (Veiga et al. 96). Volumetric and flat lenses.
  • Billinghurst and Duh 89 – AR Lens Design Principles. Physical components: lens handle – virtual lens attached to a real object. Display elements: lens view – reveals layers in the dataset. Interaction metaphor: physically holding the lens.
  • Billinghurst and Duh 90 – Case Study 2: LevelHead. Physical components: real blocks. Display elements: virtual person and rooms. Interaction metaphor: blocks are rooms.
  • Billinghurst and Duh 91 – AR Perceptual + Cognitive Issues
  • Billinghurst and Duh 92 – AR and Perception. Creating the illusion that virtual images are seamlessly part of the real world: real and virtual cues must match – depth, occlusion, lighting, shadows...
  • Billinghurst and Duh 93 – AR as a Perception Problem. The goal of AR is to fool the human senses – to create the illusion that real and virtual are merged. Depth cues: size, occlusion, shadows, relative motion, etc.
  • Billinghurst and Duh 94 – Perceptual Issues. The central goal of AR systems is to fool the human perceptual system. Display modes: direct view; stereo video; stereo graphics. Multi-modal display: different objects with different display modes; potential for depth cue conflict. D. Drascic and P. Milgram. Perceptual issues in augmented reality. In M. T. Bolas, S. S. Fisher, and J. O. Merritt, editors, SPIE Volume 2653: Stereoscopic Displays and Virtual Reality Systems III, pages 123-134, January/February 1996.
  • Billinghurst and Duh 95 – Perceptual Issues. Combining multiple display modes: direct view, stereo video view, graphics view. Conflict between display modes: mismatch between depth cues.
  • Billinghurst and Duh 96 – Perceptual Issues: static and dynamic registration mismatch; restricted field of view; mismatch of resolution and image clarity; luminance mismatch; contrast mismatch; size and distance mismatch; limited depth resolution; vertical alignment mismatches; viewpoint dependency mismatch.
  • Billinghurst and Duh 97 – Types of Perceptual Issues. Environment: issues related to the environment itself. Capturing: issues related to digitizing the environment. Augmentation: issues related to the design, layout and registration of AR content. Display device: technical issues associated with the display device. User: issues associated with the user perceiving content. E. Kruijff, J. E. Swan, and S. Feiner. Perceptual issues in augmented reality revisited. 9th IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2010, pp. 3-12.
  • Billinghurst and Duh 98 – Depth Cues. Pictorial (visual cues): occlusion, texture, relative brightness. Kinetic (motion cues): relative motion parallax, motion perspective. Physiological (motion cues): convergence, accommodation. Binocular disparity: two different eye images.
  • Billinghurst and Duh 99
  • Billinghurst and Duh 100 – Depth Perception
  • Billinghurst and Duh 101 – Occlusion Handling
  • Billinghurst and Duh 102 – Cognitive Issues in AR. Three categories of issues: information presentation – displaying virtual information on the real world; physical interaction – content creation, manipulation and navigation in AR; shared experience – collaboration and supporting common experiences in AR. Li, Nai, and Henry Been-Lirn Duh. "Cognitive Issues in Mobile Augmented Reality: An Embodied Perspective." Human Factors in Augmented Reality Environments. Springer New York, 2013. 109-135.
  • Billinghurst and Duh 103 – Information Presentation: amount of information – clutter, complexity; representation of information – navigation cues, POI representation; placement of information – head, body or world stabilized; view combination – multiple views.
  • Billinghurst and Duh 104 – Twitter 360 (www.twitter-360.com). iPhone application. See geo-located tweets in the real world. Twitter.com supports geo-tagging.
  • Billinghurst and Duh 105 – Wikitude (www.mobilizy.com): screenshot filled with repeated "Blah" labels.
  • Billinghurst and Duh 106 – Information Filtering
  • Billinghurst and Duh 107 – Information Filtering
  • Billinghurst and Duh 108 – Physical Interaction: navigation; direct manipulation; embodied vs. tangible; multimodal interaction; content creation.
  • Billinghurst and Duh 109 – Outdoor AR: Limited FOV
  • Billinghurst and Duh 110 – Possible Solutions. Overview + detail: spatial separation; two views. Focus + context: merges both views into one view. Zooming: temporal separation.
  • Billinghurst and Duh 111 – Zooming Views. TU Graz – HIT Lab NZ collaboration: zooming panorama; zooming map.
  • Billinghurst and Duh 112 – Gesture-Based Interaction. HMD-based AR frees the user's hands: natural hand-based interaction; intuitive manipulation – low cognitive load. Example: Tinmith-Hand – two-hand manipulation of 3D models.
  • Billinghurst and Duh 113 – Shared Experiences: social context; bodily configuration; artifact manipulation; display space.
  • Billinghurst and Duh 114 – TAT Augmented ID
  • Billinghurst and Duh 115
  • Billinghurst and Duh 116
  • Billinghurst and Duh 117 – Designing for Children. Developmental psychology factors: motor abilities; spatial abilities; logic abilities; attention abilities. Radu, Iulian, and Blair MacIntyre. "Using children's developmental psychology to guide augmented-reality design and usability." Mixed and Augmented Reality (ISMAR), 2012 IEEE International Symposium on. IEEE, 2012.
  • Billinghurst and Duh 118 – Motor Abilities (skill type – challenging AR interaction): multiple hand coordination – holding the phone in one hand and using the other hand to move a marker; hand-eye coordination – using a marker to intercept a moving object; fine motor skills – moving a marker along a specified path; gross motor skills and endurance – turning the body around to look at a panorama.
  • Billinghurst and Duh 119 – Spatial Abilities (skill type – challenging AR interaction): spatial memory – remembering the configuration of a large virtual space while the handheld screen shows a limited view; spatial perception – understanding when a virtual item is on top of a physical item; spatial visualization – predicting when virtual objects are visible to other people or virtual characters.
  • Billinghurst and Duh 120 – Attention and Logic. Attention abilities (skill type – challenging AR interaction): divided attention – playing an AR game while making sure to keep the marker in view so tracking is not lost; selective and executive attention – playing an AR game while moving outdoors. Logic and memory: remembering and reversing – remembering how to recover from tracking loss; abstract over concrete thinking – understanding that virtual objects are computer generated and do not need to obey physical laws.
  • Billinghurst and Duh 121 – AR Development Tools
  • Billinghurst and Duh 122 – AR Authoring Tools. Low-level software libraries: osgART, Studierstube, MXRToolKit. Plug-ins to existing software: DART (Macromedia Director), mARx, Unity. Stand-alone: AMIRE, BuildAR, Metaio Creator, etc. Rapid prototyping tools: Flash, OpenFrameworks, Processing, Arduino, etc. Next generation: iaTAR (Tangible AR).
  • Billinghurst and Duh 123 – ARToolKit (Kato 1998). Open source, computer vision based AR tracking. http://artoolkit.sourceforge.net/
  • Billinghurst and Duh 124 – ARToolKit Structure. Three key libraries: AR32.lib – ARToolKit image processing functions; ARgsub32.lib – ARToolKit graphics functions; ARvideo.lib – DirectShow video capture class.
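For orientation, a minimal sketch of how these libraries fit together in a classic ARToolKit 2.x tracking loop is shown below. The camera parameter file, pattern file, marker size and threshold are illustrative assumptions, and setup/error handling is heavily simplified.

    // Sketch: classic ARToolKit 2.x capture/detect/pose loop (setup simplified).
    #include <AR/ar.h>
    #include <AR/video.h>

    int main() {
        ARParam cparam;
        arParamLoad("Data/camera_para.dat", 1, &cparam);   // camera calibration (placeholder path)
        arInitCparam(&cparam);

        int pattId = arLoadPatt("Data/patt.hiro");          // marker pattern to recognise

        char config[] = "";
        arVideoOpen(config);                                // default capture device (ARvideo.lib)
        arVideoCapStart();

        for (;;) {
            ARUint8* image = arVideoGetImage();             // grab a frame
            if (!image) continue;

            ARMarkerInfo* markers; int markerNum;
            arDetectMarker(image, 100, &markers, &markerNum);   // image processing (AR32.lib)

            for (int i = 0; i < markerNum; ++i) {
                if (markers[i].id != pattId) continue;
                double centre[2] = {0.0, 0.0};
                double trans[3][4];
                arGetTransMat(&markers[i], centre, 80.0, trans); // pose for an 80 mm marker
                // trans now holds the camera-to-marker transform; ARgsub32 would
                // convert it to an OpenGL matrix for rendering the virtual content.
            }
            arVideoCapNext();
        }
        return 0;
    }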
  • Billinghurst and Duh 125 – Software. Cross platform: Windows, Mac, Linux, IRIX, Symbian, iPhone, etc. Additional basic libraries: video capture library (Video4Linux, VisionSDK); OpenGL, GLUT. Requires a rendering library: Open VRML, Open Inventor, osgART, etc.
  • Billinghurst and Duh 126 – osgART Programming Library. Integration of ARToolKit with a high-level rendering engine (OpenSceneGraph): osgART = OpenSceneGraph + ARToolKit. Supports geometric + photometric registration.
  • Billinghurst and Duh 127 – osgART Approach: AR Scene Graph. The root has a video layer (a full-screen quad with a live texture updated from the video source, drawn with an orthographic projection) and a virtual camera (projection matrix from tracker calibration) whose child transform has its transformation matrix updated from marker tracking in real time, with the 3D object attached beneath it.
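A compact sketch of that scene-graph layout in plain OpenSceneGraph follows. It uses standard OSG classes rather than the osgART wrappers, and the video-texture and tracker-pose wiring are left as stubs (assumptions for illustration only).

    // Sketch of the AR scene graph from the slide, in plain OpenSceneGraph.
    #include <osg/Group>
    #include <osg/MatrixTransform>
    #include <osg/Camera>
    #include <osg/Geode>
    #include <osg/StateSet>

    osg::ref_ptr<osg::Group> buildARScene(osg::Node* virtualObject)
    {
        osg::ref_ptr<osg::Group> root = new osg::Group;

        // 1. Video layer: full-screen quad with the live video texture (quad/texture omitted),
        //    drawn under an orthographic projection, before the 3D content.
        osg::ref_ptr<osg::Camera> videoLayer = new osg::Camera;
        videoLayer->setReferenceFrame(osg::Transform::ABSOLUTE_RF);
        videoLayer->setProjectionMatrixAsOrtho2D(0.0, 1.0, 0.0, 1.0);
        videoLayer->setViewMatrix(osg::Matrix::identity());
        videoLayer->setRenderOrder(osg::Camera::NESTED_RENDER, -1);  // background first
        videoLayer->setClearMask(0);
        videoLayer->getOrCreateStateSet()->setMode(GL_DEPTH_TEST, osg::StateAttribute::OFF);
        videoLayer->addChild(new osg::Geode);                        // would hold the textured quad
        root->addChild(videoLayer);

        // 2. Virtual camera: projection matrix taken from the tracker/camera calibration.
        osg::ref_ptr<osg::Camera> virtualCamera = new osg::Camera;
        virtualCamera->setReferenceFrame(osg::Transform::ABSOLUTE_RF);
        // virtualCamera->setProjectionMatrix(matrixFromCalibration);  // supplied by calibration

        // 3. Marker transform: matrix updated every frame from marker tracking,
        //    with the 3D object attached underneath it.
        osg::ref_ptr<osg::MatrixTransform> markerTransform = new osg::MatrixTransform;
        markerTransform->addChild(virtualObject);
        virtualCamera->addChild(markerTransform);
        root->addChild(virtualCamera);

        // Each frame: markerTransform->setMatrix(poseFromTracker());  // hypothetical tracker hook
        return root;
    }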
  • Billinghurst and Duh 128 – osgART Features. C++ (but also Python, Lua, etc.). Multiple video input support: direct (FireWire/USB camera), files, network, via ARvideo, PtGrey, CVCam, VideoWrapper, etc. Benefits of OpenSceneGraph: rendering engine, plug-ins, etc.
  • Billinghurst and Duh 129 – ARToolKit Family: ARToolKit, ARToolKit NFT, ARToolKit (Symbian), NyToolKit (Java, C#, Android, WM), JARToolKit (Java), FLARToolKit (Flash), FLARManager (Flash).
  • Billinghurst and Duh 130 – Why Browser-Based AR? High impact: high marketing value. Large potential install base: 1.6 billion web users. Ease of development: lots of developers, mature tools. Low cost of entry: browser, web camera.
  • Billinghurst and Duh 131 – AR Application Components: Adobe Flash, Papervision 3D, FLARToolKit.
  • Billinghurst and Duh 132 – FLARToolKit Example: Boffswana Living Sasquatch. In the first month: 100K unique visits; 500K page views; 6 minutes on page.
  • Billinghurst and Duh 133 – Low-Level Mobile AR Tools. Vuforia tracking library (Qualcomm): vuforia.com; iOS, Android; computer vision based tracking; marker tracking, 3D objects, frame markers. Integration with Unity: interaction, model loading, game logic.
  • Billinghurst and Duh 134 – Junaio (www.junaio.com)
  • Billinghurst and Duh 135 – Junaio Key Features. Content is provided in information channels: over 2,000 channels available. Two types of AR channels: GLUE channels – visual tracking; location-based channels – GPS and compass tracking. Simple-to-use interface with multiple views: list, map, AR (live) view. Point of Interest (POI) based: POIs are geo-located content.
  • Billinghurst and Duh 136
  • Billinghurst and Duh 137 – AREL: Augmented Reality Environment Language. Overcomes the limitations of XML by itself; based on web technologies: XML, HTML5, JavaScript. Core components: 1. AREL XML – a static file that specifies scene content; 2. AREL JavaScript – handles all interaction and animation (any user interaction sends an event to the AREL JS); 3. AREL HTML5 – GUI elements: buttons, icons, etc. Advantages: scripting on the device, more functionality, GUI customization.
  • Billinghurst and Duh 138
  • Billinghurst and Duh 139
  • Billinghurst and Duh 140
  • Billinghurst and Duh 141 – Result
  • Billinghurst and Duh 142 – BirdsView. Location-based CMS: add content, publish to Layar or Junaio. http://www.birdsview.de/
  • Billinghurst and Duh 143 – BirdsView on Junaio
  • Billinghurst and Duh 144 – BirdsView on Junaio
  • Billinghurst and Duh 145 – BuildAR (http://www.buildar.co.nz/). Stand-alone application. Visual interface for an AR model viewing application. Enables non-programmers to build AR scenes.
  • Billinghurst and Duh 146 – Metaio Creator. Drag-and-drop Junaio authoring.
  • Billinghurst and Duh 147 – Total Immersion D'Fusion Studio. Complete commercial authoring platform (http://www.t-immersion.com/): multi-platform; markerless tracking; scripting; face tracking; finger tracking; Kinect support.
  • Billinghurst and Duh 148 – Others. AR-Media (http://www.inglobetechnologies.com/): Google SketchUp plug-in. LinceoVR (http://linceovr.seac02.it/): AR/VR authoring package. Libraries: JARToolKit, MXRToolKit, ARLib, Goblin XNA.
  • Billinghurst and Duh 149 – Research in AR Authoring. iaTAR (Lee 2004): immersive AR authoring; using real objects to create AR applications.
  • Billinghurst and Duh 150 – Rapid Prototyping. Speed up development time by using quick hardware mock-ups: a handheld device connected to a PC; LCD screen; USB phone keypad; camera.
  • Billinghurst and Duh 151 – Build Your Own Google Glass. Rapid prototype of a Glass-like HMD: Myvu HMD + headphones + iOS device + basic glue skills; $300 and less than 3 hours of construction. http://www.instructables.com/id/DIY-Google-Glasses-AKA-the-Beady-i/
  • Billinghurst and Duh 152
  • Billinghurst and Duh 153 – Bunratty Folk Park. Irish visitor attraction run by Shannon Heritage. 19th-century life is recreated: buildings from the mid-west have been relocated to the 26-acre grounds surrounding Bunratty Castle, and 30 buildings are set in a rural or village setting there.
  • Billinghurst and Duh 154 – Augmented Reality in Bunratty Folk Park. Allows the visitor to point a camera at an exhibit; the device recognises it by its location and layers digital information onto the display. 3-dimensional virtual objects can be positioned alongside real ones on display. This leads to a dynamic combination of a live camera view and information.
  • Billinghurst and Duh 155 – Iterative Design Process: Prototyping and User Testing. Low-fidelity prototyping: sketches; paper prototyping; Post-it prototyping; PowerPoint prototyping. High-fidelity prototyping: Wikitude.
  • Billinghurst and Duh 156 – Storyboarding
  • Billinghurst and Duh 157 – Initial Sketches. Pros: good for idea generation; cheap; concepts seem feasible. Cons: not much useful feedback gained; Photoshop not fast enough for making changes.
  • Billinghurst and Duh 158 – Post-it Note Prototyping. Camera view with 3D annotation: selection highlighted in blue; home button added for easy navigation to the main menu.
  • Billinghurst and Duh 159 – PowerPoint Prototyping. Benefits: used for user testing; interactive; functionalities work; quick; easy arrangement of slides. User testing: participants recruited; 15-minute sessions screen-captured; 'talk aloud' technique used; notes taken; post-session interview.
  • Billinghurst and Duh 160 – Wikitude Prototype. User testing: application well received; understandable; participants were playful with the technology.
  • Billinghurst and Duh 161 – Final Video Prototype. A flexible tool for capturing the use of an interface. An elaborate simulation of how the navigational aid will work. Does not need to be realistic in every detail. Gives a good idea of how the finished system will work.
  • Billinghurst and Duh 162 – AR Evaluation Methods
  • Billinghurst and Duh 163 – The Interaction Design Process
  • Billinghurst and Duh 164 – Why Evaluate AR Applications? To test and compare interfaces, new technologies and interaction techniques. To validate the efficiency and effectiveness of the AR interface and system. To test usability (learnability, efficiency, satisfaction, ...). To get user feedback. To refine the interface design. To better understand your end users. ...
  • Billinghurst and Duh 165 – Survey of AR Papers. Edward Swan (2005) surveyed major conferences/journals (1992-2004): Presence, ISMAR, ISWC, IEEE VR. Summary: 1104 total papers; 266 AR papers; 38 AR HCI papers (interaction); 21 AR user studies. Only 21 of 266 AR papers had a formal user study – less than 8% of all AR papers.
  • Billinghurst and Duh 166 – HIT Lab NZ Usability Survey: A Survey of Evaluation Techniques Used in Augmented Reality Studies. Andreas Dünser, Raphaël Grasset and Mark Billinghurst reviewed publications from 1993 to 2007, extracted 6071 papers which mentioned "Augmented Reality", and searched to find 165 AR papers with user studies.
  • Billinghurst and Duh 167
  • Billinghurst and Duh 168
  • Billinghurst and Duh 169 – Types of Experimental Measures Used: objective measures; subjective measures; qualitative analysis; usability evaluation techniques; informal evaluations.
  • Billinghurst and Duh 170 – Types of Experimental Measures Used
  • Billinghurst and Duh 171 – Types of Experiments and Topics. Sensation, perception & cognition: How is virtual content perceived? What perceptual cues are most important? How should augmented/overlaid information be visualized on the real environment? Visual search/attention/salience issues of human performance. Interaction: How can users interact with virtual content? Which interaction techniques are most efficient in a certain context? Collaboration & social issues: How is collaboration in an AR interface different? Which collaborative cues can be conveyed best? Privacy and security issues of AR interfaces.
  • Billinghurst and Duh 172 – Types of AR User Studies
  • Billinghurst and Duh 173 – Summary. Over the last 10 years: most user studies focused on user performance; the fewest user studies were on collaboration (mobile AR was not popular before 2009); objective performance measures were used most; qualitative and usability measures were used least.
  • Billinghurst and Duh 174 – Sample Size: the more the better. For quantitative analysis, a rule of thumb is approximately 15-20 or more participants (for cognitive and lab-type experiments), with an absolute minimum of 8-10 per cell. The ideal sample size can be calculated with a power analysis: power (1 - beta) is the chance of rejecting the null hypothesis when the null hypothesis is false, i.e. the probability of observing a difference when it really exists; power increases with sample size and decreases with variance. Large effects can be detected with smaller samples (e.g. discriminating the mean speed of turtles from that of rabbits).
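For reference, the textbook approximation behind such a power analysis, for comparing two group means with common standard deviation sigma, significance level alpha, power 1 - beta and minimum detectable difference Delta, is (this formula is a standard addition for context, not taken from the slides):

    n_{\text{per group}} \;\approx\; \frac{2\,\sigma^{2}\,\left(z_{1-\alpha/2} + z_{1-\beta}\right)^{2}}{\Delta^{2}}

With alpha = 0.05, power = 0.8 and an effect of one standard deviation (Delta = sigma), this gives roughly 2(1.96 + 0.84)^2, i.e. about 16 participants per group, consistent with the 15-20 rule of thumb above.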
  • Billinghurst and Duh 175 – Data Collection and Analysis. The choice of method depends on the type of data that needs to be collected. In order to test a hypothesis, the data has to be analysed using a statistical method, and the choice of statistical method depends on the type of data collected. All the decisions about an experiment should be made before it is carried out.
  • Billinghurst and Duh 176 – Observe and Measure. Observations are gathered manually (human observers) or automatically (computers, software, cameras, sensors, etc.). A measurement is a recorded observation. Metrics can be objective or subjective.
  • Billinghurst and Duh 177 – Typical Objective Metrics: task completion time; errors (number, percent, ...); percent of task completed; ratio of successes to failures; number of repetitions; number of commands used; number of failed commands; physiological data (heart rate, ...); ...
  • Billinghurst and Duh 178 – Typical Subjective Metrics: user satisfaction; subjective performance ratings; ease of use; intuitiveness; judgments; ...
  • Billinghurst and Duh 179 – Data Types. Subjective: subjective surveys – Likert scales, condition rankings; observations – think aloud; interview responses. Objective: performance measures – time, accuracy, errors; process measures – video/audio analysis. Example Likert item: "How easy was the task?" 1 (not very easy) to 5 (very easy).
  • Billinghurst and Duh 180 – Experimental Measures (measure – what does it tell us? – how is it measured?): Timings – performance – via a stopwatch, or automatically by the device. Errors – performance, particular sticking points in a task – by success in completing the task correctly; through experimenter observation, e.g. examining the route walked. Perceived workload – effort invested, user satisfaction – through NASA TLX scales and other questionnaires. Distance traveled and route taken – depending on the application, these can be used to pinpoint errors and to indicate performance – using a pedometer, GPS or other location-sensing system, or by experimenter observation. Percentage preferred walking speed – performance – by finding the average walking speed and comparing it with normal walking speed. Comfort – user satisfaction, device acceptability – Comfort Rating Scale and other questionnaires. User comments and preferences – user satisfaction and preferences, particular sticking points in a task – through questionnaires, interviews and think-alouds. Experimenter observations – different aspects, depending on the experimenter and on the observations – through observation and note-taking.
  • Billinghurst and Duh 181 – Statistical Analysis. Once data is collected, statistics can be used for analysis. Typical statistical techniques. Comparing two results: unpaired t-test (for between-subjects designs; assumes a normal distribution); paired t-test (for within-subjects designs; assumes a normal distribution); Mann-Whitney U (independent samples). Comparing more than two results: analysis of variance (ANOVA), followed by post-hoc analysis such as the Bonferroni test; Kruskal-Wallis (does not assume a normal distribution).
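To make the between-subjects comparison concrete, here is a self-contained sketch of the unpaired (two-sample, pooled-variance) t statistic. The completion-time numbers are made up for illustration, and in practice a statistics package would also report the p-value.

    // Sketch: unpaired Student's t statistic with pooled variance (between-subjects comparison).
    #include <cmath>
    #include <cstdio>
    #include <vector>

    double unpairedT(const std::vector<double>& a, const std::vector<double>& b) {
        auto mean = [](const std::vector<double>& x) {
            double s = 0.0;
            for (double v : x) s += v;
            return s / x.size();
        };
        auto var = [](const std::vector<double>& x, double m) {
            double s = 0.0;
            for (double v : x) s += (v - m) * (v - m);
            return s / (x.size() - 1);                      // sample variance
        };
        double ma = mean(a), mb = mean(b);
        double na = a.size(), nb = b.size();
        // pooled variance estimate across the two groups
        double sp2 = ((na - 1) * var(a, ma) + (nb - 1) * var(b, mb)) / (na + nb - 2);
        return (ma - mb) / std::sqrt(sp2 * (1.0 / na + 1.0 / nb));
    }

    int main() {
        // hypothetical task completion times (seconds) for two interface conditions
        std::vector<double> conditionA = {15.2, 14.8, 16.1, 15.9, 14.5};
        std::vector<double> conditionB = {12.1, 13.0, 11.8, 12.6, 12.9};
        double t = unpairedT(conditionA, conditionB);
        std::printf("t = %.3f (df = %zu)\n", t, conditionA.size() + conditionB.size() - 2);
        return 0;
    }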
  • Billinghurst and Duh 182 – Case Study: A Wearable Information Space. Head stabilized vs. body stabilized. An AR interface provides spatial audio and visual cues. Does a spatial interface aid performance (task time / accuracy)? M. Billinghurst, J. Bowskill, N. Dyer, J. Morphett (1998). An Evaluation of Wearable Information Spaces. Proc. Virtual Reality Annual International Symposium.
  • Billinghurst and Duh 183 – Task Performance. Task: find target icons on 8 pages; remember the information space. Conditions: A – head-stabilized pages; B – cylindrical display with trackball; C – cylindrical display with head tracking. Subjects: within-subjects design (needs fewer subjects); 12 subjects used.
  • Billinghurst and Duh 184 – Experimental Measures. Objective: spatial ability (pre-test); time to perform the task; information recall; workload (NASA TLX). Subjective: post-experiment survey – rank conditions (forced choice); Likert-scale questions, e.g. "How intuitive was the interface to use?" Many different measures.
  • Billinghurst and Duh 185 – Post-Experiment Survey. For each of the conditions please answer: 1) How easy was it to find the target? 1-7 (1 = not very easy, 7 = very easy), for the head-stabilised condition (A), the cylindrical condition with mouse input (B), and the head-tracked condition (C). Then rank all the conditions in order on a scale of one to three: 1) Which condition was easiest for finding the target? (1 = easiest, 3 = hardest) A: B: C:
  • Billinghurst and Duh 186 – Results. Body stabilization improved performance: search times were significantly faster (one-factor ANOVA). Head tracking improved information recall: no difference between the trackball and stack cases. Head tracking involved more physical work.
  • Billinghurst and Duh 187 – Subjective Impressions. Subjects felt the spatialized conditions were (ANOVA): more enjoyable; easier for finding the target.
  • Billinghurst and Duh 188 – Subjective Impressions. Subject rankings (Kruskal-Wallis): spatialized conditions were easier to use than head stabilized; body stabilized gave better understanding; head tracking was most intuitive.
  • Billinghurst and Duh 189 – AR Evaluation. Field, field, field: field studies vs. lab studies; contextual design and evaluation. Combine methods (qualitative and quantitative studies): the weaknesses of each method should be considered. New or modified evaluation methods may need to be developed. Seek out more new evaluation case studies in AR.
  • Billinghurst and Duh 190 – AR Design Case Study
  • Billinghurst and Duh 191 – "The Jackson Plan": an educational location-based handheld AR game. Learning while travelling. Mobile AR entertainment for children.
  • Billinghurst and Duh 192 – The Jackson Plan: Overview. 'The Jackson Plan' is an educational discovery mobile augmented reality game set on the historical urban plan of the same name (also known as the "Plan of the Town of Singapore"). Using the multi-modality features of an Apple iPad 2, players collaboratively experience this location-based mobile augmented reality game around several important historical sites and events that revolve around Sir Thomas Stamford Raffles and his founding of the island of Singapore in 1819. The Jackson Plan, 1822, is on display at the Singapore History Gallery, National Museum of Singapore.
  • Billinghurst and Duh 193 – Learning Content: learning goals/objectives for the Jackson Plan unit. Knowledge: 1. to acquire a better understanding of the key developments around Raffles's arrival, the early settlers and Raffles's town plan. Skills: 1. to explain the reasons for the founding of Singapore (1819); 2. to explain the importance of trade to Singapore; 3. to describe the contributions of key personalities and immigrants to the growth and development of Singapore. Values & attitudes: 1. to develop an interest in the past; 2. to appreciate cultural heritage, and to instill a sense of courage, diligence and perseverance. History Syllabus for Lower Secondary, Year of Implementation: 2006. ISBN 981-05-1669-X. Source: Curriculum Planning and Development Division, Ministry of Education, Singapore.
  • Billinghurst and Duh 194 – Considerations. How can a new technology support new learning experiences in cultural heritage? Interdisciplinary research (design, technology, education and learning). System building: a single application, or recognition in each field. Real deployment in schools.
  • Billinghurst and Duh 195 – Theoretical Framework. "Situated cognition via scaffolding mechanisms (Vygotsky, 1978)". Distinct HAR technology pairings available in a game (0 = no, 1 = yes) result in four possible eHAR game types and play styles, each with an implementation process. Y.-N. Chang, R. K. C. Koh, and H. B.-L. Duh, "Handheld AR games – A triarchic conceptual design framework," in Mixed and Augmented Reality – Arts, Media, and Humanities (ISMAR-AMH), 2011 IEEE International Symposium on, Basel, Switzerland, 2011, pp. 29-36. Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.
  • Billinghurst and Duh 196 – Theoretical Framework: triarchic conceptual design framework. GPS navigation: location-based implementation for cultural & historical (contextual) explorations. Overlaying options: 'binoculars' metaphor (i.e. panoramic map). Virtual properties (game inventory); geo-tagging (diary); blended mini-games (i.e. puzzles); tasks may exploit the platform's hardware features (GPS, accelerometer). Visual identification of past and present imagery; history comes to life by exploiting location-dependent contexts. Backend confirmation with server connectivity ('Wizard of Oz' possibility) for dynamic situational exchanges, i.e. messages, images, induced player behaviours, etc. Promote contextual inquiry & collaboration (learning strategies).
  • Billinghurst and Duh 197 – The Jackson Plan. Textbook: Singapore: From Settlement to Nation – Pre-1819 to 1971 (Marshall Cavendish Education). Theme: Chapter 3 – What Part Did the Different Immigrant Communities Play in Singapore's Development? Prior knowledge: the settlement of Singapore – why Raffles chose Singapore: central location, excellent port, good supply of drinking water, and the Dutch had not occupied the island. Immigrants: why immigrants came. Singapore's town plan: Lieutenant Philip Jackson, 1822; improve the haphazard building plan; segregated population groups. Populations and their trading goods/roles: Chinese – coolies, Samsui women; Indians – labourers, coolies; Malays – shipbuilders; Europeans – merchants; Arabs – traders.
  • Ideation: use of historical illustrations/images in situated augmented views. Panoramic/still visual imagery of the past + GPS.
  • Billinghurst and Duh 199 – Game Features, Mechanisms & Platform: Ideations. 'Civic District Trail' – a tourist's DIY exploration experience promoted by the Singapore Tourism Board.
  • Billinghurst and Duh 200 – Game Features, Mechanisms & Platform. Virtual & physical interaction; manipulating knowledge – collecting trading materials (i.e. spices); geo-tagging the "right" locations; taking pictures (Wizard of Oz); blended casual mini-games with physical interaction and collaboration.
  • Billinghurst and Duh 201 – Game Features, Mechanisms & Platform
  • Billinghurst and Duh 202 – Game Features, Mechanisms & Platform. "Plan of the Town of Singapore" by Lieutenant Phillip Jackson, 1822: Chinese – Chinatown; Indians; Europeans & rich Asians; Malays & Muslims; Commercial Square.
  • Billinghurst and Duh 203 – The Jackson Plan: Planned Gaming Activities. Phase 1 – understanding of activities: to understand the gameplay and manipulation of the iPad 2 devices; introduction to gameplay, game introduction / mission briefing (15 min). Phase 2 – constructing knowledge: to understand the background of the Singapore settlement; information collection – know who these immigrants are (15 min). Phase 3 – mastering: to analyze how the immigrants contributed to Singapore as a trading centre; experience the entrepot trade (20 min). Phase 4 – knowledge application: to make comparisons and organize information on the different contributions of immigrants; make an accusation based on evidence (gain summative feedback) (15 min).
  • Billinghurst and Duh 204 – The Trail
  • Billinghurst and Duh 205 – Game Design
  • Billinghurst and Duh 206 – Game Design
  • Billinghurst and Duh 207 – The Jackson Plan: Features
  • Billinghurst and Duh 208 – Evaluation. 72 students (36 pairs) took part in the evaluation: Secondary One classes (~12-13 years old), equally divided into 2 main groups, location-based AR and digital book versions. Digital book: Apple iPad 2; collaboration: yes; interaction type: non-AR; play space: indoors. Location-based AR: Apple iPad 2; collaboration: yes; interaction type: location-based AR; play space: outdoors.
  • Billinghurst and Duh 209 – Evaluation: the structure of knowledge.
  • Billinghurst and Duh 210 – The Jackson Plan
  • Billinghurst and Duh 211 – The Jackson Plan
  • Billinghurst and Duh 212 – The Jackson Plan
  • Billinghurst and Duh 213 – The Jackson Plan
  • Billinghurst and Duh 214 – The Jackson Plan
  • Billinghurst and Duh 215 – The Jackson Plan
  • Theory into Practice: Domain-Centric Handheld Augmented Reality Game Design. Study 3 – co-creativity fusions in interdisciplinary AR game developments.
  • Billinghurst and Duh 217 – AR Research Directions
  • Billinghurst and Duh 218 – Vision of AR
  • Billinghurst and Duh 219 – To Make the Vision Real... Hardware/software requirements: contact lens displays; free-space hand/body tracking; speech/gesture recognition; etc. Most importantly: usability.
  • Billinghurst and Duh 220 – Natural Interaction. Automatically detecting the real environment: environmental awareness; physically-based interaction. Gesture input: free-hand interaction. Multimodal input: speech and gesture interaction; implicit rather than explicit interaction.
  • Environmental Awareness
  • Billinghurst and Duh 222 – AR MicroMachines. An AR experience with environment awareness and physically-based interaction, based on the MS Kinect RGB-D sensor. The augmented environment supports occlusion, shadows, and physically-based interaction between real and virtual objects.
  • Billinghurst and Duh 223 – Operating Environment
  • Billinghurst and Duh 224 – Architecture. Our framework uses five libraries: OpenNI; OpenCV; OPIRA; Bullet Physics; OpenSceneGraph.
  • Billinghurst and Duh 225 – System Flow. The system flow consists of three sections: image processing and marker tracking; physics simulation; rendering.
  • Billinghurst and Duh 226 – Physics Simulation. Create a virtual mesh over the real world, updated at 10 fps so real objects can be moved. The mesh is used by the physics engine for collision detection (virtual/real) and by OpenSceneGraph for occlusion and shadows.
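As a rough illustration of this physics step, the sketch below builds a static Bullet collision mesh from depth-derived triangles and steps the simulation. The triangle source and world parameters are assumptions for illustration; the actual AR MicroMachines implementation is not shown in the slides.

    // Sketch: static Bullet collision mesh for the real-world surface + per-frame world step.
    #include <btBulletDynamicsCommon.h>
    #include <vector>

    struct Tri { btVector3 a, b, c; };   // one triangle of the depth-derived mesh (assumed input)

    btDiscreteDynamicsWorld* createWorld() {
        auto* config     = new btDefaultCollisionConfiguration();
        auto* dispatcher = new btCollisionDispatcher(config);
        auto* broadphase = new btDbvtBroadphase();
        auto* solver     = new btSequentialImpulseConstraintSolver();
        auto* world = new btDiscreteDynamicsWorld(dispatcher, broadphase, solver, config);
        world->setGravity(btVector3(0, -9.8f, 0));
        return world;
    }

    // Rebuild the "real world" collision shape from the latest depth mesh (~10 fps on the slide).
    btRigidBody* addRealWorldMesh(btDiscreteDynamicsWorld* world, const std::vector<Tri>& tris) {
        auto* meshData = new btTriangleMesh();
        for (const Tri& t : tris)
            meshData->addTriangle(t.a, t.b, t.c);
        auto* shape = new btBvhTriangleMeshShape(meshData, true);

        // Mass 0 => static body: virtual objects collide with it, but it never moves.
        btRigidBody::btRigidBodyConstructionInfo info(
            0.0f, new btDefaultMotionState(), shape, btVector3(0, 0, 0));
        auto* body = new btRigidBody(info);
        world->addRigidBody(body);
        return body;
    }

    // Per frame: advance the simulation so virtual objects react to the real surface.
    void stepPhysics(btDiscreteDynamicsWorld* world, float dtSeconds) {
        world->stepSimulation(dtSeconds, 10);
    }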
  • Billinghurst and Duh 227 – Rendering: occlusion, shadows.
  • Gesture Input
  • Billinghurst and Duh 229 – Architecture: HIT Lab NZ's gesture library. 1. Hardware interface; 2. Segmentation; 3. Classification/tracking; 4. Modeling – hand recognition/modeling, rigid-body modeling; 5. Gesture – static gestures, dynamic gestures, context-based gestures.
  • Billinghurst and Duh 230 – Architecture: 1. Hardware Interface. Supports PCL, OpenNI, OpenCV, and the Kinect SDK. Provides access to depth, RGB and XYZRGB data. Usage: capturing colour images, depth images and concatenated point clouds from a single camera or multiple cameras. For example: Kinect for Xbox 360, Kinect for Windows, Asus Xtion Pro Live.
  • Billinghurst and Duh 231 – Architecture: 2. Segmentation. Segments images and point clouds based on colour, depth and space. Usage: segmenting images or point clouds using colour models, depth, or spatial properties such as location, shape and size. For example: skin colour segmentation, depth thresholding.
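The two example segmenters named on this slide can be sketched with standard OpenCV calls. The skin-colour bounds and depth range below are common illustrative values, not parameters from the gesture library itself.

    // Sketch: skin-colour and depth-threshold segmentation with OpenCV.
    #include <opencv2/opencv.hpp>

    // Skin-colour segmentation: threshold the Cr/Cb chroma channels.
    // The numeric bounds are a commonly used rule of thumb, not tuned values.
    cv::Mat skinMask(const cv::Mat& bgr) {
        cv::Mat ycrcb, mask;
        cv::cvtColor(bgr, ycrcb, cv::COLOR_BGR2YCrCb);
        cv::inRange(ycrcb, cv::Scalar(0, 133, 77), cv::Scalar(255, 173, 127), mask);
        cv::medianBlur(mask, mask, 5);          // remove speckle noise
        return mask;
    }

    // Depth threshold: keep only pixels within an interaction volume in front of the sensor.
    // depth16 is a 16-bit depth image in millimetres (e.g. from OpenNI / the Kinect SDK).
    cv::Mat depthMask(const cv::Mat& depth16, int nearMm = 400, int farMm = 1000) {
        cv::Mat mask;
        cv::inRange(depth16, cv::Scalar(nearMm), cv::Scalar(farMm), mask);
        return mask;
    }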
  • Billinghurst and Duh 232Architecture5. Gesture•  Static Gestures•  Dynamic Gestures•  Context based Gestures4. Modeling•  Hand recognition/modeling•  Rigid-body modeling3. Classification/Tracking2. Segmentation1. Hardware Interfaceo  Identify and track objectsbetween frames based onXYZRGB.o  Usage: Identifying currentposition/orientation of thetracked object in space.o  For example:HITLabNZ’s Gesture LibraryTraining set of handposes, colors representunique regions of thehand.Raw output (without-cleaning) classified onreal hand input (depthimage).
  • Billinghurst and Duh 233Architecture5. Gesture•  Static Gestures•  Dynamic Gestures•  Context based Gestures4. Modeling•  Hand recognition/modeling•  Rigid-body modeling3. Classification/Tracking2. Segmentation1. Hardware Interfaceo  Hand Recognition/Modeling  Skeleton based (for low resolutionapproximation)  Model based (for more accuraterepresentation)o  Object Modeling (identification and trackingrigid-body objects)o  Physical Modeling (physical interaction)  Sphere Proxy  Model based  Mesh basedo  Usage: For general spatial interaction in AR/VRenvironmentHITLabNZ’s Gesture Library
  • Billinghurst and Duh 234 Method
    Represent models as collections of spheres moving with the models in the Bullet physics engine
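A minimal sketch of this sphere-proxy idea is shown below, assuming a hypothetical SphereProxy list produced by some model-fitting step. It only illustrates how a compound of Bullet spheres could stand in for a tracked model; it is not the course authors' implementation.

```cpp
// Approximate a tracked rigid model as a compound of spheres so Bullet can
// resolve collisions cheaply.
#include <btBulletDynamicsCommon.h>
#include <vector>

struct SphereProxy { btVector3 centre; float radius; };   // assumed fitting output

btCompoundShape* makeSphereProxyShape(const std::vector<SphereProxy>& spheres) {
    btCompoundShape* compound = new btCompoundShape();
    for (const SphereProxy& s : spheres) {
        btTransform local;                 // sphere offset inside the model frame
        local.setIdentity();
        local.setOrigin(s.centre);
        compound->addChildShape(local, new btSphereShape(s.radius));
    }
    return compound;
}

// Each frame the proxy body would be moved to follow the tracked model, e.g.:
//   btTransform pose = trackedModelPose();   // hypothetical tracker output
//   proxyBody->setWorldTransform(pose);
//   proxyBody->setActivationState(DISABLE_DEACTIVATION);
```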
  • Billinghurst and Duh 235 Method
    Render the AR scene with OpenSceneGraph, using the depth map for occlusion
    Shadows yet to be implemented
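Depth-map occlusion in OpenSceneGraph is commonly done with a "phantom" occluder: the real-world depth mesh is drawn into the depth buffer only, before the virtual content, so real objects hide virtual ones. The sketch below shows one way to set that up; it assumes the depth mesh already exists as a scene-graph node and is not necessarily how AR MicroMachines renders.

```cpp
// Make a node write depth but no color, and draw it before other geometry.
#include <osg/ColorMask>
#include <osg/Node>
#include <osg/StateSet>

void makePhantomOccluder(osg::Node* depthMeshNode) {
    osg::StateSet* ss = depthMeshNode->getOrCreateStateSet();
    // Invisible in color, but still fills the depth buffer.
    ss->setAttributeAndModes(new osg::ColorMask(false, false, false, false),
                             osg::StateAttribute::ON);
    // Render in an earlier bin so its depth values are in place first.
    ss->setRenderBinDetails(-1, "RenderBin");
}
```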
  • Billinghurst and Duh 236 Results
  • Billinghurst and Duh 237 Architecture, layer 5: Gesture (HITLabNZ’s Gesture Library)
    o  Static (hand pose recognition)
    o  Dynamic (meaningful movement recognition)
    o  Context-based gesture recognition (gestures with context, e.g. pointing)
    o  Usage: issuing commands, anticipating user intention and high-level interaction.
  • Multimodal Interaction
  • Billinghurst and Duh 239 Multimodal Interaction
    Combined speech and gesture input
    Gesture and speech are complementary
    •  Speech
       –  modal commands, quantities
    •  Gesture
       –  selection, motion, qualities
    Previous work found multimodal interfaces intuitive for 2D/3D graphics interaction
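A toy example of this division of labour, assuming hypothetical GestureEvent/SpeechEvent types, is sketched below: speech supplies the command, the most recent pointing gesture supplies the referent, and the two are paired only if they arrive within a short time window. Real multimodal fusion engines are considerably more sophisticated.

```cpp
// Late fusion of one speech event with one gesture event (illustrative only).
#include <cmath>
#include <optional>
#include <string>
#include <utility>

struct GestureEvent { int targetObjectId; double timeSec; };   // hypothetical
struct SpeechEvent  { std::string command; double timeSec; };  // hypothetical

std::optional<std::pair<int, std::string>>
fuse(const GestureEvent& g, const SpeechEvent& s, double windowSec = 1.5) {
    // Speech supplies the command/quantity, gesture supplies the referent.
    if (std::abs(g.timeSec - s.timeSec) <= windowSec)
        return std::make_pair(g.targetObjectId, s.command);
    return std::nullopt;   // events too far apart to be related
}
```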
  • Billinghurst and Duh 240 Free Hand Multimodal Input
    Use free hand to interact with AR content
    Recognize simple gestures
    No marker tracking
    Gestures: Point, Move, Pick/Drop
  • Billinghurst and Duh 241 Multimodal Architecture
  • Billinghurst and Duh 242 Multimodal Fusion
  • Billinghurst and Duh 243 Hand Occlusion
  • Billinghurst and Duh 244 User Evaluation
    Change object shape, colour and position
    Conditions
    •  Speech only, gesture only, multimodal
    Measure
    •  performance time, error, subjective survey
  • Billinghurst and Duh 245 Experimental Setup: change object shape and colour
  • Billinghurst and Duh 246 Results
    Average performance time (MMI, speech fastest)
    •  Gesture: 15.44 s
    •  Speech: 12.38 s
    •  Multimodal: 11.78 s
    No difference in user errors
    User subjective survey
    •  Q1: How natural was it to manipulate the object?
       –  MMI, speech significantly better
    •  70% preferred MMI, 25% speech only, 5% gesture only
  • Future Directions
  • Billinghurst and Duh 248 Natural Gesture Interaction on Mobile
    Use mobile camera for hand tracking
    •  Fingertip detection
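One common fingertip-detection recipe, shown below as a rough OpenCV sketch, finds the largest contour in a segmented hand mask and treats its convex-hull points as fingertip candidates. This is an illustrative approach; the mobile system referred to above may use a different detector entirely.

```cpp
// Candidate fingertips = convex-hull points of the largest contour in a hand mask.
#include <opencv2/opencv.hpp>
#include <vector>

std::vector<cv::Point> detectFingertips(const cv::Mat& handMask) {
    cv::Mat work = handMask.clone();                 // findContours may modify input
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(work, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

    std::vector<cv::Point> tips;
    if (contours.empty()) return tips;

    // Take the largest contour as the hand.
    size_t best = 0;
    for (size_t i = 1; i < contours.size(); ++i)
        if (cv::contourArea(contours[i]) > cv::contourArea(contours[best]))
            best = i;

    // Convex-hull vertices of the hand contour are fingertip candidates.
    std::vector<int> hullIdx;
    cv::convexHull(contours[best], hullIdx, false, false);
    for (int idx : hullIdx)
        tips.push_back(contours[best][idx]);
    return tips;   // a real detector would still cluster and filter these points
}
```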
  • Billinghurst and Duh 249 Evaluation
    Gesture input more than twice as slow as touch
    No difference in naturalness
  • Billinghurst and Duh 250 Intelligent Interfaces
    Most AR systems are stupid
    •  Don’t recognize user behaviour
    •  Don’t provide feedback
    •  Don’t adapt to user
    Especially important for training
    •  Scaffolded learning
    •  Moving beyond check-lists of actions
  • Billinghurst and Duh 251 Intelligent Interfaces
    AR interface + intelligent tutoring system
    •  ASPIRE constraint-based system (from UC)
    •  Constraints
       –  relevance condition, satisfaction condition, feedback
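To make the constraint idea concrete, the sketch below models a constraint as a relevance predicate, a satisfaction predicate and a feedback message, and reports feedback for constraints that are relevant but violated. The types and evaluate() function are hypothetical illustrations, not ASPIRE's actual API.

```cpp
// Constraint-based tutoring, reduced to its bare shape (illustrative only).
#include <functional>
#include <string>
#include <vector>

struct TaskState {
    std::vector<std::string> observedActions;   // hypothetical task model
};

struct Constraint {
    std::function<bool(const TaskState&)> relevance;     // does this constraint apply?
    std::function<bool(const TaskState&)> satisfaction;  // is it met?
    std::string feedback;                                // shown when violated
};

std::vector<std::string> evaluate(const std::vector<Constraint>& constraints,
                                  const TaskState& state) {
    std::vector<std::string> messages;
    for (const Constraint& c : constraints)
        if (c.relevance(state) && !c.satisfaction(state))
            messages.push_back(c.feedback);              // corrective feedback
    return messages;
}
```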
  • Billinghurst and Duh 252 Domain Ontology
  • Billinghurst and Duh 253 Intelligent Feedback
    Actively monitors user behaviour
    •  Implicit vs. explicit interaction
    Provides corrective feedback
  • Billinghurst and Duh 255 Evaluation Results
    16 subjects, with and without ITS
    Improved task completion
    Improved learning
  • Billinghurst and Duh 256 Intelligent Agents
    AR characters
    •  Virtual embodiment of system
    •  Multimodal input/output
    Examples
    •  AR Lego, Welbo, etc.
    •  Mr Virtuoso
       –  AR character more real, more fun
       –  On-screen 3D and AR similar in usefulness
  • Billinghurst and Duh 257 Contact Lens Display
    Babak Parviz
    •  University of Washington
    MEMS components
    •  Transparent elements
    •  Micro-sensors
    Challenges
    •  Miniaturization
    •  Assembly
    •  Eye safety
  • Billinghurst and Duh 258 Contact Lens Prototype
  • Billinghurst and Duh 259 Conclusion
  • Billinghurst and Duh 260 Conclusion
    There is a need for better designed AR experiences
    Through
    •  use of Interaction Design principles
    •  understanding of the technology
    •  use of rapid prototyping tools
    •  rigorous user evaluation
    There are a number of important areas for future research
    •  Natural interaction
    •  Multimodal interfaces
    •  Intelligent agents
    •  Novel displays
  • Billinghurst and Duh 261 More Information
    •  Mark Billinghurst
       –  mark.billinghurst@hitlabnz.org
    •  Websites
       –  www.hitlabnz.org
    •  Henry Duh
       –  hblduh@gmail.com
  • Billinghurst and Duh 262 Resources
  • Billinghurst and Duh 263 Websites
    Meta List of AR SDKs
    •  http://www.icg.tugraz.at/Members/gerhard/augmented-reality-sdks
    ARToolKit Software Download
    •  http://artoolkit.sourceforge.net/
    ARToolKit Documentation
    •  http://www.hitl.washington.edu/artoolkit/
    ARToolKit Forum
    •  https://www.artoolworks.com/community/forum/
    ARToolworks Inc
    •  http://www.artoolworks.com/
  • Billinghurst and Duh 264 ARToolKit Plus•  http://studierstube.icg.tu-graz.ac.at/handheld_ar/artoolkitplus.php osgART•  http://www.osgart.org/ FLARToolKit•  http://www.libspark.org/wiki/saqoosha/FLARToolKit/ FLARManager•  http://words.transmote.com/wp/flarmanager/
  • Billinghurst and Duh 266 AR Labs
    Europe
    •  TU Graz, Cambridge U, TU Munich, Fraunhofer IGD
    USA
    •  Columbia U, Georgia Tech, USC
    Asia
    •  KIST, KAIST
    •  AIST, Kyoto U, NAIST, U of Tsukuba
    •  NUS, UniSA, HITLab NZ
    Companies
    •  Qualcomm, Nokia, Layar, Wikitude, Metaio
  • Billinghurst and Duh 267 Books
    Interactive Environments with Open-Source Software: 3D Walkthroughs and Augmented Reality for Architects with Blender 2.43, DART 3.0 and ARToolKit 2.72 by Wolfgang Höhl
    A Hitchhiker’s Guide to Virtual Reality by Karen McMenemy and Stuart Ferguson
    Bimber and Raskar, Spatial Augmented Reality (2005)
  • Billinghurst and Duh 268 BooksMobile Interaction DesignMatt Jones and Gary MarsdenDesigning for Small ScreensStudio 7.5Handheld UsabilityScott WeissDesigning the Mobile User ExperienceBarbara Ballard
  • Billinghurst and Duh 269 Publication Venues
    Conferences
    •  IEEE/ACM International Symposium on Mixed and Augmented Reality (ISMAR) (ismar.net)
    •  IEEE Virtual Reality (IEEE VR)
    •  Korean-Japan Mixed Reality Workshop (KJMR)
    Journals
    •  IEEE Transactions on Visualization and Computer Graphics (IEEE)
    •  Computers & Graphics (Elsevier)
    •  PRESENCE (MIT Press)
    Papers
    •  Zhou, F., Duh, H.B.L., and Billinghurst, M. (2008). Trends in Augmented Reality Tracking, Interaction and Display: A Review of Ten Years of ISMAR. In IEEE/ACM International Symposium on Mixed and Augmented Reality (ISMAR), 193-202.
    •  Azuma, R., Baillot, Y., Behringer, R., Feiner, S., Julier, S., MacIntyre, B. (2001). Recent Advances in Augmented Reality. IEEE Computer Graphics and Applications, 34-47.
  • Billinghurst and Duh 270 More Papers
    E. Kruijff, J. E. Swan, and S. Feiner. Perceptual issues in augmented reality revisited. 9th IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2010, pp. 3-12.
    D. Drascic and P. Milgram. Perceptual issues in augmented reality. In M. T. Bolas, S. S. Fisher, and J. O. Merritt, editors, SPIE Volume 2653: Stereoscopic Displays and Virtual Reality Systems III, pages 123-134, January/February 1996.
  • Billinghurst and Duh 271 Developer Guidelines
    Palm
    •  http://www.access-company.com/developers/documents/docs/ui/UI_Design.html
    Zen of Palm guidelines
    •  http://www.access-company.com/developers/documents/docs/zenofpalm.pdf
    Motorola
    •  http://developer.motorola.com/docstools/developerguides/
    iPhone Human Interface Guidelines
    •  http://developer.apple.com/documentation/iPhone/Conceptual/iPhoneHIG/
  • Billinghurst and Duh 272 Handheld HCI Design Websites
    Do’s and Don’ts of PocketPC design
    •  http://www.pocketpcmag.com/_archives/Nov04/Commandements.aspx
    Usability special interest group – handheld usability
    •  http://www.stcsig.org/usability/topics/handheld.html
    Usable Mobile website
    •  http://www.smartgroups.com/groups/usablemobile
    Mobile Coders website
    •  http://www.mobilecoders.com/Articles/mc-01.asp
    Univ of Waikato Handheld Group
    •  http://www.cs.waikato.ac.nz/hci/pdas.html
    Mobile Interaction website
    •  http://www.cs.waikato.ac.nz/~mattj/mwshop.html