SVR2011 Keynote

Mark Billinghurst's keynote talk given at the SVR 2011 conference on Augmented and Virtual Reality in Brazil - May 24th 2011

SVR2011 Keynote Presentation Transcript

    • Research Directions in Augmented Reality
      Mark Billinghurst, The HIT Lab NZ, University of Canterbury
    • Augmented Reality Definition
      Defining Characteristics [Azuma 97]:
      Combines real and virtual images - both can be seen at the same time
      Interactive in real time - the virtual content can be interacted with
      Registered in 3D - virtual objects appear fixed in space (see the sketch below)
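The "registered in 3D" property is what makes virtual objects appear fixed in space: every frame, the content's world-anchored position is transformed by the tracked camera pose and projected into the live image. A minimal Java sketch of that projection step (the identity pose and pinhole intrinsics are illustrative assumptions, not values from the talk):

```java
// Minimal sketch of 3D registration: project a world-anchored point into the
// camera image each frame using the tracked camera pose and a pinhole model.
// All names and numbers here are illustrative, not from the talk.
public class Registration {
    // Camera intrinsics (assumed): focal lengths and principal point in pixels.
    static final double FX = 800, FY = 800, CX = 320, CY = 240;

    /** Project a point given in camera coordinates to pixel coordinates. */
    static double[] project(double xc, double yc, double zc) {
        return new double[] { FX * xc / zc + CX, FY * yc / zc + CY };
    }

    /** Transform a world point into camera coordinates with pose R (3x3, row-major) and t. */
    static double[] worldToCamera(double[] R, double[] t, double[] pw) {
        double[] pc = new double[3];
        for (int i = 0; i < 3; i++)
            pc[i] = R[3 * i] * pw[0] + R[3 * i + 1] * pw[1] + R[3 * i + 2] * pw[2] + t[i];
        return pc;
    }

    public static void main(String[] args) {
        double[] R = { 1, 0, 0,  0, 1, 0,  0, 0, 1 };   // identity rotation (camera looking down +Z)
        double[] t = { 0, 0, 0 };
        double[] anchor = { 0.1, 0.0, 2.0 };            // virtual object fixed 2 m in front of the camera
        double[] pc = worldToCamera(R, t, anchor);
        double[] px = project(pc[0], pc[1], pc[2]);
        System.out.printf("draw virtual object at pixel (%.1f, %.1f)%n", px[0], px[1]);
    }
}
```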
    • Augmented Reality Examples Put AR pictures here
    • Virtual Reality, 1989…
    • Virtual Reality
      Immersive VR: head mounted display, gloves
      Separation from the real world
    • AR vs VR
      Virtual Reality: replaces reality
        Scene generation: requires realistic images
        Display device: fully immersive, wide FOV
        Tracking and sensing: low accuracy is okay
      Augmented Reality: enhances reality
        Scene generation: minimal rendering okay
        Display device: non-immersive, small FOV
        Tracking and sensing: high accuracy needed
    • Milgram’s Reality-Virtuality (RV) Continuum: Real Environment → Augmented Reality (AR) → Augmented Virtuality (AV) → Virtual Environment, with AR and AV together making up Mixed Reality
    • AR History
    • AR Beginnings
      1960s: Sutherland / Sproull’s first HMD system was see-through
    • 1960s - 80s: US Air Force Super Cockpit (T. Furness)
    • Early 1990s: Boeing coined the term “AR.” Wire harness assembly application begun (T. Caudell, D. Mizell).
      Early to mid 1990s: UNC ultrasound visualization project
      1994 - : UNC research - motion stabilized display, hybrid tracking, ultrasound visualization
    • A Brief History of AR
      1996: MIT wearable computing efforts
      1998: Dedicated conferences begin
      Late 90s: Collaboration, outdoor, interaction
      Late 90s: Augmented sports broadcasts
      1998 - 2001: Mixed Reality Systems Lab
    • History Summary
      1960s - 80s: Early experimentation
      1980s - 90s: Basic research - tracking, displays
      1995 - 2005: Tools/applications - interaction, usability, theory
      2005 - : Commercial applications - games, medical, industry
    • Medical AR Trials
      Sauer et al. 2000 at Siemens Corporate Research, NJ - stereo video see-through
      F. Sauer, Ali Khamene, S. Vogt: An Augmented Reality Navigation System with a Single-Camera Tracker: System Design and Needle Biopsy Phantom Trial, MICCAI 2002
    • AR Reaches Mainstream
      MIT Technology Review, March 2007: list of the 10 most exciting technologies
      The Economist, Dec 6th 2007: “Reality, only better”
    • Virtual Reality, Augmented Reality
    • Esquire Magazine, Dec 2009 issue: 12 pages of AR content
    • Trend One: Browser Based AR
      Adobe Flash + camera + 3D graphics
      High impact - high marketing value
      Large potential install base - 1.6 billion web users
      Ease of development - lots of developers, mature tools
      Low cost of entry - browser, web camera
    • 1983 – Star Wars
    • 1999: AR Face to Face Collaboration
    • 1998: SGI O2 vs. 2008: Nokia N95
      CPU: 300 MHz vs. 332 MHz
      HDD: 9 GB vs. 8 GB
      RAM: 512 MB vs. 128 MB
      Camera: VGA 30 fps vs. VGA 30 fps
      Graphics: 500K poly/sec vs. 2M poly/sec
    • Trend Two: Mobile Phone AR
      Mobile phones: camera, sensors, processor, display
      AR on mobile phones: simple graphics, optimized computer vision, collaborative interaction
    • Collaborative AR - AR Tennis
      Shared AR content, two user game
      Audio + haptic feedback
      Bluetooth networking
    • Location Aware Phones: Motorola Droid, Nokia Navigator
    • 2009 - Outdoor Information Overlay
      Mobile phone based: tag real world locations
      GPS + compass input; overlay graphics data on live video (see the sketch below)
      Applications: travel guide, advertising, etc.
      Wikitude, Layar, Junaio, etc. - Android based, public API released
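The overlay step these browsers perform can be sketched in a few lines: compute the bearing from the user's GPS fix to a tagged location, compare it with the compass heading, and place a label within the camera's field of view. The Java sketch below uses hypothetical coordinates and is not code from Wikitude, Layar or Junaio:

```java
// Sketch of the core of a GPS + compass AR browser: compute the bearing from the
// user to a tagged point of interest (POI) and map it to a horizontal screen
// position, given the phone's compass heading and camera field of view.
public class PoiOverlay {
    /** Great-circle bearing from (lat1, lon1) to (lat2, lon2), degrees clockwise from north. */
    static double bearing(double lat1, double lon1, double lat2, double lon2) {
        double p1 = Math.toRadians(lat1), p2 = Math.toRadians(lat2);
        double dl = Math.toRadians(lon2 - lon1);
        double y = Math.sin(dl) * Math.cos(p2);
        double x = Math.cos(p1) * Math.sin(p2) - Math.sin(p1) * Math.cos(p2) * Math.cos(dl);
        return (Math.toDegrees(Math.atan2(y, x)) + 360) % 360;
    }

    /** Map a bearing to a screen x coordinate; returns -1 if the POI is outside the camera FOV. */
    static double screenX(double bearingDeg, double headingDeg, double fovDeg, int screenWidth) {
        double delta = ((bearingDeg - headingDeg + 540) % 360) - 180;    // signed angle, -180..180
        if (Math.abs(delta) > fovDeg / 2) return -1;                     // not in view
        return (delta / fovDeg + 0.5) * screenWidth;
    }

    public static void main(String[] args) {
        double b = bearing(-43.5225, 172.5810, -43.5206, 172.5830);      // user -> POI (example coords)
        double x = screenX(b, /*compass heading*/ 30.0, /*camera FOV*/ 60.0, /*screen width*/ 480);
        System.out.printf("POI bearing %.1f deg, overlay label at x = %.0f px%n", b, x);
    }
}
```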
    • Layar (www.layar.com)
      Location based data: GPS + compass location, map + camera view
      AR layers on real world: customized data; audio, 3D, 2D content
      Easy authoring
      Android, iPhone
    • Android AR Platform - Architectural Application
      Loads 3D models in OBJ/MTL format (see the loader sketch below)
      Positions content in space via GPS and compass
      Intuitive user interface toolkit to modify the model
      Connects to a back end model database
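As a rough illustration of the model-loading step, the sketch below parses only the vertex and face lines of a Wavefront OBJ file; materials (MTL), normals and texture coordinates are ignored, and the file name is hypothetical, not from the talk:

```java
// Minimal sketch of reading vertex and face data from a Wavefront OBJ file,
// as an architectural model loader might. Illustrative only.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class ObjLoader {
    public static void main(String[] args) throws IOException {
        List<double[]> vertices = new ArrayList<>();
        List<int[]> faces = new ArrayList<>();
        for (String line : Files.readAllLines(Paths.get("building.obj"))) {   // hypothetical file
            String[] tok = line.trim().split("\\s+");
            if (tok.length < 4) continue;
            if (tok[0].equals("v")) {                        // vertex: "v x y z"
                vertices.add(new double[] { Double.parseDouble(tok[1]),
                        Double.parseDouble(tok[2]), Double.parseDouble(tok[3]) });
            } else if (tok[0].equals("f")) {                 // face: "f v1 v2 v3 ..." (1-based,
                int[] f = new int[tok.length - 1];           //  possibly "v/vt/vn" triples)
                for (int i = 1; i < tok.length; i++)
                    f[i - 1] = Integer.parseInt(tok[i].split("/")[0]) - 1;
                faces.add(f);
            }
        }
        System.out.println(vertices.size() + " vertices, " + faces.size() + " faces loaded");
    }
}
```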
    • Mobile Outdoor AR
    • Client/Server Web Interface
      Web application to add models
      Java and PHP server; Android application
      Database server: Postgres (see the query sketch below)
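A hedged sketch of how the Android client might query the Postgres model database over JDBC. The table, columns, host and credentials are assumptions for illustration; the talk only names the server-side technologies:

```java
// Sketch: fetch models near the user's location from a Postgres server over JDBC.
// Requires the Postgres JDBC driver on the classpath. Schema and connection
// details below are hypothetical.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ModelQuery {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:postgresql://example.org:5432/armodels";   // hypothetical server
        try (Connection conn = DriverManager.getConnection(url, "ar_user", "secret");
             PreparedStatement ps = conn.prepareStatement(
                 "SELECT name, obj_url, lat, lon FROM models " +
                 "WHERE lat BETWEEN ? AND ? AND lon BETWEEN ? AND ?")) {
            // Crude bounding box around the user's GPS fix (~0.01 deg is roughly 1 km).
            double lat = -43.5225, lon = 172.5810;
            ps.setDouble(1, lat - 0.01); ps.setDouble(2, lat + 0.01);
            ps.setDouble(3, lon - 0.01); ps.setDouble(4, lon + 0.01);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next())
                    System.out.println(rs.getString("name") + " -> " + rs.getString("obj_url"));
            }
        }
    }
}
```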
    • $784 million USD in 2014
    • Summary
      Augmented Reality has a long history going back to the 1960s
      Interest in AR has exploded over the last two years and it is being commercialized quickly
      AR is growing in a number of areas: mobile AR, web based AR, advertising experiences
    • Looking to the Future
    • What’s Next? Sony CSL © 2004
    • “The product is no longer the basis of value. The experience is.” - Venkat Ramaswamy, The Future of Competition
    • PS3 - Eye of Judgement (October 24th 2007)
      Computer vision tracking
      Card based battle game
      Collaborative AR
    • Building Compelling AR Experiences - layered stack (bottom to top): components (Tracking, Display), tools (Authoring), applications (Interaction), experiences (Usability). Sony CSL © 2004
    • AR Components
    • Building Compelling AR Experiences - the layer stack, starting from the components layer (Tracking, Display). Sony CSL © 2004
    • Low Level AR Libraries (see the loop skeleton below)
      ARToolKit enhancements: occlusion handling
      SSTT: Simple Spatial Template Tracking
      Opira: robust natural feature tracking
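Whichever library provides the tracking, the per-frame control flow is broadly the same: grab a video frame, estimate the camera pose relative to the target, and render registered content when tracking succeeds. The skeleton below uses hypothetical interfaces rather than the actual ARToolKit, SSTT or Opira APIs:

```java
// Skeleton of a marker / natural-feature tracking loop. Camera, Tracker and Renderer
// are hypothetical interfaces standing in for whichever library is actually used;
// only the control flow is the point here.
public class TrackingLoop {
    interface Camera   { byte[] grabFrame(); }
    interface Tracker  { double[] estimatePose(byte[] frame); }   // 4x4 model-view matrix, or null
    interface Renderer { void drawVideo(byte[] frame); void drawContent(double[] pose); }

    static void run(Camera camera, Tracker tracker, Renderer renderer) {
        while (true) {
            byte[] frame = camera.grabFrame();                 // 1. capture the live video frame
            renderer.drawVideo(frame);                         // 2. draw it as the background
            double[] pose = tracker.estimatePose(frame);       // 3. detect the target, estimate pose
            if (pose != null) {
                renderer.drawContent(pose);                    // 4. render virtual content registered to it
            }
        }
    }
}
```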
    • Markerless Tracking
    • AR Tools
    • Building Compelling AR Experiences - the stack up to the tools layer (Authoring), above components (Tracking, Display). Sony CSL © 2004
    • AR Authoring
      Software libraries: OSGART, Studierstube, MXRToolKit
      Plugins to existing software: DART (Macromedia Director)
      Stand alone: AMIRE, etc.
      Next generation: iaTAR (Tangible AR)
    • mARx Plug-in
      3D Studio Max plug-in
      Can model and view AR content at the same time
    • BuildAR (http://www.buildar.co.nz/)
      Stand alone application
      Visual interface for AR model viewing application
      Enables non-programmers to build AR scenes
    • AR Applications
    • Building Compelling AR Experiences - the stack up to the applications layer (Interaction), above tools (Authoring) and components (Tracking, Display). Sony CSL © 2004
    • AR Design Principles
      Interface components: physical components; display elements (visual/audio); interaction metaphors
      (Diagram: Physical Elements (input) → Interaction Metaphor → Display Elements (output))
    • Tangible User Interfaces (Ishii 97)
      Create digital shadows for physical objects
      Foreground: graspable UI
      Background: ambient interfaces
    • Tangible AR Metaphor
      AR overcomes limitations of TUIs: enhances display possibilities, merges task/display space, provides public and private views
      TUI + AR = Tangible AR: apply TUI methods to AR interface design
    • Tangible AR Design Principles
      Tangible AR interfaces use TUI principles:
      Physical controllers for moving virtual content
      Support for spatial 3D interaction techniques
      Support for multi-handed interaction
      Match object affordances to task requirements
      Support parallel activity with multiple objects
      Allow collaboration between multiple users
    • Case Study: 3D AR Lens
      Goal: develop a lens based AR interface
      MagicLenses: developed at Xerox PARC in 1993; view a region of the workspace differently to the rest; overlap MagicLenses to create composite effects
    • 3D MagicLenses
      MagicLenses extended to 3D (Viega et al. 96): volumetric and flat lenses
    • AR Lens Design Principles
      Physical components: lens handle - virtual lens attached to a real object
      Display elements: lens view - reveals layers in the dataset
      Interaction metaphor: physically holding the lens
    • 3D AR Lenses: Model Viewer
      Displays models made up of multiple parts
      Each part can be shown or hidden through the lens (see the sketch below)
      Allows the user to peer inside the model
      Maintains focus + context
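One way to realize the show/hide behaviour is a simple screen-space test: a part is drawn in its "lens" representation only when its projected position falls inside the circular lens region tracked from the physical handle. A sketch under that assumption, not the original implementation:

```java
// Sketch of a lens test for the model viewer: parts whose projected centres fall
// inside the tracked lens circle are rendered in the lens (cut-away) style.
// All names and values are illustrative.
public class LensTest {
    /** True if a part projected at (px, py) lies inside the lens of radius r centred at (lx, ly). */
    static boolean insideLens(double px, double py, double lx, double ly, double r) {
        double dx = px - lx, dy = py - ly;
        return dx * dx + dy * dy <= r * r;
    }

    public static void main(String[] args) {
        double[][] partCenters = { { 300, 250 }, { 420, 180 } };   // projected part centres (pixels)
        double lensX = 320, lensY = 240, lensRadius = 80;          // lens pose tracked from the real handle
        for (double[] p : partCenters) {
            boolean inside = insideLens(p[0], p[1], lensX, lensY, lensRadius);
            System.out.println(inside ? "draw hidden internals (lens view)"
                                      : "draw normal exterior");
        }
    }
}
```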
    • AR Lens Demo
    • HMD vs Handheld AR Interface
      Wearable AR: the display is output only, with separate input
      Handheld AR: the display is both input and output
    • Handheld Interface Metaphors
      Tangible AR lens viewing: look through the screen into the AR scene; interact with the screen to interact with AR content (e.g. Invisible Train)
      Tangible AR lens manipulation: select an AR object and attach it to the device; use the motion of the device as input (e.g. AR Lego)
    • Next Interaction Techniques
      Natural gestures: depth sensing, natural body input
      Multimodal: speech + gesture
    • AR Experiences
    • Building Compelling AR Experiences - the full stack: components (Tracking, Display), tools (Authoring), applications (Interaction), experiences (Usability). Sony CSL © 2004
    • Survey of AR Papers
      Edward Swan (2005) surveyed major conferences/journals (1992-2004): Presence, ISMAR, ISWC, IEEE VR
      Summary: 1104 total papers; 266 AR papers; 38 AR HCI papers (interaction); 21 AR user studies
      Only 21 of the 266 AR papers include a formal user study (<8% of all AR papers)
    • Types of Experiments
      Perception: how is virtual content perceived? What perceptual cues are most important?
      Interaction: how can users interact with virtual content? Which interaction techniques are most efficient?
      Collaboration: how is collaboration in an AR interface different? Which collaborative cues can be conveyed best?
    • AR Browser Interface
      Layar (www.layar.com) shows POI on the real world
      Typical interface elements: live camera view, radar view, virtual graphics of POI, 2D map view, information area
    • Navigation
      How useful is the AR view for navigation? Ego- vs. exo-centric
      Experiment: AR only, map only, AR + map
    • Experiment Design
      Conditions:
        AR: using only an AR view
        2D-map: using only a top down 2D map view
        AR+2D-map: using both an AR and a 2D map view
      Measures: time to complete, distance travelled, user preference, subjective measures
    • Paths Walked
      Three different paths walked around campus, between buildings/under trees
    • Performance Measures
      (Charts: average time taken (sec) and average distance travelled (m) for the AR, Map and AR+Map conditions)
      No difference between conditions
    • Path Trails
    • User Feedback
      AR + Map: easy to identify points of interest
      AR only: hard to know where things were
      Liked being able to switch between modes
      AR+Map preferred best
    • Typical User Comments
      “With the AR mode, I didn’t know where any of the buildings were; a couple of times I went round in a circle because I didn’t know where things were.”
      “I found the map interface the best one to use because you are actually able to see the physical objects around you.”
      “I used the map at the beginning to understand where the buildings were and the AR between each point.”
    • Navigation Conclusion
      AR alone provides no improvement: lack of depth cues; difficult to create spatial awareness
      AR + Map was the preferred interface: map for creating a mental model, AR for near navigation
    • Conclusions
    • “We’re living in the experience economy and the customer is the star of the show. If I’m going to spend thousands of dollars on something, I want the whole experience to be a fairy-tale.” - Milton Pedraza, The Luxury Institute (illustrative)
    • Building Compelling AR Experiences - the full stack once more: components (Tracking, Display), tools (Authoring), applications (Interaction), experiences (Usability). Sony CSL © 2004
    • Conclusions
      AR is on the verge of commercialization
      There are interesting research opportunities in: developing AR component technology; building easy to use tools; identifying application domains; developing compelling AR experiences
    • More Information
      • Mark Billinghurst - mark.billinghurst@hitlabnz.org
      • Websites
        - http://www.hitlabnz.org/
        - http://artoolkit.sourceforge.net/
        - http://www.osgart.org/
        - http://www.hitlabnz.org/wiki/buildAR/