
COSC 426 Lect 2. - AR Technology



Lecture 2 in the COSC 426 class on Augmented Reality. Taught by Mark Billinghurst



  1. Lecture 2: AR Technology. Mark Billinghurst, July 2011. COSC 426: Augmented Reality
  2. Key Points from Lecture 1
  3. Augmented Reality: Defining Characteristics [Azuma 97]. Combines real and virtual images (both can be seen at the same time). Interactive in real time (virtual content can be interacted with). Registered in 3D (virtual objects appear fixed in space)
  4. What is not Augmented Reality? Location-based services; barcode detection (QR codes); augmenting still images; special effects in movies... but they can all be combined with AR!
  5. Milgram's Reality-Virtuality Continuum: Mixed Reality spans from the Real Environment through Augmented Reality (AR) and Augmented Virtuality (AV) to the Virtual Environment, forming the Reality-Virtuality (RV) Continuum
  6. Metaverse
  7. AR History Summary. 1960s-80s: early experimentation. 1980s-90s: basic research (tracking, displays). 1995-2005: tools and applications (interaction, usability, theory). 2005-: commercial applications (games, medical, industry)
  8. Applications: medicine, manufacturing, information overlay, architecture, museums, marketing, gaming
  9. Interaction Design Process
  10. Interaction Design is All About You: users should be involved throughout the design process; consider all the needs of the user
  11. Building Compelling AR Experiences: experiences (Usability), applications (Interaction), tools (Authoring), components (Tracking, Display)
  12. AR Technology
  13. Building Compelling AR Experiences: experiences, applications, tools, components (Display, Tracking). Sony CSL © 2004
  14. AR Technology. Key technologies: Display, Tracking, Input, Processing
  15. AR Displays
  16. AR Visual Displays: a taxonomy splitting primarily indoor environments from primarily outdoor (daylight) environments, and head-mounted displays (HMD) from non-head-mounted ones (e.g. vehicle mounted). Technologies include virtual images seen off windows using a beamsplitter (e.g. window reflections), liquid crystal displays (LCDs) (e.g. Reach-In), projection displays (e.g. Shared Space, Magic Book), cathode ray tube (CRT) or Virtual Retinal Display (VRD) units (e.g. WLVA and IVRD; many military airborne applications and assistive technologies), and Head-Up Displays (HUD) such as navigational aids in cars and many military applications
  17. Head Mounted Displays
  18. Head Mounted Displays (HMD): display and optics mounted on the head; may or may not fully occlude the real world; provide full-color images. Considerations: cumbersome to wear, brightness, low power consumption, limited resolution, high cost
  19. Types of Head Mounted Displays: occluded, see-thru, multiplexed
  20. Immersive VR Architecture: head position/orientation feeds a head tracker; the host processor (database model, rendering engine, frame buffer; connected to the network) drives the display driver and a non-see-thru image source and optics showing the virtual object
  21. See-thru AR Architecture: head position/orientation feeds a head tracker; the host processor (database model, rendering engine, frame buffer; connected to the network) drives the display driver and an image source whose virtual image, through a see-thru combiner, is superimposed over the real-world object
  22. Optical see-through head-mounted display: virtual images from monitors are merged with the real world via optical combiners
  23. Optical See-Through HMD
  24. Optical see-through HMDs: Virtual Vision VCAP, Sony Glasstron
  25. DigiLens: compact HOE, solid-state optics, Switchable Bragg Grating (SBG), stacked SBGs, ultra-fast switching
  26. The Virtual Retinal Display: image scanned onto the retina; commercialized through Microvision (Nomad System)
  27. Strengths of optical AR: simpler (cheaper); direct view of the real world, with full resolution, no time delay (for the real world), safety, and lower distortion; no eye displacement (although COASTAR video see-through avoids this)
  28. Video AR Architecture: a head-mounted camera aligned to the display optics captures a video image of the real world; the head tracker supplies position/orientation to the host processor (graphics renderer, frame buffer; connected to the network), whose virtual image a digital mixer insets into the real-world video before the display driver sends it to a non-see-thru image source and optics
  29. Video see-through HMD: video cameras, monitors, and a video/graphics combiner
  30. Video See-Through HMD
  31. Video see-through HMD: MR Laboratory's COASTAR (Co-Optical Axis See-Through Augmented Reality), a parallax-free video see-through HMD
  32. Stereo video input with PAL-resolution cameras; 2 x SVGA displays, 30 degree FOV; user-adjustable convergence; $6,000 USD
  33. Vuzix Display (www.vuzix.com): Wrap 920, $350 USD; twin 640 x 480 LCD displays; 31 degree diagonal field of view; weighs less than three ounces
  34. Strengths of Video AR: true occlusion (cf. Kiyokawa's optical display that supports occlusion); a digitized image of the real world gives flexibility in composition, matchable time delays, and more registration and calibration strategies; wide FOV is easier to support
  35. Optical vs. Video AR Summary: both have proponents. Video is more popular today, likely because of the lack of available optical products. It also depends on the application: in manufacturing, optical is cheaper; in medical settings, video supports calibration strategies
  36. Eye multiplexed AR Architecture: head position/orientation feeds a head tracker; the host processor (database model, rendering engine, frame buffer; connected to the network) drives the display driver and an opaque image source whose virtual image is inset into the real-world scene
  37. Virtual Image 'inset' into the real world
  38. Virtual Vision Personal Eyewear
  39. Virtual image inset into real world
  40. Spatial/Projected AR
  41. Spatial Augmented Reality: project onto irregular surfaces; geometric registration, projector blending, high dynamic range. Book: Bimber, Raskar, "Spatial Augmented Reality"
  42. Projector-based AR: a (possibly head-tracked) user, a projector, and real objects with retroreflective covering. Examples: Raskar, MIT Media Lab; Inami, Tachi Lab, U. Tokyo
  43. Example of projector-based AR: Ramesh Raskar, UNC, MERL
  44. Example of projector-based AR: Ramesh Raskar, UNC Chapel Hill
  45. The I/O Bulb: projector + camera. John Underkoffler, Hiroshi Ishii, MIT Media Lab
  46. Head Mounted Projector: Jannick Rolland (UCF); retro-reflective material; potentially portable
  47. Head Mounted Projector: NVIS P-50 HMPD; 1280x1024 per eye, stereoscopic, 50 degree FOV
  48. HMD vs. HMPD: Head Mounted Display vs. Head Mounted Projected Display
  49. Pico Projectors: Microvision (www.mvis.com); 3M, Samsung, Philips, etc.
  50. MIT Sixth Sense: body-worn camera and projector
  51. Other AR Displays
  52. Video Monitor AR: stereo cameras feed a video/graphics combiner whose output is shown on a monitor, viewed with glasses
  53. Virtual Showcase: mirrors on a projection table; head-tracked stereo; up to 4 users; merges graphic and real objects; exhibit/museum applications. Fraunhofer Institute (2001), Bimber, Frohlich
  54. Augmented Paleontology: Bimber et al., IEEE Computer, Sept. 2002
  55. Alternate Displays: LCD panel, laptop, PDA
  56. Handheld Displays: mobile phones (camera, display, input)
  57. Other Types of AR Display: audio (spatial sound, ambient audio), tactile (physical sensation), haptic (virtual touch)
  58. Haptic Input: AR Haptic Workbench, CSIRO 2003, Adcock et al.
  59. Phantom: Sensable Technologies force feedback device
  60. AR Haptic Interface: Phantom, ARToolKit, Magellan
  61. AR Tracking and Registration
  62. Tracking: locating the user's viewpoint; position (x,y,z) and orientation (r,p,y). Registration: positioning virtual objects with respect to the real world
  63. Tracking Requirements: augmented reality information display may be head stabilized, body stabilized, or world stabilized, with tracking requirements increasing from head stabilized to world stabilized
  64. Tracking Technologies • Mechanical • Electromagnetic • Optical • Acoustic • Inertial and dead reckoning • GPS • Hybrid
  65. AR Tracking Taxonomy: splits indoor environments (limited range) from outdoor environments (extended range), and low-accuracy systems from high-accuracy, robust ones; further distinctions include many fiducials tracked at 15-60 Hz vs. high speed in space/time, and hybridized (GPS and/or camera and compass) vs. not hybridized. Examples: ARToolKit, IVRD, HiBall, WLVA, BARS
  66. Tracking Types: magnetic, inertial, ultrasonic, optical, and mechanical trackers. Optical tracking splits into specialized, marker-based, and markerless tracking; markerless tracking splits into edge-based, template-based, and interest point tracking
  67. Tracking Systems: mechanical tracker, magnetic tracker, ultrasonic tracker, inertial tracker, and vision (optical tracking), either specialized (infrared, retro-reflective) or monocular (DV cam, webcam)
  68. Mechanical Tracker. Idea: mechanical arms with joint sensors (e.g. Microscribe). ++: high accuracy, haptic feedback. --: cumbersome, expensive
  69. Magnetic Tracker. Idea: measure the difference between a magnetic transmitter and a receiver (e.g. Flock of Birds, Ascension). ++: 6DOF, robust. --: wired, sensitive to metal, noisy, expensive
  70. Inertial Tracker. Idea: measure linear acceleration and angular rates (accelerometer/gyroscope), e.g. IS300 (Intersense), Wii Remote. ++: no transmitter, cheap, small, high frequency, wireless. --: drift, hysteresis, only 3DOF
  71. Ultrasonic Tracker. Idea: time of flight or phase coherence of sound waves, e.g. Logitech, IS600. ++: small, cheap. --: 3DOF, line of sight, low resolution, affected by environmental conditions (pressure, temperature)
  72. Global Positioning System (GPS): created by the US in 1978; currently 29 satellites. Satellites send position + time. GPS receiver positioning: 4 satellites need to be visible; differential time of arrival; triangulation. Accuracy: 5-30 m or worse; blocked by weather, buildings, etc.
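The "differential time of arrival + triangulation" step can be illustrated in miniature. The sketch below (hypothetical helper `trilaterate_2d`; 2D with exact ranges, whereas a real GPS receiver solves a 3D problem that also estimates its clock bias) linearizes the circle equations and solves them by least squares:

```python
import math

def trilaterate_2d(anchors, distances):
    """Estimate (x, y) from >= 3 known anchor positions and measured ranges.
    Subtract the first circle equation from the others to get a linear
    system, then solve the 2x2 normal equations (least squares)."""
    (x0, y0), d0 = anchors[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        A.append((2 * (xi - x0), 2 * (yi - y0)))
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    # Normal equations: (A^T A) p = A^T b for p = (x, y)
    s11 = sum(a[0] * a[0] for a in A)
    s12 = sum(a[0] * a[1] for a in A)
    s22 = sum(a[1] * a[1] for a in A)
    t1 = sum(a[0] * bi for a, bi in zip(A, b))
    t2 = sum(a[1] * bi for a, bi in zip(A, b))
    det = s11 * s22 - s12 * s12
    return ((s22 * t1 - s12 * t2) / det, (s11 * t2 - s12 * t1) / det)

# Receiver truly at (3, 4); three anchors with exact range measurements.
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = (3.0, 4.0)
dists = [math.hypot(a[0] - true_pos[0], a[1] - true_pos[1]) for a in anchors]
x, y = trilaterate_2d(anchors, dists)
```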
  73. Problems with GPS. Takes time to get a satellite fix: the satellites are moving around. Earth's atmosphere affects the signal: the receiver assumes a consistent speed (the speed of light), but the delay depends on where you are on Earth, and weather has an effect. Signal reflection: multi-path reflection off buildings. Signal blocking: trees, buildings, mountains. Satellites send out bad data: misreporting their own position
  74. Accurate to < 5 cm close to a base station (22 m per 100 km). Expensive: $20-40,000 USD
  75. Optical Tracking
  76. Optical Tracker. Idea: image processing and computer vision. Specialized: infrared, retro-reflective, stereoscopic (e.g. ART, Hi-Ball). Monocular-based vision tracking
  77. Outside-In vs. Inside-Out Tracking
  78. Optical Tracking Technologies. Scalable active trackers: InterSense IS-900, 3rd Tech HiBall (3rd Tech, Inc.). Passive optical computer vision: line of sight, may require landmarks, can be brittle; computer vision is computationally intensive
  79. HiBall Tracking System (3rd Tech): inside-out tracker, over $50K USD; scalable over a large area; fast update (2000 Hz) with latency less than 1 ms; accurate: position 0.4 mm RMS, orientation 0.02° RMS
  80. Starting simple: marker tracking. Has been done for more than 10 years; several open source solutions exist; fairly simple to implement with standard computer vision methods. A rectangular marker provides 4 corner points: enough for pose estimation!
  81. Marker Based Tracking: ARToolKit
  82. Coordinate Systems
  83. Marker Tracking – Overview
  84. Marker Tracking – Fiducial Detection. Threshold the whole image to black and white. Search scanline by scanline for edges (white to black). Follow the edge until either back at the starting pixel or at the image border. Check for size: reject fiducials early that are too small (or too large)
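The threshold-and-scan step can be sketched in a few lines (hypothetical helper name; plain Python lists stand in for a grayscale image, and a real implementation would continue with the contour-following step):

```python
def find_edge_candidates(gray, thresh=128):
    """Threshold to black/white, then scan each row for white->black
    transitions: the entry points for contour following."""
    binary = [[1 if px < thresh else 0 for px in row] for row in gray]  # 1 = black
    candidates = []
    for y, row in enumerate(binary):
        for x in range(1, len(row)):
            if row[x - 1] == 0 and row[x] == 1:   # white -> black edge
                candidates.append((x, y))
    return binary, candidates

# Tiny 5x5 test image: a dark 2x2 block on a bright background.
gray = [[200] * 5 for _ in range(5)]
for y in (2, 3):
    for x in (2, 3):
        gray[y][x] = 40
binary, edges = find_edge_candidates(gray)
```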
  85. Marker Tracking – Rectangle Fitting. Start with an arbitrary point "x" on the contour. The point with maximum distance from it must be a corner c0. Create a diagonal through the center. Find points c1 and c2 with maximum distance left and right of the diagonal. Form a new diagonal from c1 to c2. Find point c3 right of that diagonal with maximum distance
  86. Marker Tracking – Pattern Checking. Calculate a homography using the 4 corner points ("Direct Linear Transform" algorithm): maps normalized coordinates to marker coordinates (simple perspective projection, no camera model). Extract the pattern by sampling. Check the pattern: id (implicit encoding) or template (normalized cross correlation)
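The homography step can be sketched with the standard DLT system for exactly 4 correspondences (hypothetical helper names; no camera model, as the slide notes):

```python
def homography_dlt(src, dst):
    """Direct Linear Transform for 4 point pairs, fixing h33 = 1.
    Builds the standard 8x8 system and solves it by Gaussian elimination."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    n = 8
    M = [row + [bi] for row, bi in zip(A, b)]
    for col in range(n):                      # elimination with partial pivoting
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    h = [0.0] * n
    for r in range(n - 1, -1, -1):            # back substitution
        h[r] = (M[r][n] - sum(M[r][c] * h[c] for c in range(r + 1, n))) / M[r][r]
    return [h[0:3], h[3:6], h[6:8] + [1.0]]

def apply_h(H, x, y):
    """Apply a 3x3 homography to a 2D point (homogeneous divide)."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# Unit square (normalized marker coords) -> detected corner pixels
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(10, 10), (52, 12), (50, 48), (8, 50)]
H = homography_dlt(src, dst)
```

With 4 correspondences the 8 equations determine the 8 unknowns exactly, so the homography maps each source corner onto its detected corner.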
  87. Marker Tracking – Corner Refinement. Refine corner coordinates: critical for high-quality tracking. Remember: 4 points is the bare minimum, so these 4 points had better be accurate. Detect sub-pixel coordinates, e.g. with a Harris corner detector (specialized methods can be faster and more accurate); this strongly reduces jitter. Undistort corner coordinates: remove the radial distortion of the lens
  88. Marker Tracking – Pose Estimation. Calculates marker position and rotation relative to the camera. The initial estimate comes directly from the homography: very fast, but coarse, and it jitters a lot. Refinement via Gauss-Newton iteration: 6 parameters (3 for position, 3 for rotation) to refine; at each iteration we optimize the reprojection error
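The Gauss-Newton refinement can be shown on a simplified 2D analogue: 3 parameters (rotation angle plus translation) instead of the 6 used for a full 3D pose. Helper names are hypothetical:

```python
import math

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def gauss_newton_pose2d(model, observed, iters=10):
    """Fit a 2D rigid transform (theta, tx, ty) by Gauss-Newton, minimizing
    the summed squared reprojection residuals (the marker tracker does the
    same with 6 parameters for a full 3D pose)."""
    th, tx, ty = 0.0, 0.0, 0.0
    for _ in range(iters):
        c, s = math.cos(th), math.sin(th)
        JTJ = [[0.0] * 3 for _ in range(3)]
        JTr = [0.0] * 3
        for (x, y), (u, v) in zip(model, observed):
            rx = c * x - s * y + tx - u            # x residual
            ry = s * x + c * y + ty - v            # y residual
            rows = [((-s * x - c * y, 1.0, 0.0), rx),   # d(px)/d(th,tx,ty)
                    ((c * x - s * y, 0.0, 1.0), ry)]    # d(py)/d(th,tx,ty)
            for J, r in rows:                      # accumulate normal equations
                for i in range(3):
                    JTr[i] += J[i] * r
                    for j in range(3):
                        JTJ[i][j] += J[i] * J[j]
        step = solve3(JTJ, [-g for g in JTr])      # Gauss-Newton step
        th, tx, ty = th + step[0], tx + step[1], ty + step[2]
    return th, tx, ty

# Synthetic data: square corners rotated by 0.3 rad and shifted by (2, 1)
model = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
c, s = math.cos(0.3), math.sin(0.3)
observed = [(c * x - s * y + 2.0, s * x + c * y + 1.0) for x, y in model]
th, tx, ty = gauss_newton_pose2d(model, observed)
```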
  89. Coordinates for Marker Tracking
  90. Coordinates for Marker Tracking: camera, ideal screen, observed screen, and marker coordinates. The observed screen follows a perspective model with nonlinear (barrel-shaped) distortion obtained from camera calibration. Goal: correspondence of the 4 vertices. Rotation and translation are obtained from real-time image processing
  91. From Marker To Camera: rotation and translation. TCM is the 4x4 transformation matrix from marker coordinates to camera coordinates
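A minimal sketch of building and applying such a 4x4 marker-to-camera matrix (rotation about one axis only, for illustration; helper names are hypothetical):

```python
import math

def make_tcm(rot_z_deg, t):
    """Build a 4x4 marker-to-camera transform from a rotation about z
    (illustrative; a real pose has a full 3D rotation) and translation t."""
    a = math.radians(rot_z_deg)
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0, t[0]],
            [s,  c, 0, t[1]],
            [0,  0, 1, t[2]],
            [0,  0, 0, 1]]

def transform(T, p):
    """Apply a 4x4 homogeneous transform to a 3D point."""
    ph = p + [1.0]
    return [sum(T[r][c] * ph[c] for c in range(4)) for r in range(3)]

# Marker corner at (0.04, 0.04, 0) m, marker rotated 90 degrees and
# sitting 0.5 m in front of the camera.
T_cm = make_tcm(90, [0.0, 0.0, 0.5])
p_cam = transform(T_cm, [0.04, 0.04, 0.0])
```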
  92. Tracking challenges in ARToolKit: occlusion (image by M. Fiala); unfocused camera and motion blur; dark or unevenly lit scene, vignetting; jittering (Photoshop illustration); image noise (e.g. poor lens, block coding/compression, neon tube); false positives and inter-marker confusion (image by M. Fiala)
  93. Tracking, Tracking, Tracking
  94. Other Marker Tracking Libraries: ARTag [discontinued]; ARToolKitPlus (http://studierstube.icg.tu-graz.ac.at/handheld_ar/artoolkitplus.php); stbTracker (http://studierstube.icg.tu-
  95. Markerless Tracking
  96. Markerless Tracking: no more markers! Within the tracker taxonomy (magnetic, inertial, ultrasonic, optical), optical tracking splits into specialized, marker-based, and markerless tracking; markerless tracking splits into edge-based, template-based, and interest point tracking
  97. Natural feature tracking: tracking from features of the surrounding environment (corners, edges, blobs, ...). Generally more difficult than marker tracking: markers are designed for their purpose, the natural environment is not. Less well-established methods, and usually much slower than marker tracking
  98. Natural Feature Tracking: use natural cues of real elements; points, contours (edges), surface texture (interest points), and surfaces; model-based or model-free. ++: no visual pollution
  99. Texture Tracking
  100. Tracking by detection: this is what most trackers do. Targets are detected every frame. Popular because tracking and detection are solved simultaneously. Pipeline: camera image, keypoint detection, descriptor creation and matching, outlier removal, pose estimation and refinement, pose
  101. Natural feature tracking – What is a keypoint? It depends on the detector you use! For high performance, use the FAST corner detector: apply FAST to all pixels of your image, obtain a set of keypoints for your image, reduce the number of corners using non-maximum suppression, then describe the keypoints. [E. Rosten and T. Drummond (May 2006), "Machine learning for high-speed corner detection"]
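The segment-test idea behind FAST can be sketched in miniature. This toy version uses a 12-pixel ring and requires 7 contiguous pixels, whereas the real detector uses a 16-pixel Bresenham circle and typically 9; all names are hypothetical:

```python
# Offsets of a small ring of radius ~2 around the candidate pixel.
RING = [(0, -2), (1, -2), (2, -1), (2, 0), (2, 1), (1, 2),
        (0, 2), (-1, 2), (-2, 1), (-2, 0), (-2, -1), (-1, -2)]

def is_corner(img, x, y, t=20, n_contig=7):
    """Segment test: corner if enough contiguous ring pixels are all
    brighter (or all darker) than the center by more than t."""
    center = img[y][x]
    ring = [img[y + dy][x + dx] for dx, dy in RING]
    for sign in (1, -1):                     # brighter arc, then darker arc
        flags = [(p - center) * sign > t for p in ring]
        run = best = 0
        for f in flags + flags[:n_contig]:   # wrap around the ring
            run = run + 1 if f else 0
            best = max(best, run)
        if best >= n_contig:
            return True
    return False

# 9x9 test image: bright quadrant whose corner sits at (3, 3).
img = [[200 if x >= 3 and y >= 3 else 100 for x in range(9)] for y in range(9)]
corner = is_corner(img, 3, 3)    # corner of the quadrant
edge = is_corner(img, 3, 6)      # point on a straight edge
```

On the straight edge only about half the ring is darker, so the contiguity requirement rejects it; at the corner roughly two thirds of the ring is darker and the test fires.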
  102. Corner keypoint
  103. Natural feature tracking – Descriptors. Again, it depends on your choice of descriptor! Can use SIFT: estimate the dominant keypoint orientation using gradients, compensate for the detected orientation, and describe the keypoint in terms of the gradients surrounding it. [Wagner D., Reitmayr G., Mulloni A., Drummond T., Schmalstieg D., Real-Time Detection and Tracking for Augmented Reality on Mobile Phones. IEEE Transactions on Visualization and Computer Graphics, May/June 2010]
  104. NFT – Database creation. An offline step: search for corners in a static image. For robustness, look at corners on multiple scales; some corners are more descriptive at larger or smaller scales, and we don't know how far users will be from our image. Build a database file with all descriptors and their positions on the original image
  105. NFT – Real-time tracking. Search for keypoints in the video image; create the descriptors; match the descriptors from the live video against those in the database. Brute force is not an option: we need the speed-up of special data structures, e.g. multiple spill trees. Pipeline: camera image, keypoint detection, descriptor creation and matching, outlier removal, pose estimation and refinement, pose
  106. NFT – Outlier removal. A cascade of removal techniques: start with the cheapest, finish with the most expensive. First, simple geometric tests, e.g. line tests: select 2 points to form a line, then check that all other points lie on the correct side of the line. Then, homography-based tests
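The line test described above can be sketched directly (hypothetical helper names; each match is a pair of corresponding points in image 1 and image 2):

```python
def side(a, b, p):
    """Sign of the cross product: which side of line a->b point p lies on."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def line_test(matches):
    """Cheap geometric outlier test: draw a line through two matches;
    an inlier must fall on the same side of that line in both images."""
    (a1, a2), (b1, b2) = matches[0], matches[1]
    keep = [matches[0], matches[1]]
    for p1, p2 in matches[2:]:
        if side(a1, b1, p1) * side(a2, b2, p2) > 0:  # same side in both views
            keep.append((p1, p2))
    return keep

# Three matches consistent with a pure translation, plus one outlier
# whose second-image point falls on the wrong side of the line.
matches = [((0.0, 0.0), (10.0, 5.0)), ((10.0, 0.0), (20.0, 5.0)),
           ((3.0, 2.0), (13.0, 7.0)), ((5.0, 5.0), (15.0, 0.0))]
inliers = line_test(matches)
```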
  107. NFT – Pose refinement. The pose from the homography makes a good starting point. Based on Gauss-Newton iteration: try to minimize the re-projection error of the keypoints. This is the part of the tracking pipeline that benefits most from floating-point usage, but it can still be implemented effectively in fixed point. Typically 2-4 iterations are enough
  108. NFT – Real-time tracking. Search for keypoints in the video image; create the descriptors; match the descriptors from the live video against those in the database; remove the keypoints that are outliers; use the remaining keypoints to calculate the pose of the camera
  109. NFT – Results. [Wagner D., Reitmayr G., Mulloni A., Drummond T., Schmalstieg D., Real-Time Detection and Tracking for Augmented Reality on Mobile Phones. IEEE Transactions on Visualization and Computer Graphics, May/June 2010]
  110. Edge-Based Tracking: RAPiD [Drummond et al. 02]; initialization, control points, pose prediction (global method)
  111. Line Based Tracking: Visual Servoing [Comport et al. 2004]
  112. Model Based Tracking: OpenTL, a general-purpose library for model-based visual tracking
  113. OpenTL Features
  114. Visual Modalities Used For Tracking
  115. The Tracking Pipeline
  116. Marker vs. natural feature tracking. Marker tracking: usually requires no database to be stored; markers can be an eye-catcher; tracking is less demanding; but the environment must be instrumented with markers, and markers usually work only when fully in view. Natural feature tracking: a database of keypoints must be stored or downloaded, and natural feature targets might catch the attention less; but natural feature targets are potentially everywhere and work even when only partially in view
  117. Hybrid Tracking
  118. Example: Outdoor Hybrid Tracking. Combines computer vision (natural feature tracking) with inertial gyroscope sensors. Both correct for each other: the inertial gyro provides frame-to-frame prediction of camera orientation, and computer vision corrects for gyro drift
  119. Outdoor AR Tracking System: You, Neumann, Azuma outdoor AR system (1999)
  120. Robust Outdoor Tracking: hybrid tracking with computer vision, GPS, and inertial sensors. "Going Out", Reitmayr & Drummond (Univ. Cambridge)
  121. Handheld Display
  122. Registration
  123. The Registration Problem: virtual and real must stay properly aligned. If not, it breaks the illusion that the two coexist and prevents acceptance of many serious applications
  124. Sources of registration errors. Static errors: optical distortions, mechanical misalignments, tracker errors, incorrect viewing parameters. Dynamic errors: system delays (the largest source of error); 1 ms of delay = 1/3 mm of registration error
  125. Reducing static errors: distortion compensation; manual adjustments; view-based or direct measurements [Azuma94] [Caudell92] [Janin93] etc.; camera calibration (video) [ARGOS94] [Bajura93] [Tuceryan95] etc.
  126. View Based Calibration (Azuma 94)
  127. Dynamic errors in the application loop: Tracking (x,y,z and r,p,y) at 20 Hz = 50 ms; Calculate Viewpoint at 500 Hz = 2 ms; Render Scene (with Simulation) at 30 Hz = 33 ms; Draw to Display at 60 Hz = 17 ms. Total delay = 50 + 2 + 33 + 17 = 102 ms; at 1 ms of delay = 1/3 mm, that is roughly 34 mm of error
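The delay arithmetic above is easy to reproduce: each stage contributes one period of its update rate, and the 1/3 mm-per-ms figure is Azuma's rule of thumb from the earlier slide:

```python
# Per-stage delay = one period of that stage's update rate (ms).
stage_hz = {"tracking": 20, "viewpoint": 500, "render": 30, "display": 60}
delays_ms = {name: 1000.0 / hz for name, hz in stage_hz.items()}

total_ms = sum(delays_ms.values())   # 50 + 2 + 33.3 + 16.7 = 102 ms
error_mm = total_ms / 3.0            # rule of thumb: 1 ms ~ 1/3 mm of error
```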
  128. Reducing dynamic errors (1). Reduce system lag [Olano95] [Wloka95a] [Regan SIGGRAPH99]. Reduce apparent lag: image deflection [Burbidge89] [Regan94] [So92] [Kijima ISMR 2001]; image warping [Mark 3DI 97]
  129. Reducing System Lag in the application loop (Tracking, Calculate Viewpoint, Simulation, Render Scene, Draw to Display): faster tracker, faster CPU, faster GPU, faster display
  130. Reducing Apparent Lag: render the virtual display at a larger size (1280 x 960) than the physical display (640x480); render using the last known position, then use the latest position from a tracking update to choose which part of the virtual display to show. The application loop is otherwise unchanged: Tracking (x,y,z and r,p,y), Calculate Viewpoint, Simulation, Render Scene, Draw to Display
  131. Reducing dynamic errors (2). Match input streams (video): delay the video of the real world to match the system lag. Predictive tracking [Azuma94] [Emura94]: inertial sensors are helpful. Azuma / Bishop 1994
  132. Predictive Tracking: extrapolate from past positions to the position now and in the future; prediction up to 80 ms into the future is possible (Holloway)
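A minimal constant-velocity predictor illustrates the idea (hypothetical helper name; real systems use Kalman-filter-style predictors, often fed by inertial sensors, and accuracy degrades quickly beyond the ~80 ms horizon):

```python
def predict(samples, dt_ahead):
    """Constant-velocity predictor: extrapolate the future position from
    the last two (t, x) samples."""
    (t0, x0), (t1, x1) = samples[-2], samples[-1]
    v = (x1 - x0) / (t1 - t0)        # finite-difference velocity estimate
    return x1 + v * dt_ahead

# Head position sampled every 20 ms while moving at 0.5 units/s.
samples = [(0.00, 0.0), (0.02, 0.01), (0.04, 0.02)]
x_future = predict(samples, 0.08)    # predict 80 ms ahead
```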
  133. Predictive Tracking (Azuma 94)
  134. Wrap-up. Tracking and registration are key problems. Registration error: measures against static error and measures against dynamic error. AR typically requires multiple tracking technologies. Research areas: hybrid markerless techniques, deformable surfaces, mobile, outdoors
  135. More Information • Mark Billinghurst • Websites