COSC 426 Lecture 2: AR Technology
The second lecture for the COSC 426 postgraduate course on Augmented Reality.

Transcript

  • 1. COSC 426: Augmented Reality – Mark Billinghurst, mark.billinghurst@hitlabnz.org, July 18th 2012 – Lecture 2: AR Technology
  • 2. Key Points from Lecture 1
  • 3. Augmented Reality Definition  Defining Characteristics [Azuma 97]   Combines Real and Virtual Images -  Both can be seen at the same time   Interactive in real-time -  Virtual content can be interacted with   Registered in 3D -  Virtual objects appear fixed in space
  • 4. What is not Augmented Reality?  Location-based services  Barcode detection (QR-codes)  Augmenting still images  Special effects in movies  …  … but they can be combined with AR!
  • 5. Milgram’s Reality-Virtuality Continuum – the Reality-Virtuality (RV) Continuum runs from the Real Environment through Augmented Reality (AR) and Augmented Virtuality (AV) to the Virtual Environment; AR and AV together form Mixed Reality.
  • 6. Metaverse
  • 7. AR History Summary  1960’s – 80’s: Early Experimentation  1980’s – 90’s: Basic Research   Tracking, displays  1995 – 2005: Tools/Applications   Interaction, usability, theory  2005 - : Commercial Applications   Games, Medical, Industry
  • 8. Applications  Medicine  Manufacturing  Information overlay  Architecture  Museum  Marketing  Gaming
  • 9. AR Technology
  • 10. “The product is no longer the basis of value. The experience is.” Venkat Ramaswamy The Future of Competition.
  • 11. Gilmore + Pine: Experience Economy – a value pyramid rising from components to products to services to experiences, moving from Function through Value to Emotion. (Sony CSL © 2004)
  • 12. Building Compelling AR Experiences – a layered stack: components (Tracking, Display) → tools (Authoring) → applications (Interaction) → experiences (Usability).
  • 13. Building Compelling AR Experiences – the same stack, with this lecture focused on the components layer: Display and Tracking. (Sony CSL © 2004)
  • 14. AR Technology  Key Technologies: Display, Tracking, Input, Processing.
  • 15. AR Displays
  • 16. AR Displays – a taxonomy of AR visual displays: split first by environment (primarily indoor vs. primarily outdoor / daylight environments) and then by form factor (head-mounted display (HMD) vs. not head-mounted, e.g. vehicle mounted). Technologies and examples include virtual images seen off windows using a beamsplitter (e.g. window reflections), Cathode Ray Tube (CRT) and projection displays (e.g. Reach-In), Liquid Crystal Displays (LCDs) (e.g. Shared Space, Magic Book), the Virtual Retinal Display (VRD) for military and assistive applications (e.g. WLVA and IVRD), and Head-Up Displays (HUD) for navigational aids in cars and military airborne applications.
  • 17. Head Mounted Displays
  • 18. Head Mounted Displays (HMD) -  Display and Optics mounted on Head -  May or may not fully occlude real world -  Provide full-color images -  Considerations •  Cumbersome to wear •  Brightness •  Low power consumption •  Resolution limited •  Cost is high?
  • 19. Key Properties of HMD  Field of View   Human eye 95 degrees horizontal, 60/70 degrees vertical  Resolution   > 320x240 pixel  Refresh Rate  Focus   Fixed/manual  Power  Size
  • 20. Types of Head Mounted Displays: Occluded, See-thru, Multiplexed.
  • 21. Immersive VR Architecture – block diagram: the head tracker sends head position/orientation to the host processor (with database model, connected to the network); the rendering engine writes to the frame buffer, the display driver feeds a non see-thru image source and optics, and the user sees only the virtual object in the virtual world.
  • 22. See-thru AR Architecture – block diagram: the same pipeline (head tracker → host processor with database model → rendering engine → frame buffer → display driver → image source), but the image source feeds a see-thru combiner, so the virtual image is superimposed over the real world object.
  • 23. Optical see-through head-mounted display – virtual images from monitors are merged with the real world view through optical combiners.
  • 24. Optical See-Through HMD
  • 25. Optical see-through HMDs: Virtual Vision VCAP, Sony Glasstron
  • 26. View Through Optical See-Through HMD
  • 27. DigiLens   Compact HOE   Solid state optics   Switchable Bragg Grating   Stacked SBG   Fast switching   Ultra compact  www.digilens.com
  • 28. Google Glasses
  • 29. The Virtual Retinal Display  Image scanned onto retina  Commercialized through Microvision   Nomad System - www.mvis.com
  • 30. Strengths of optical AR  Simpler (cheaper)  Direct view of real world   Full resolution, no time delay (for real world)   Safety   Lower distortion  No eye displacement   but COASTAR video see-through avoids this
  • 31. Video AR Architecture – block diagram: a head-mounted camera aligned to the display optics feeds a video image of the real world to a video processor; the head tracker sends head position/orientation to the host processor; the graphics renderer output and the camera video are combined in a digital mixer, written to the frame buffer, and shown via the display driver on a non see-thru image source and optics, so the virtual image is inset into video of the real world.
  • 32. Video see-through HMD – video cameras capture the real world; a video/graphics combiner merges camera video with graphics, which is shown on the HMD monitors.
  • 33. Video See-Through HMD
  • 34. Video see-through HMD – MR Laboratory’s COASTAR HMD (Co-Optical Axis See-Through Augmented Reality), a parallax-free video see-through HMD.
  • 35. TriVisio  www.trivisio.com  Stereo video input   PAL resolution cameras  2 x SVGA displays   30 degree FOV   User adjustable convergence  $6,000 USD
  • 36. View Through a Video See-Through HMD
  • 37. Vuzix Display  www.vuzix.com  Wrap 920  $350 USD  Twin 640 x 480 LCD displays  31 degree diagonal field of view  Weighs less than three ounces
  • 38. Strengths of Video AR  True occlusion   Kiyokawa optical display that supports occlusion  Digitized image of real world   Flexibility in composition   Matchable time delays   More registration, calibration strategies  Wide FOV is easier to support
  • 39. Optical vs. Video AR Summary  Both have proponents  Video is more popular today?   Likely because lack of available optical products  Depends on application?   Manufacturing: optical is cheaper   Medical: video for calibration strategies
  • 40. Eye multiplexed AR Architecture – block diagram: the head tracker sends head position/orientation to the host processor (with database model, connected to the network); rendering engine → frame buffer → display driver → opaque image source, so the virtual image is inset into the user’s view of the real world scene.
  • 41. Virtual Image ‘inset’ into real
  • 42. Virtual Vision Personal Eyewear
  • 43. Virtual image inset into real world
  • 44. Spatial/Projected AR
  • 45. Spatial Augmented Reality  Project onto irregular surfaces   Geometric Registration   Projector blending, High dynamic range  Book: Bimber, Raskar “Spatial Augmented Reality”
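As an aside on the geometric registration bullet, the planar-surface case can be handled with a single homography between the content image and the projector framebuffer. The sketch below uses OpenCV; the corner correspondences, sizes, and file names are illustrative assumptions, and truly irregular surfaces need the per-pixel warping described in the book rather than one homography.

```python
import cv2
import numpy as np

# Content image: what should appear undistorted on the projection surface.
content = np.full((480, 640, 3), 255, np.uint8)
cv2.putText(content, "AR overlay", (50, 240), cv2.FONT_HERSHEY_SIMPLEX, 2, (0, 0, 255), 3)

# Corner correspondences: content corners -> where those corners must land in the
# projector's framebuffer so they hit the right spots on the (planar) surface.
src = np.float32([[0, 0], [640, 0], [640, 480], [0, 480]])
dst = np.float32([[120, 80], [900, 60], [940, 620], [100, 660]])   # measured, illustrative

H, _ = cv2.findHomography(src, dst)
projector_frame = cv2.warpPerspective(content, H, (1024, 768))     # image sent to the projector
cv2.imwrite("projector_frame.png", projector_frame)
```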
  • 46. Projector-based AR – a projector illuminates real objects covered with retroreflective material; the user may be head-tracked. Examples: Raskar, MIT Media Lab; Inami, Tachi Lab, U. Tokyo.
  • 47. Example of projector-based AR Ramesh Raskar, UNC, MERL
  • 48. Example of projector-based AR Ramesh Raskar, UNC Chapel Hill
  • 49. The I/O Bulb  Projector + Camera   John Underkoffler, Hiroshi Ishii   MIT Media Lab
  • 50. Head Mounted Projector  Head Mounted Projector   Jannick Rolland (UCF)  Retro-reflective Material   Potentially portable
  • 51. Head Mounted Projector  NVIS P-50 HMPD   1280x1024/eye   Stereoscopic   50 degree FOV   www.nvis.com
  • 52. HMD vs. HMPD – Head Mounted Display vs. Head Mounted Projected Display
  • 53. Pico Projectors  Microvision - www.mvis.com  3M, Samsung, Philips, etc
  • 54. MIT Sixth Sense  Body worn camera and projector  http://www.pranavmistry.com/projects/sixthsense/
  • 55. Other AR Displays
  • 56. Video Monitor AR – video cameras feed a video/graphics combiner; the combined image is shown on a monitor and viewed with stereo glasses.
  • 57. Examples
  • 58. Virtual Showcase  Mirrors on a projection table   Head tracked stereo   Up to 4 users   Merges graphic and real objects   Exhibit/museum applications  Fraunhofer Institute (2001)   Bimber, Frohlich
  • 59. Augmented Paleontology – Bimber et al., IEEE Computer, Sept. 2002
  • 60. Alternate Displays: LCD Panel, Laptop, PDA
  • 61. Handheld Displays  Mobile Phones   Camera   Display   Input
  • 62. Display Taxonomy
  • 63. Other Types of AR Display  Audio   spatial sound   ambient audio  Tactile   physical sensation  Haptic   virtual touch
  • 64. Haptic Input  AR Haptic Workbench   CSIRO 2003 – Adcock et al.
  • 65. Phantom  Sensable Technologies (www.sensable.com)  6 DOF Force Feedback Device
  • 66. AR Haptic Interface  Phantom, ARToolKit, Magellan
  • 67. AR Tracking and Registration
  • 68.   Registration   Positioning virtual objects with respect to the real world  Tracking   Continually locating the user’s viewpoint -  Position (x,y,z) -  Orientation (r,p,y)
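The registration/tracking relationship above can be made concrete with a small sketch. Below is a minimal Python example (NumPy only; the function and variable names are illustrative, not from the lecture) that builds a rigid transform from a tracked position (x, y, z) and orientation (r, p, y) and uses it to express a world-fixed virtual point in the viewer's coordinate frame, which is what registration needs every frame.

```python
import numpy as np

def rotation_rpy(roll, pitch, yaw):
    """Rotation matrix from roll/pitch/yaw in radians, composed as Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def world_from_head(position, rpy):
    """4x4 rigid transform of the tracked head/camera expressed in world coordinates."""
    T = np.eye(4)
    T[:3, :3] = rotation_rpy(*rpy)
    T[:3, 3] = position
    return T

# Tracker reports the viewpoint: position (x, y, z) and orientation (r, p, y).
head_pose = world_from_head(position=[0.1, 1.6, 0.0], rpy=[0.0, 0.05, 0.2])

# Registration: express a virtual object fixed in world space in head/camera space,
# so that when rendered it appears anchored to the real world.
virtual_point_world = np.array([0.0, 1.0, -2.0, 1.0])          # homogeneous world coordinates
virtual_point_head = np.linalg.inv(head_pose) @ virtual_point_world
print(virtual_point_head[:3])
```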
  • 69. Registration
  • 70. Spatial Registration
  • 71. The Registration Problem  Virtual and Real must stay properly aligned  If not:   Breaks the illusion that the two coexist   Prevents acceptance of many serious applications
  • 72. Sources of registration errors  Static errors   Optical distortions   Mechanical misalignments   Tracker errors   Incorrect viewing parameters  Dynamic errors   System delays (largest source of error) -  1 ms delay = 1/3 mm registration error
  • 73. Reducing static errors  Distortion compensation  Manual adjustments  View-based or direct measurements  Camera calibration (video)
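For the "camera calibration (video)" bullet, one common way to recover a video see-through camera's intrinsics and lens distortion is OpenCV's chessboard calibration. This is a hedged sketch of that general approach, not the procedure used in the course; the board size, image folder, and variable names are assumptions.

```python
import cv2
import glob
import numpy as np

# Calibrate a video see-through camera from chessboard images (9x6 inner corners assumed).
pattern = (9, 6)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)   # board coords in square units

obj_points, img_points = [], []
image_size = None
for path in glob.glob("calib/*.png"):                # hypothetical folder of calibration images
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        image_size = gray.shape[::-1]

# Intrinsics (focal length, principal point) and lens distortion coefficients.
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, image_size, None, None)
print("reprojection error:", ret)
print("camera matrix:\n", K)
```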
  • 74. View Based Calibration (Azuma 94)
  • 75. Dynamic errors – the application loop: Tracking (x,y,z / r,p,y) → Calculate Viewpoint → Render Scene → Draw to Display, driven by the Simulation. Typical stage rates: Simulation 20 Hz = 50 ms, Tracking 500 Hz = 2 ms, Rendering 30 Hz = 33 ms, Display 60 Hz = 17 ms.   Total Delay = 50 + 2 + 33 + 17 = 102 ms   At 1/3 mm per ms of delay this is roughly 34 mm of registration error
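The delay arithmetic on this slide can be reproduced directly; the short sketch below sums the per-stage latencies from the update rates and applies the slide's rule of thumb of 1/3 mm of registration error per millisecond of delay.

```python
# End-to-end latency from the per-stage update rates on the slide.
stage_hz = {"simulation": 20, "tracking": 500, "rendering": 30, "display": 60}

stage_ms = {name: 1000.0 / hz for name, hz in stage_hz.items()}     # 50 + 2 + 33.3 + 16.7
total_ms = sum(stage_ms.values())

ERROR_MM_PER_MS = 1.0 / 3.0   # slide's rule of thumb: 1 ms of delay ~ 1/3 mm registration error
print(f"total delay        : {total_ms:.0f} ms")                       # ~102 ms
print(f"registration error : {total_ms * ERROR_MM_PER_MS:.0f} mm")     # ~34 mm
```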
  • 76. Reducing dynamic errors (1)  Reduce system lag   Faster components/system modules  Reduce apparent lag   Image deflection   Image warping
  • 77. Reducing System Lag – speed up every stage of the application loop: a faster tracker (Tracking x,y,z / r,p,y), a faster CPU (Calculate Viewpoint / Simulation), a faster GPU (Render Scene), and a faster display (Draw to Display).
  • 78. Reducing Apparent Lag – image deflection: render into a virtual display larger than the physical one (e.g. 1280 x 960 rendered vs. a 640 x 480 physical display) using the last known position from the application loop (Tracking x,y,z / r,p,y → Calculate Viewpoint → Render Scene → Draw to Display), then shift the physical display’s window within the larger image using the latest tracking update just before it is shown.
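A minimal sketch of the image-deflection idea above, under simplifying assumptions: only orientation change is compensated, and the pixels-per-radian factor is a made-up value that in practice depends on the display's field of view. The scene is rendered into the oversized buffer with the last known pose; just before scan-out the physical display's window is shifted inside that buffer using the latest tracker reading.

```python
import numpy as np

RENDER_SIZE = (1280, 960)      # oversized render target (pixels)
DISPLAY_SIZE = (640, 480)      # physical display window (pixels)
PIXELS_PER_RADIAN = 800.0      # illustrative value; depends on the display's field of view

def deflected_viewport(rendered_yaw_pitch, latest_yaw_pitch):
    """Shift the display window inside the oversized render to hide tracker-to-display lag."""
    d_yaw = latest_yaw_pitch[0] - rendered_yaw_pitch[0]
    d_pitch = latest_yaw_pitch[1] - rendered_yaw_pitch[1]
    # Centre of the big image, displaced by how far the head turned since rendering started.
    cx = RENDER_SIZE[0] / 2 + d_yaw * PIXELS_PER_RADIAN
    cy = RENDER_SIZE[1] / 2 - d_pitch * PIXELS_PER_RADIAN
    x0 = int(np.clip(cx - DISPLAY_SIZE[0] / 2, 0, RENDER_SIZE[0] - DISPLAY_SIZE[0]))
    y0 = int(np.clip(cy - DISPLAY_SIZE[1] / 2, 0, RENDER_SIZE[1] - DISPLAY_SIZE[1]))
    return x0, y0, DISPLAY_SIZE[0], DISPLAY_SIZE[1]

# Rendered with the last known orientation, scanned out with the latest one.
print(deflected_viewport(rendered_yaw_pitch=(0.00, 0.00), latest_yaw_pitch=(0.02, -0.01)))
```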
  • 79. Reducing dynamic errors (2)  Match input streams (video)   Delay video of real world to match system lag  Predictive Tracking   Inertial sensors helpful Azuma / Bishop 1994
  • 80. Predictive Tracking – extrapolate the tracked position from past samples into the near future; can predict up to 80 ms in the future (Holloway).
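A minimal sketch of predictive tracking under a constant-velocity assumption; real systems such as the Azuma / Bishop 1994 work use inertial sensors and more careful predictors (e.g. Kalman filters). The sample spacing and prediction horizon below are illustrative.

```python
import numpy as np

def predict_position(p_prev, p_now, dt_samples, dt_ahead):
    """Extrapolate position dt_ahead seconds into the future from the last two tracker samples,
    assuming constant velocity over the prediction horizon."""
    velocity = (np.asarray(p_now) - np.asarray(p_prev)) / dt_samples
    return np.asarray(p_now) + velocity * dt_ahead

# Tracker samples 5 ms apart; predict ~80 ms ahead (Holloway's practical upper bound).
p_prev = [0.100, 1.600, 0.000]
p_now = [0.102, 1.601, -0.001]
print(predict_position(p_prev, p_now, dt_samples=0.005, dt_ahead=0.080))
```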
  • 81. Predictive Tracking (Azuma 94)
  • 82. More Information – Mark Billinghurst, mark.billinghurst@hitlabnz.org. Websites: www.hitlabnz.org