2013 Lecture 3: AR Tracking

2013 COSC 426 Lecture 3 on AR Tracking, taught by Mark Billinghurst from the HIT Lab NZ at the University of Canterbury on July 26th, 2013.


  1. 1. COSC 426: Augmented Reality Mark Billinghurst mark.billinghurst@hitlabnz.org July 26th 2013 Lecture 3: AR Tracking
  2. 2. Key Points from Lecture 2
  3. 3. “The product is no longer the basis of value. The experience is.” Venkat Ramaswamy, The Future of Competition.
  4. 4. Gilmore + Pine: Experience Economy (Sony CSL © 2004). Value increases moving from components to products to services to experiences, shifting from Function toward Emotion.
  5. 5. Interaction Design is All About You   Users should be involved throughout the Design Process   Consider all the needs of the user
  6. 6. Interaction Design Process
  7. 7. Building Compelling AR Experiences: build up from components (Tracking, Display), through tools (Authoring) and applications (Interaction), to experiences (Usability).
  8. 8. Optical see-through head-mounted display: virtual images from monitors are overlaid on the real world through optical combiners.
  9. 9. Video see-through HMD: video cameras capture the real world, graphics are merged with the video in a combiner, and the result is shown on the HMD monitors.
  10. 10. Monitor-based AR: video cameras capture the real world, graphics are merged with the video in a combiner, and the result is shown on a monitor (optionally viewed with stereo glasses).
  11. 11. AR Tracking and Registration
  12. 12.   Registration   Positioning virtual object wrt real world   Tracking   Continually locating the user's viewpoint -  Position (x,y,z) -  Orientation (r,p,y)
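To make the pose terminology concrete (a sketch of my own, not from the slides), the tracked position (x, y, z) and orientation (roll, pitch, yaw) can be combined into a single 4x4 homogeneous transform that places the viewpoint, or a virtual object, in the world:

```python
import numpy as np

def pose_matrix(x, y, z, roll, pitch, yaw):
    """Build a 4x4 homogeneous transform from a 6-DOF pose.

    Position is (x, y, z); orientation is roll/pitch/yaw in radians,
    composed in yaw-pitch-roll (Z-Y-X) order.
    """
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)

    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about X
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about Y
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw about Z

    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = [x, y, z]
    return T

# e.g. a virtual object 2 m in front of the viewer, rotated 30 degrees in yaw
print(pose_matrix(0.0, 0.0, 2.0, 0.0, 0.0, np.radians(30)))
```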
  13. 13. Tracking
  14. 14. Tracking Requirements   Augmented Reality Information Display can be World Stabilized, Body Stabilized, or Head Stabilized   Tracking requirements increase from Head Stabilized to Body Stabilized to World Stabilized
  15. 15. Tracking Technologies  Active •  Mechanical, Magnetic, Ultrasonic •  GPS, WiFi, cell location  Passive •  Inertial sensors (compass, accelerometer, gyro) •  Computer Vision •  Marker based, Natural feature tracking  Hybrid Tracking •  Combined sensors (e.g. Vision + Inertial)
  16. 16. AR Tracking Taxonomy   Indoor Environment: Limited Range (e.g. ARToolKit: low accuracy at 15-60 Hz; e.g. IVRD: high accuracy & high speed hybrid tracking), Extended Range (e.g. HiBall: many fiducials in space/time, but no GPS)   Outdoor Environment: Not Hybridized (GPS or camera or compass: low accuracy & not robust, e.g. WLVA), Hybrid Tracking (GPS and camera and compass: high accuracy & robust, e.g. BARS)
  17. 17. Tracking Types: Mechanical Tracker, Magnetic Tracker, Inertial Tracker, Ultrasonic Tracker, Optical Tracker (Specialized Tracking, Marker-Based Tracking, Markerless Tracking: Edge-Based, Template-Based, Interest Point Tracking)
  18. 18. Mechanical Tracker   Idea: mechanical arms with joint sensors   ++: high accuracy, haptic feedback   -- : cumbersome, expensive Microscribe
  19. 19. Magnetic Tracker   Idea: measure the magnetic field coupling between a transmitter and a receiver   ++: 6DOF, robust   -- : wired, sensitive to metal, noisy, expensive Flock of Birds (Ascension)
  20. 20. Magnetic Tracking Error
  21. 21. Ultrasonic Tracker   Idea: Time of Flight or Phase-Coherence of Sound Waves   ++: Small, Cheap   -- : 3DOF, Line of Sight, Low resolution, Affected by Environmental Conditions (pressure, temperature) Ultrasonic Logitech IS600
  22. 22. Inertial Tracker   Idea: measure linear acceleration and angular rate (accelerometer/gyroscope)   ++: no transmitter, cheap, small, high frequency, wireless   -- : drift, hysteresis, only 3DOF IS300 (Intersense) Wii Remote
  23. 23. Mobile Sensors   Inertial compass   Uses the Earth's magnetic field   Measures absolute orientation   Accelerometers   Measure acceleration along each axis   Used for tilt, relative rotation   Can drift over time
  24. 24. Global Positioning System (GPS)   Created by US in 1978   Currently 29 satellites   Satellites send position + time   GPS Receiver positioning   4 satellites need to be visible   Differential time of arrival   Triangulation   Accuracy   5-30m+, blocked by weather, buildings etc
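As an illustration of the positioning step (my own sketch, not from the lecture), the receiver position and its clock bias can be solved jointly from the pseudoranges to four or more visible satellites with a few Gauss-Newton least-squares iterations; the extra clock-bias unknown is why a fourth satellite is needed:

```python
import numpy as np

def gps_fix(sat_pos, pseudoranges, iters=10):
    """Least-squares GPS fix from >= 4 satellites.

    sat_pos:       (N, 3) satellite positions in metres (ECEF frame)
    pseudoranges:  (N,)   measured ranges in metres, all offset by the
                          same unknown receiver clock bias
    Returns the estimated receiver position (3,) and clock bias (metres).
    """
    x = np.zeros(3)   # start at the Earth's centre; converges quickly
    b = 0.0           # receiver clock bias expressed in metres
    for _ in range(iters):
        diffs = x - sat_pos                      # (N, 3)
        ranges = np.linalg.norm(diffs, axis=1)   # geometric ranges
        residual = pseudoranges - (ranges + b)
        # Jacobian: unit vectors from each satellite toward the receiver,
        # plus a column of ones for the clock-bias unknown
        J = np.hstack([diffs / ranges[:, None], np.ones((len(ranges), 1))])
        delta, *_ = np.linalg.lstsq(J, residual, rcond=None)
        x += delta[:3]
        b += delta[3]
    return x, b
```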
  25. 25. Problems with GPS   Takes time to get satellite fix   Satellites moving around   Earth's atmosphere affects signal   Assumes consistent speed (the speed of light)   Delay depends on where you are on Earth   Weather effects   Signal reflection   Multi-path reflection off buildings   Signal blocking   Trees, buildings, mountains   Satellites send out bad data   Misreport their own position
  26. 26. Accurate to < 5cm close to base station (22m/100 km) Expensive - $20-40,000 USD
  27. 27. Assisted-GPS (A-GPS)   Use an external location server to send GPS assistance data   GPS receivers on cell towers, etc   Sends precise satellite position (Ephemeris)   Speeds up GPS Tracking   Makes it faster to search for satellites   Provides navigation data (no need to decode it on the phone)   Other benefits   Provides support for indoor positioning   Can use cheaper GPS hardware   Uses less battery power on device
  28. 28. Assisted GPS
  29. 29. Cell Tower Triangulation   Calculate phone position from signal strength   < 50 m in cities   > 1 km in rural areas
  30. 30. WiFi Positioning   Estimate location by using WiFi access points   Can use known locations of WiFi access points   Triangulate through signal strength   E.g. PlaceEngine (www.placeengine.com)   Client software for PC and mobiles   SDK returns position   Accuracy   5 – 100m (depends on WiFi density)
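The slide does not spell out an algorithm (and PlaceEngine's real method is more sophisticated), but a hypothetical weighted-centroid sketch shows the basic idea: access points with stronger received signal are assumed to be closer and pull the estimate toward their surveyed locations.

```python
import numpy as np

def wifi_weighted_centroid(ap_positions, rssi_dbm, path_loss_exp=2.5):
    """Rough position estimate from surveyed Wi-Fi access point locations.

    ap_positions: (N, 2) AP coordinates in metres (from a location database)
    rssi_dbm:     (N,)   received signal strengths in dBm
    Uses a log-distance path-loss model to turn RSSI into a relative
    distance, then weights each AP by the inverse of that distance.
    """
    rel_dist = 10.0 ** (-np.asarray(rssi_dbm) / (10.0 * path_loss_exp))
    weights = 1.0 / rel_dist
    weights /= weights.sum()
    return weights @ np.asarray(ap_positions)

# Hypothetical example: three surveyed APs and the RSSI measured from each
aps = [(0.0, 0.0), (30.0, 0.0), (15.0, 25.0)]
rssi = [-45.0, -70.0, -60.0]
print(wifi_weighted_centroid(aps, rssi))   # estimate lands near the strongest AP
```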
  31. 31. WiFi Hotspots in New York
  32. 32. Indoor WiFi Location Sensing   Indoor Location   Asset, people tracking   Aeroscout   http://aeroscout.com/   WiFi + RFID   Ekahau   http://www.ekahau.com/   WiFi + LED tracking
  33. 33. Integrated Systems   Combine GPS, Cell tower, WiFi signals   Skyhook (www.skyhookwireless.com)   Core Engine   Database of known locations   700 million Wi-Fi access points and cellular towers.
  34. 34. Comparative Accuracies   Study testing iPhone 3GS cf. low cost GPS   A-GPS: 8 m error   WiFi: 74 m error   Cell Tower Positioning: 600 m error   Source: Zandbergen, P. A. (2009). Accuracy of iPhone Locations: A Comparison of Assisted GPS, WiFi, and Cellular Positioning. Transactions in GIS, 13(s1), 5-25.
  35. 35. Optical Tracking
  36. 36. Optical Tracker   Idea: Image Processing and Computer Vision   Specialized   Infrared, Retro-Reflective, Stereoscopic   Monocular Based Vision Tracking ART Hi-Ball
  37. 37. Outside-In vs. Inside-Out Tracking
  38. 38. Optical Tracking Technologies   Scalable active trackers   InterSense IS-900, 3rd Tech HiBall   Passive optical computer vision   Line of sight, may require landmarks   Can be brittle.   Computer vision is computationally-intensive 3rd Tech, Inc.
  39. 39. HiBall Tracking System (3rd Tech)   Inside-Out Tracker   $50K USD   Scalable over large area   Fast update (2000Hz)   Latency Less than 1 ms.   Accurate   Position 0.4mm RMS   Orientation 0.02° RMS
  40. 40. Starting simple: Marker tracking   Has been done for more than 10 years   A square marker provides 4 corners   Enough for pose estimation!   Several open source solutions exist   Fairly simple to implement   Standard computer vision methods
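As a sketch of that pose-estimation step (not ARToolKit's own implementation; it assumes the camera intrinsics come from a prior calibration), OpenCV's solvePnP recovers the marker pose from the four detected corner pixels and the marker's known physical size:

```python
import numpy as np
import cv2

def marker_pose(corners_px, marker_size, camera_matrix, dist_coeffs):
    """Estimate marker pose in the camera frame from its 4 corners.

    corners_px:   (4, 2) detected corner pixels, ordered top-left,
                  top-right, bottom-right, bottom-left
    marker_size:  physical edge length of the square marker (metres)
    Returns a rotation vector and translation vector (camera coordinates).
    """
    s = marker_size / 2.0
    # 3D corner coordinates in the marker's own frame (the z = 0 plane)
    object_pts = np.array([[-s,  s, 0],
                           [ s,  s, 0],
                           [ s, -s, 0],
                           [-s, -s, 0]], dtype=np.float32)
    image_pts = np.asarray(corners_px, dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts,
                                  camera_matrix, dist_coeffs)
    return rvec, tvec
```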
  41. 41. Marker Based Tracking: ARToolKit http://artoolkit.sourceforge.net/
  42. 42. Tracking Range with Pattern Size Rule of thumb – range = 10 x pattern width
  43. 43. Tracking Error with Range
  44. 44. Tracking Error with Angle
  45. 45. Tracking challenges in ARToolKit False positives and inter-marker confusion (image by M. Fiala) Image noise (e.g. poor lens, block coding / compression, neon tube) Unfocused camera, motion blur Dark/unevenly lit scene, vignetting Jittering (Photoshop illustration) Occlusion (image by M. Fiala)
  46. 46. Limitations of ARToolKit   Partial occlusions cause tracking failure   Affected by lighting and shadows   Tracking range depends on marker size   Performance depends on number of markers   cf ARTag, ARToolKitPlus   Pose accuracy depends on distance to marker   Pose accuracy depends on angle to marker
  47. 47. Tracking, Tracking, Tracking
  48. 48. Other Marker Tracking Libraries   ARTag   http://www.artag.net/   ARToolKitPlus [Discontinued]   http://studierstube.icg.tu-graz.ac.at/handheld_ar/artoolkitplus.php   stbTracker   http://studierstube.icg.tu-graz.ac.at/handheld_ar/stbtracker.php   MXRToolKit   http://sourceforge.net/projects/mxrtoolkit/
  49. 49. Markerless Tracking
  50. 50. Markerless Tracking   No more markers!   (From the tracking taxonomy: the Markerless Tracking branch of the Optical Tracker covers Edge-Based Tracking, Template-Based Tracking, and Interest Point Tracking)
  51. 51. Natural feature tracking   Tracking from features of the surrounding environment   Corners, edges, blobs, ...   Generally more difficult than marker tracking   Markers are designed for their purpose   The natural environment is not…   Less well-established methods   Usually much slower than marker tracking
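For illustration only (the slides do not prescribe a method), here is a minimal natural-feature matching sketch using OpenCV's ORB detector and a brute-force matcher; a full tracker would go on to estimate a homography or pose from the matches and reject outliers with RANSAC:

```python
import cv2

def match_natural_features(target_img, camera_frame, min_matches=10):
    """Match corner-like keypoints between a target image and a camera frame.

    Detects ORB keypoints/descriptors in both images, matches them with a
    brute-force Hamming matcher, and keeps matches passing Lowe's ratio test.
    """
    orb = cv2.ORB_create(nfeatures=1000)
    kp_t, des_t = orb.detectAndCompute(target_img, None)
    kp_c, des_c = orb.detectAndCompute(camera_frame, None)
    if des_t is None or des_c is None:
        return []

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    candidates = matcher.knnMatch(des_t, des_c, k=2)
    good = [pair[0] for pair in candidates
            if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance]
    return good if len(good) >= min_matches else []
```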
  52. 52. Natural Feature Tracking   Use Natural Cues of Real Elements   Edges   Surface Texture   Interest Points   Model or Model-Free   ++: no visual pollution   (Examples: Contours, Feature Points, Surfaces)
  53. 53. Texture Tracking
  54. 54. Edge Based Tracking   RAPiD [Drummond et al. 02]   Initialization, Control Points, Pose Prediction (Global Method)
  55. 55. Line Based Tracking   Visual Servoing [Comport et al. 2004]
  56. 56. Model Based Tracking   Track from 3D model   E.g. OpenTL - www.opentl.org   General purpose library for model based visual tracking
  57. 57. Marker vs. natural feature tracking   Marker tracking   + Can work without a stored image database   + Markers can be an eye-catcher   + Tracking is less demanding   - The environment must be instrumented with markers   - Markers usually work only when fully in view   Natural feature tracking   - A database of keypoints must be stored/downloaded   + Natural feature targets might catch the attention less   + Natural feature targets are potentially everywhere   + Natural feature targets work also if partially in view
  58. 58. Hybrid Tracking
  59. 59. Sensor tracking   Used by many “AR browsers”   GPS, Compass, Accelerometer, (Gyroscope)   Not sufficient alone (drift, interference)
  60. 60. Outdoor Hybrid Tracking   Combines   computer vision -  natural feature tracking   inertial gyroscope sensors   Both correct for each other   Inertial gyro - provides frame to frame prediction of camera orientation   Computer vision - correct for gyro drift
  61. 61. Combining Sensors and Vision   Sensors -  Produce noisy output (= jittering augmentations) -  Are not sufficiently accurate (= wrongly placed augmentations) -  Give us a first estimate of where we are in the world and what we are looking at   Vision -  Is more accurate (= stable and correct augmentations) -  Requires choosing the correct keypoint database to track from -  Requires registering our local coordinate frame (online-generated model) to the global one (world)
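A toy illustration of the prediction/correction loop described in the two slides above (a 1-D sketch of my own; real outdoor systems fuse full 3-D orientation, typically with a Kalman filter): the gyro integrates smoothly from frame to frame, and each vision measurement pulls the estimate back, cancelling the accumulated drift.

```python
def fuse_yaw(prev_yaw, gyro_rate, dt, vision_yaw=None, alpha=0.98):
    """One step of a complementary filter fusing gyro and vision (1-D yaw).

    prev_yaw:   previous fused yaw estimate (radians)
    gyro_rate:  angular rate from the gyroscope (rad/s)
    dt:         time since the last update (s)
    vision_yaw: absolute yaw from the vision tracker, or None if the
                tracker produced no measurement this frame
    alpha:      how much to trust the gyro prediction (close to 1.0)
    """
    predicted = prev_yaw + gyro_rate * dt        # frame-to-frame prediction
    if vision_yaw is None:
        return predicted                         # vision lost: gyro only
    return alpha * predicted + (1.0 - alpha) * vision_yaw  # drift correction
```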
  62. 62. Outdoor AR Tracking System You, Neumann, Azuma outdoor AR system (1999)
  63. 63. Robust Outdoor Tracking   Hybrid Tracking   Computer Vision, GPS, inertial   Going Out   Reitmayr & Drummond (Univ. Cambridge)
  64. 64. Handheld Display
  65. 65. Registration
  66. 66. Spatial Registration
  67. 67. The Registration Problem   Virtual and Real must stay properly aligned   If not:   Breaks the illusion that the two coexist   Prevents acceptance of many serious applications
  68. 68. Sources of registration errors   Static errors   Optical distortions   Mechanical misalignments   Tracker errors   Incorrect viewing parameters   Dynamic errors   System delays (largest source of error) -  1 ms delay = 1/3 mm registration error
  69. 69. Reducing static errors   Distortion compensation   Manual adjustments   View-based or direct measurements   Camera calibration (video)
  70. 70. View Based Calibration (Azuma 94)
  71. 71. Dynamic errors   Total Delay = 50 + 2 + 33 + 17 = 102 ms   At 1/3 mm of registration error per 1 ms of delay, that gives roughly 33 mm of error   Application Loop stages: Tracking (20 Hz = 50 ms), Calculate Viewpoint / Simulation (500 Hz = 2 ms), Render Scene (30 Hz = 33 ms), Draw to Display (60 Hz = 17 ms)
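The latency arithmetic written out (using the slide's 1/3 mm-per-ms rule of thumb for typical head motion):

```python
# Each pipeline stage running at f Hz contributes up to 1/f seconds of delay.
stage_rates_hz = {
    "tracking": 20,               # 50 ms
    "calculate viewpoint": 500,   #  2 ms
    "render scene": 30,           # 33 ms
    "draw to display": 60,        # 17 ms
}

total_delay_ms = sum(1000.0 / hz for hz in stage_rates_hz.values())
error_mm = total_delay_ms / 3.0   # ~1/3 mm of registration error per ms of delay

print(f"total delay = {total_delay_ms:.0f} ms")    # 102 ms
print(f"registration error = {error_mm:.0f} mm")   # ~34 mm (the slide rounds to 33)
```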
  72. 72. Reducing dynamic errors (1)   Reduce system lag   Faster components/system modules   Reduce apparent lag   Image deflection   Image warping
  73. 73. Reducing System Lag   Speed up every stage of the Application Loop: faster Tracker (tracking), faster CPU (calculate viewpoint / simulation), faster GPU (render scene), faster Display (draw to display)
  74. 74. Reducing Apparent Lag   Render the scene into a larger virtual display (e.g. 1280 x 960), then just before display select the physical display window (640 x 480) using the latest tracked position rather than the last known position from when rendering started   The tracking update runs alongside the normal Application Loop (Tracking, Calculate Viewpoint, Simulation, Render Scene, Draw to Display)
  75. 75. Reducing dynamic errors (2)   Match input streams (video)   Delay video of real world to match system lag   Predictive Tracking   Inertial sensors helpful Azuma / Bishop 1994
  76. 76. Predictive Tracking   Extrapolate the position from past measurements into the future   Can predict up to 80 ms in the future (Holloway)
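A minimal constant-velocity predictor illustrating the idea (a sketch only; Azuma's predictor combines inertial sensing with more careful motion models): render for where the head will be when the frame actually reaches the display.

```python
def predict_position(position, velocity, latency_s):
    """Constant-velocity extrapolation of a tracked position.

    position:  latest tracked position, e.g. (x, y, z) in metres
    velocity:  velocity estimated from recent tracker samples (m/s)
    latency_s: end-to-end system latency to compensate for (seconds)
    """
    return tuple(p + v * latency_s for p, v in zip(position, velocity))

# e.g. compensate for ~80 ms, roughly the usable prediction horizon
# Holloway reports (per the slide)
predicted = predict_position((0.10, 1.60, 0.00), (0.5, 0.0, -0.2), 0.080)
```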
  77. 77. Predictive Tracking (Azuma 94)
  78. 78. Wrap-up   Tracking and Registration are key problems   Registration error   Measures against static error   Measures against dynamic error   AR typically requires multiple tracking technologies   Research Areas: Hybrid Markerless Techniques, Deformable Surface, Mobile, Outdoors
  79. 79. Project List   Mobile   Hybrid Tracking for Outdoor AR   City Scale AR Visualization   Outdoor AR Authoring Tool   Outdoor AR collaborative game   AR interaction for Google Glass   Non-Mobile   AR Face Painting   AR Authoring Tool   Tangible AR puppeteer studio   Gesture based interaction with AR content
  80. 80. More Information •  Mark Billinghurst –  mark.billinghurst@hitlabnz.org •  Websites –  www.hitlabnz.org
