
Google Glass, The META and Co. - How to calibrate your Optical See-Through Head Mounted Displays


Slides from our ISMAR 2014 tutorial http://stctutorial.icg.tugraz.at/

Abstract:
Head Mounted Displays such as Google Glass and the META have the potential to spur consumer-oriented Optical See-Through Augmented Reality applications. A correct spatial registration of those displays relative to a user’s eye(s) is an essential problem for any HMD-based AR application.

At our ISMAR 2014 tutorial we provide an overview of established and novel approaches for the calibration of those displays (OST calibration), including hands-on sessions in which participants calibrate such head mounted displays.



  1. Introduction to Optical See-Through HMD Calibration Jens Grubert (TU Graz) Yuta Itoh (TU Munich) jg@jensgrubert.de yuta.itoh@in.tum.de 9th Sep 2014
  2. Theory 14:15 Introduction to OST Calibration 15:00 coffee break 15:15 Details of OST Calibration 16:15 coffee break Practice 16:30 Hands on session: calibration of OST HMDs 17:30 Discussion: experiences, feedback 17:50 wrap-up, mailing list 18:00 end of tutorial
  3. Part 1 Theory: Introduction
  4. Head Mounted Displays (HMD) Non Optical See-Through Optical See-Through (OST)
  5. Issues with OST-HMDs Photo by Mikepanhu
  6. Consistency Photo by javier1949
  7. The Lack of Consistencies Spatial Social Visual Temporal
  8. Temporal Inconsist. in OST-HMD “latencies down to 2.38 ms are required to alleviate user perception when dragging” “How fast is fast enough? : a study of the effects of latency in direct-touch pointing tasks” Jota et al. CHI’13 https://www.youtube.com/watch?v=PCbSTj7LjJg
  9. Temporal Inconsist. in OST-HMD Digital Light Processing Projector “Minimizing Latency for Augmented Reality Displays: Frames Considered Harmful” Zheng et al. ISMAR’14
  10. Visual Consistency Occlusion, Depth, Color, Shadow, Lee and Woo, 2009 Kiyokawa et al. 2003 Liu et al. 2008
  11. Visual Consistency Wide Field of View, etc… “Pinlight Displays: Wide Field of View Augmented Reality Eyeglasses using Defocused Point Light Sources” Maimone et al., TOG’14
  12. Social Consistency Image from Google Glass: Don't Be A Glasshole | Mashable
  13. Social Consistency Image from http://www.thephoblographer.com/
  14. Spatial Inconsistency in OST-HMD Spatial registration → Calibration
  15. Spatial Inconsistency in OST-HMD
  16. Calibration of Eye & OST-HMD
  17. Camera Calibration Analogy Find 3D-2D Projection: Intrinsic
  18. OST-HMD’s Screen to Camera Find 3D-2D Projection: P = K*[R t]
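The camera analogy on slide 18 treats the eye-display system as a pinhole camera with projection P = K*[R t]. A minimal sketch of that projection, with illustrative (assumed) intrinsic and extrinsic values, not values from any real HMD:

```python
import numpy as np

# Illustrative pinhole intrinsics: focal lengths, zero skew, principal point.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

R = np.eye(3)                         # rotation: world -> eye/display frame
t = np.zeros((3, 1))                  # translation (assumed identity pose)

P = K @ np.hstack([R, t])             # 3x4 projection matrix P = K [R | t]

X = np.array([0.1, 0.05, 2.0, 1.0])   # homogeneous 3D point 2 m in front
u = P @ X
u = u[:2] / u[2]                      # dehomogenize to 2D pixel coordinates
```

The whole calibration problem of the tutorial is recovering this P for the user's eye, where the "image plane" is the HMD screen and the "camera center" is the eye.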
  19. In the Eye of the Beholder… We cannot see what you see! P = K*[R t]
  20. Manual alignment – Intensive user interaction – User-dependent noise [AZU97]
  21. P is a Black Box Find 3D-2D Projection: 3D 2D
  22. SPAAM: Single Point Active Alignment Method – Medium user interaction – User-dependent noise [TN00] [GTN02] N times
  23. You got a perfect P!!!
  24. Oops, sorry I touched your HMD…
  25. Essential Difficulties 1 Data acquisition 2 Dynamic parameter changes
  26. Part 2: Overview Data collection SPAAM, MPAAM, Stereo Calibration Confirmation Methods Evaluation State of the art Practical tips
  27. Data Collection: SPAAM
  28. Data Collection: MPAAM
  29. Data Collection: Stereo
  30. Confirmation Methods Keyboard Voice Handheld Waiting
  31. State of the Art
  32. Practical Tips
  33. Theory 14:15 Introduction to OST Calibration 15:00 coffee break 15:15 Details of OST Calibration 16:15 coffee break Practice 16:30 Hands on session: calibration of OST HMDs 17:30 Discussion: experiences, feedback 17:50 wrap-up, mailing list 18:00 end of tutorial
  34. Part 1 Theory: Details
  35. Data Collection Methods: SPAAM Single Point Active Alignment Method
  36. Eye-HMD Calibration 3D 2D
  37. P is a Black Box Find 3D-2D Projection: 2D 3D
  38. Say, P is a Perspective Projection 2D
  39. 2D-3D correspondences give P 2D 3D
  40. Only users can see the 2D points!
  41. SPAAM: Single Point Active Alignment Method – Medium user interaction – User-dependent noise [TN00] [GTN02] N times
  42. SPAAM: Single Point Active Alignment Method Minimum 6 pairs Better 16~20 pairs 3D 2D 3D 3D Better distributed in Z axis
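The estimation step behind SPAAM can be sketched as a standard Direct Linear Transform (DLT): each aligned 3D-2D pair contributes two linear equations in the entries of P, and the stacked system is solved via SVD. This is a generic textbook sketch, not the authors' implementation; the function and variable names are mine (see [TN00], [GTN02] for the actual method):

```python
import numpy as np

def estimate_projection_dlt(pts3d, pts2d):
    """Estimate a 3x4 projection matrix P from >= 6 3D-2D pairs via DLT."""
    A = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        # Two linear equations per correspondence, in the 12 entries of P.
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # Least-squares solution: right singular vector of the smallest
    # singular value of the stacked system.
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)
```

With the 16~20 well-distributed pairs the slide recommends, the same solve simply becomes overdetermined and more robust to user alignment noise.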
  43. Data Collection Methods: Stereo Calibration
  44. SPAAM: Calibration for a Single Display
  45. How to calibrate stereo systems?
  46. How to calibrate stereo systems? Idea 1: Calibrate each eye individually
  47. Calibrate each eye individually
  48. How to calibrate stereo systems? Idea 2: Calibrate both eyes simultaneously Why? Save time
  49. Calibrate both eyes simultaneously Idea 1. Display 2D objects with disparity in left and right eye → appears as a single object at a certain distance 2. Align virtual with physical 3D object → get point correspondences for both eyes
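The depth at which the fused virtual object appears is controlled by the rendered disparity. Under a simple parallel-axis pinhole model (an assumption for illustration; real HMD optics and calibrated projections differ), the horizontal disparity for a point at depth Z is f·IPD/Z:

```python
def disparity_px(focal_px: float, ipd_m: float, depth_m: float) -> float:
    """Horizontal screen disparity (pixels) that places a fused virtual
    object at depth_m, for a parallel-axis pinhole model (illustrative)."""
    return focal_px * ipd_m / depth_m

# E.g., with an assumed 800 px focal length and a 63 mm interpupillary
# distance, an object meant to appear at 1 m needs about 50 px of disparity.
d = disparity_px(focal_px=800.0, ipd_m=0.063, depth_m=1.0)
```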
  50. [GSW00]
  51. Challenges for Simultaneous Alignment • Shape of the virtual object • Occlusion of physical target • Vergence-accommodation conflict
  52. Stereo Calibration Takeaways Simultaneous calibration can be significantly faster Perceptual issues might hinder calibration quality
  53. Data Collection Methods: Multi-Point Collection
  54. Idea SPAAM: align a single point multiple times Multi-Point Active Alignment (MPAAM): align several points concurrently but only once Why? Save time
  55. Example: SPAAM
  56. Example: MPAAM
  57. MPAAM Variants • Align all points at once • Minimum of six points • Vary spatial distribution [TMX07]
  58. MPAAM Variants • Align all points at once • Minimum of six points • Vary spatial distribution • Missing: tradeoff # points - # calibration steps [GTM10]
  59. Performance • MPAAM can be conducted significantly faster than SPAAM (on average 84 seconds vs. 154 seconds for SPAAM) [GTM10] • MPAAM has comparable accuracy in the calibrated range
  60. MPAAM Takeaways MPAAM can be an alternative to SPAAM if • Working volume can be covered by the calibration body • Need for repeated calibration (e.g., after HMD slips)
  61. Confirmation Methods
  62. User has to confirm 2D-3D matching
  63. How to make confirmation stable?
  64. Different confirmation options Keyboard Voice Handheld Waiting [MDW11]
  65. Less motion is better [MDW11]
  66. Evaluation: User in the Loop
  67. Evaluation Questions • How accurate is the overlay given the current calibration? [MGT01] [GTM10] • How much do the calibration results vary between calibrations? [ASO11] • What is the impact of individual error sources on the calibration results? – Head pointing accuracy, body sway, confirmation methods ... [AXH11]
  69. How accurate is the overlay given the current calibration? Popular Approaches Use a camera Ask the user
  70. User in the Loop Evaluation Qualitative feedback „overlay looks good“ Quantitative feedback
  72. Quantitative Feedback McGarrity et al. [MGT01]: • Use a tracked evaluation board • Ask AR system to superimpose object on P_EB = (x_EB, y_EB) • Ask user to indicate where she perceives the object on the board P_U = (x_U, y_U) • Offset: ΔP = P_EB − P_U
  74. Quantitative Feedback • Drawback of stylus approach: evaluation only within arm's reach Alternatives • Use laser pointer + human operator instead (beware pointing accuracy) [GTM10] • Use projector / large display + indirect pointing (e.g., mouse)
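Given a set of offsets ΔP = P_EB − P_U collected with any of these pointing methods, a simple aggregation separates the systematic bias from the overall error magnitude. The numbers below are made-up illustration data, not measurements from [MGT01]:

```python
import numpy as np

# Hypothetical trials: where the system drew the object (p_eb) vs. where
# the user indicated perceiving it (p_u), in board coordinates (mm).
p_eb = np.array([[10.0, 20.0], [30.0, 40.0], [50.0, 60.0]])
p_u  = np.array([[11.0, 19.0], [29.0, 41.0], [52.0, 58.0]])

offsets = p_eb - p_u                                  # per-trial offset ΔP
mean_offset = offsets.mean(axis=0)                    # systematic bias (x, y)
mean_error = np.linalg.norm(offsets, axis=1).mean()   # mean Euclidean error
```

Reporting both matters: a large mean offset with small spread hints at a calibration bias, while a near-zero mean with large per-trial errors hints at pointing noise.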
  75. Quantitative Feedback Benefits: • Only way to approximate how the user herself perceives the augmentation Drawbacks: • Only valid for current view (distance, orientation) • Additional pointing error introduced
  76. Takeaways • Quantitative user feedback is the only way to approximate how large the registration error is for individual users • Feedback methods introduce additional (pointing) errors • Make sure to test for all relevant working distances
  77. Evaluation: Error Measurements
  78. OST-HMD Calibration 2D Projection Matrix 3D
  79. Ideal Case 3D-2D pairs: S Eye positions: (Camera center)
  80. 2D Projection Error Reprojection Error Wrong Projection
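The 2D reprojection error on slide 80 can be computed as the RMS distance between where an estimated P projects the 3D points and where they should appear on screen. A generic sketch (function and variable names are mine):

```python
import numpy as np

def reprojection_error(P, pts3d, pts2d):
    """Root-mean-square 2D reprojection error of a 3x4 projection matrix P
    over 3D points pts3d (Nx3) and their reference 2D pixels pts2d (Nx2)."""
    h = np.hstack([pts3d, np.ones((len(pts3d), 1))])   # homogeneous coords
    proj = (P @ h.T).T
    proj = proj[:, :2] / proj[:, 2:3]                  # dehomogenize
    return np.sqrt(np.mean(np.sum((proj - pts2d) ** 2, axis=1)))
```

Note the slide's caveat: a wrong P can still achieve a low reprojection error on the points used for calibration, which is why held-out points and 3D eye-position checks are also examined.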
  81. 3D Eye Positions [m] >10 cm
  82. Semi-Automatic Calibration Approaches
  83. Motivation User guided See-Through Calibration too tedious Can the calibration process be shortened? https://www.flickr.com/photos/stuartncook/4613088809/in/photostream/
  84. Observation We have to estimate 11 parameters 2D → At least 6 point correspondences needed 3D
  85. Reminder: Collecting Correspondences
  86. Idea Separate certain parameters which are independent of the user? The user would need to collect fewer point correspondences, making the task faster and easier.
  87. Reminder: Calibration Parameters Pinhole Camera
  88. TCS: Tracking Coordinate System EDCS: Eye-Display Coordinate System Rotation and Translation between Tracking Coordinate System and Eye-Display Coordinate System: 6 parameters for center of projection t_x, t_y, t_z, r_x, r_y, r_z
  89. 5 intrinsic parameters of the Eye-Display optical system: focal length (x,y), shear, principal point (x,y) (+ more if you want to model distortion)
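These five intrinsics assemble into the standard upper-triangular K of the pinhole model. A minimal sketch with illustrative (assumed) values:

```python
import numpy as np

def intrinsic_matrix(fx, fy, shear, cx, cy):
    """Build the pinhole intrinsic matrix K from the 5 parameters listed:
    focal lengths (fx, fy), shear, and principal point (cx, cy).
    Lens distortion would be handled separately, outside K."""
    return np.array([[fx, shear, cx],
                     [0.0,  fy,  cy],
                     [0.0, 0.0, 1.0]])

# Illustrative values, not those of any particular HMD.
K = intrinsic_matrix(fx=800.0, fy=800.0, shear=0.0, cx=320.0, cy=240.0)
```

Because these parameters depend on the display optics rather than on the user, they are exactly the part that semi-automatic approaches try to determine once, without user interaction.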
  90. Separate intrinsic + extrinsic parameters [OZT04]: 1. Determine ALL parameters (including distortion) via camera without user intervention 2. Update center of projection in a user phase
  91. State of the art: Automatic Method
  92. INDICA: Interaction-free DIsplay CAlibration Utilizes 3D Eye Localization [IK14] – Interaction-free, thus does not bother users – More accurate than a realistic SPAAM setup
  93. 3D Eye Position Estimation 1. Estimate a 2D iris ellipse – Iris detector + Fitting by RANSAC [SBD12] 2. Back-project it to a 3D circle [NNT11]
  94. World to HMD (eye) Projection Manual (SPAAM) Interaction Free (INDICA Recycle) Interaction Free (INDICA Full) 3D 2D
  95. Summary of INDICA Calibration of OST-HMDs using 3D eye position Simple: no user interaction Accurate: better than degraded manual calibrations
  96. Practical Tips
  97. How many control points for SPAAM? • Minimum of 6 can lead to unstable and inaccurate results • The more the better? Not necessarily → 16-20 control points are sufficient if points are equally distributed in all three dimensions
  98. Calibration Error [mm] [CAR94]
  99. Calibration Volume If possible, calibrate the working volume you want to operate in Working Volume Calibration Volume
  100. Quality of Tracking System Ensure the best calibration possible for your external tracking system Ensure a low latency
  101. Summary of Part 2 Reducing user errors: - Data collection - Confirmation - Evaluation Manual to automatic: State of the art Practical tips
  102. References 1/2 [AXH11] Axholt, M. (2011). Pinhole Camera Calibration in the Presence of Human Noise. [ASO11] Axholt, M., Skoglund, M. A., O'Connell, S. D., Cooper, M. D., Ellis, S. R., & Ynnerman, A. (2011, March). Parameter estimation variance of the single point active alignment method in optical see-through head mounted display calibration. In Virtual Reality Conference (VR), 2011 IEEE (pp. 27-34). IEEE. [AZU97] Azuma, R. T. (1997). A survey of augmented reality. Presence, 6(4), 355-385. [CAR94] Chen, L., Armstrong, C. W., & Raftopoulos, D. D. (1994). An investigation on the accuracy of three-dimensional space reconstruction using the direct linear transformation technique. Journal of Biomechanics, 27(4), 493-500. [NNT11] Nitschke, C., Nakazawa, A., & Takemura, H. (2011). Image-based Eye Pose and Reflection Analysis for Advanced Interaction Techniques and Scene Understanding. CVIM, 2011(31), 1-16. [GTM10] Grubert, J., Tuemler, J., Mecke, R., & Schenk, M. (2010). Comparative User Study of two See-through Calibration Methods. In VR (pp. 269-270). [GTN02] Genc, Y., Tuceryan, M., & Navab, N. (2002, September). Practical solutions for calibration of optical see-through devices. In Proceedings of the 1st International Symposium on Mixed and Augmented Reality (p. 169). IEEE Computer Society.
  103. References 2/2 [MAE14] Moser, K. R., Axholt, M., & Swan, J. E. (2014, March). Baseline SPAAM calibration accuracy and precision in the absence of human postural sway error. In Virtual Reality (VR), 2014 IEEE (pp. 99-100). IEEE. [MGT01] McGarrity, E., Genc, Y., Tuceryan, M., Owen, C., & Navab, N. (2001). A new system for online quantitative evaluation of optical see-through augmentation. In ISAR 2001 (pp. 157-166). IEEE. [MDW11] Maier, P., Dey, A., Waechter, C. A., Sandor, C., Tönnis, M., & Klinker, G. (2011). An empiric evaluation of confirmation methods for optical see-through head-mounted display calibration. In International Symposium on Mixed and Augmented Reality (ISMAR), 2011 IEEE. [OZT04] Owen, C. B., Zhou, J., Tang, A., & Xiao, F. (2004, November). Display-relative calibration for optical see-through head-mounted displays. In Mixed and Augmented Reality, 2004. ISMAR 2004. Third IEEE and ACM International Symposium on (pp. 70-78). IEEE. [SBD12] Świrski, L., Bulling, A., & Dodgson, N. (2012, March). Robust real-time pupil tracking in highly off-axis images. In Proceedings of the Symposium on Eye Tracking Research and Applications (pp. 173-176). ACM. [TN00] Tuceryan, M., & Navab, N. (2000). Single point active alignment method (SPAAM) for optical see-through HMD calibration for AR. In Augmented Reality, 2000 (ISAR 2000). Proceedings. IEEE and ACM International Symposium on (pp. 149-158). IEEE.
  104. Online References Up-to-date references for the field of optical see-through calibration can be found here: http://www.mendeley.com/groups/4218141/calibration-of-optical-see-through-head-mounted-displays/overview/
