Why isn't Gaze a mainstream HMI?

  1. Why isn't Gaze a mainstream HMI?
  2. Question: Gaze tracking has a long history, but it has not been adopted as a mainstream interaction method. Why?
  3. Because of: • Immature use cases • Special hardware required • The calibration hurdle
  4. Immature use cases
  5. Now let's look back at the native functions of gaze. Seeing is essentially perception, not a means of acting to produce effects in the world. • It finds something to operate on, and it gets continuous feedback on the operation by other means, such as the sense of touch in the hands.
  6. Existing use cases of gaze tracking in industry. Commanding: • Control in shooting games • Control of screen navigation • Control of the cursor or mouse pointer. Marketing research or usability testing: • Analysis of user attention and areas of interest
  7. Existing use cases versus gaze-native functions. Commanding: mismatches with the gaze-native functions, leads to fatigue, and requires learning. Marketing research or usability testing: matches the gaze-native functions.
  8. Let's get back to the gaze-native functions. Eyes are basically receptors, and they are used as effectors ONLY WHEN… • Starting communication with someone else (including pet animals) by eye contact • Showing interest in something to someone else (even animals use this eye gesture to request something from humans)
  9. Applications built on top of gaze tracking should take advantage of its native functions. • As a receptor: finding • As an effector: showing interest to/with a communication partner. The result is a natural, no-fatigue, no-learning UX. [JPA2017-204737]
  10. Further implications. As a receptor (finding): • Tracking the flow of attention is more important than the traditional point of gaze. As an effector (showing interest to/with a communication partner): • Environment SLAM used together with gaze tracking becomes more important than before. • Many use cases would require only a rough estimate of gaze.
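To make "tracking the flow of attention" concrete, here is a minimal Python sketch, assuming gaze samples have already been resolved to scene objects (e.g. by a SLAM-backed scene model). The `GazeHit` structure, the object ids, and the `min_dwell` threshold are illustrative assumptions, not something from the slides.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class GazeHit:
    """One gaze sample already resolved to a scene object (hypothetical input)."""
    timestamp: float   # seconds
    object_id: str     # e.g. "cup", "screen", "person_A"

def attention_flow(hits, min_dwell=0.3):
    """Collapse raw gaze hits into per-object dwells and transition counts.

    Instead of a precise point of gaze, we only keep which object held
    attention, for how long, and where attention moved next.
    """
    dwells = []                        # (object_id, start_time, duration)
    current = start = last_t = None

    for h in sorted(hits, key=lambda h: h.timestamp):
        if h.object_id != current:
            # Close out the previous dwell if it lasted long enough.
            if current is not None and last_t - start >= min_dwell:
                dwells.append((current, start, last_t - start))
            current, start = h.object_id, h.timestamp
        last_t = h.timestamp
    if current is not None and last_t - start >= min_dwell:
        dwells.append((current, start, last_t - start))

    transitions = defaultdict(int)     # (from_object, to_object) -> count
    for (a, _, _), (b, _, _) in zip(dwells, dwells[1:]):
        transitions[(a, b)] += 1
    return dwells, dict(transitions)
```

Only dwell durations and transitions survive the aggregation, which matches the slide's point that a rough gaze estimate is often enough.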
  11. Special hardware required
  12. Special hardware is hard to adopt for volume use: high price, limited use cases, and low usability. A traditional gaze tracker requires infrared light sources and an expensive imaging device (a zoom camera, two cameras, and so on).
  13. Use-case limitations and low usability. Infrared light source in a head-mounted device: • Body attachment is bothersome • Low usability. Infrared light source attached to a graphical monitor: • Only for environments such as a desktop PC • Captures gaze only at near distance (~1 m) • Suffers from outdoor light (e.g., a smartphone used on the road is under sunlight)
  14. Gaze tracking with a commodity camera. Implementation: • May use face/eyeball models and track the iris to approximate the point of gaze. Advantages: • Low cost • Unlike wearable devices, users keep an open view • Distance flexibility (a few meters) • Works in any environment with reasonable lighting conditions
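As a rough illustration of the commodity-camera approach, the following sketch approximates the point of gaze by casting a ray from an eyeball center through the iris center and intersecting it with a known plane such as the monitor. The 3D eyeball and iris positions are assumed to come from a fitted face/eyeball model; all names and numbers are hypothetical.

```python
import numpy as np

def gaze_point_on_plane(eyeball_center, iris_center, plane_point, plane_normal):
    """Approximate the point of gaze as the intersection of the ray
    (eyeball center -> iris center) with a known plane, e.g. a monitor.

    All coordinates are 3D in the camera frame; in practice the eyeball
    and iris centers would come from a fitted face/eyeball model.
    Returns None if the gaze ray is (nearly) parallel to the plane.
    """
    origin = np.asarray(eyeball_center, dtype=float)
    direction = np.asarray(iris_center, dtype=float) - origin
    direction /= np.linalg.norm(direction)

    n = np.asarray(plane_normal, dtype=float)
    denom = direction @ n
    if abs(denom) < 1e-6:
        return None
    t = ((np.asarray(plane_point, dtype=float) - origin) @ n) / denom
    if t < 0:                         # plane is behind the eye
        return None
    return origin + t * direction

# Example: an eye about 60 cm in front of a monitor lying in the plane z = 0.
gaze = gaze_point_on_plane(eyeball_center=[0.03, 0.0, 0.60],
                           iris_center=[0.029, -0.001, 0.588],
                           plane_point=[0.0, 0.0, 0.0],
                           plane_normal=[0.0, 0.0, 1.0])
```

Note that this ray is only the optical axis; the visual axis differs by a per-person offset, which is exactly the kind of parameter the calibration discussed later has to recover.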
  15. The challenge is accuracy. • Pupil positions are clearly captured under infrared light, and the infrared light source produces Purkinje reflections, which give the geometry of the eyeballs and pupils. • The commodity-camera method, on the other hand, relies on face/eyeball models and iris-position tracking. • It is not easy for a commodity camera to track iris positions precisely. • It is not easy for a monocular camera to acquire precise face/eyeball 3D models.
  16. Calibration hurdle
  17. Calibration hurdle. It is bothersome and makes mass adoption difficult. A traditional gaze tracker requires an initial setup of personal parameters by asking the user to look at several known points.
  18. Background calibration. Implementations: • May calibrate personal eye parameters such as the eyeball center position and the visual/optical axis delta • May use iterative algorithms to optimize the parameters • May take advantage of the user's object-selection actions (mouse clicks, touches, etc.) to calibrate the parameters in the background. Advantage: • Users are not aware of the calibration process.
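One conceivable, much-simplified version of the click-based idea: collect pairs of (predicted gaze point, clicked point) while the user works and fit a small correction by least squares. A real implementation would optimize the richer per-eye parameters named above (eyeball center, visual/optical axis delta), typically iteratively; this sketch only illustrates how background data can be used, and all names are hypothetical.

```python
import numpy as np

def fit_background_correction(predicted, clicked):
    """Fit a 2D affine correction  clicked ≈ A @ predicted + b  by least squares.

    `predicted` are on-screen gaze estimates from the uncalibrated model and
    `clicked` are the points the user actually selected (mouse click, touch),
    collected silently while the user works. A handful of clicks is enough
    for a first fit; more clicks refine it.
    """
    P = np.asarray(predicted, dtype=float)        # shape (N, 2)
    C = np.asarray(clicked, dtype=float)          # shape (N, 2)
    X = np.hstack([P, np.ones((len(P), 1))])      # homogeneous coords, (N, 3)
    M, *_ = np.linalg.lstsq(X, C, rcond=None)     # solve X @ M ≈ C, M is (3, 2)
    A, b = M[:2].T, M[2]
    return A, b

def apply_correction(point, A, b):
    """Correct a later gaze estimate transparently."""
    return A @ np.asarray(point, dtype=float) + b
```

In use, pairs would be accumulated whenever a click lands near the estimated gaze, the fit would be refreshed occasionally, and all subsequent estimates corrected without the user noticing.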
  19. Challenges of background calibration: • Smooth UX • Face/eyeball model • Sensor-object calibration [JPO PA2018-173743]
  20. Challenge: smooth UX. Previous work: • Background calibration does not bother users. • But the UX before, during, and after calibration differs. Challenge: • A smooth transition from before to after background calibration is required. [JPO PA2018-173743]
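The slides only state that a smooth transition is required, not how to achieve it. One conceivable approach, purely an illustrative assumption, is to blend the uncalibrated and calibrated estimates with a confidence weight that grows as background calibration accumulates samples:

```python
def blended_gaze(raw_estimate, calibrated_estimate, confidence):
    """Blend uncalibrated and background-calibrated gaze estimates.

    `confidence` grows from 0 to 1 as background calibration accumulates
    samples, so the UX changes gradually instead of jumping the moment
    calibration "switches on".
    """
    w = min(max(confidence, 0.0), 1.0)
    return [(1 - w) * r + w * c for r, c in zip(raw_estimate, calibrated_estimate)]
```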
  21. Challenge: face/eyeball 3D model. Previous work: • The eye parameters (eyeball center depth, optical/visual axis delta) give the geometry. Challenge: • The eyeball position within the face model, together with the iris, gives the geometry, so face-model calibration is needed in addition to the eye calibration.
  22. Challenge: sensor-object calibration. Previous work: • The positions of the attention-target objects relative to the camera are pre-calibrated or fixed. Challenge: • Imagine sensors on a robot that can move; the positions between the sensor and the objects then need to be calibrated.
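A small sketch of what de-coupled sensor-object calibration could look like on a moving robot, assuming the camera pose in the world frame is supplied by SLAM and the object positions are known in the world frame; the function name, the angular threshold, and the data layout are assumptions for illustration.

```python
import numpy as np

def gaze_target_world(ray_origin_cam, ray_dir_cam,
                      R_world_cam, t_world_cam,
                      objects_world, max_angle_deg=5.0):
    """Find which world-frame object a camera-frame gaze ray points at.

    R_world_cam / t_world_cam is the camera pose in the world frame (e.g.
    from the robot's SLAM), so the sensor-object relation no longer has to
    be fixed in advance. `objects_world` maps object id -> 3D position.
    Returns the id of the object closest to the ray direction within
    `max_angle_deg`, or None.
    """
    R = np.asarray(R_world_cam, dtype=float)
    t = np.asarray(t_world_cam, dtype=float)
    origin = R @ np.asarray(ray_origin_cam, dtype=float) + t
    direction = R @ np.asarray(ray_dir_cam, dtype=float)
    direction /= np.linalg.norm(direction)

    best_id, best_angle = None, np.radians(max_angle_deg)
    for obj_id, pos in objects_world.items():
        to_obj = np.asarray(pos, dtype=float) - origin
        to_obj /= np.linalg.norm(to_obj)
        angle = np.arccos(np.clip(direction @ to_obj, -1.0, 1.0))
        if angle < best_angle:
            best_id, best_angle = obj_id, angle
    return best_id
```

Because the pose is re-queried for every frame, only rough gaze accuracy is needed as long as the angular threshold separates the candidate objects.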
  23. Summary: factors for gaze tracking to join the mainstream UIs: • Natural use of gaze • Commodity devices • Background calibration (face 3D model, smooth UX, de-coupling of sensor-object calibration)
