Using gaze input to navigate a virtual geospatial environment

This is a presentation for the defense of my capstone project for a Master of Science in Human-Computer Interaction from Rochester Institute of Technology. For this project I created an application for navigating a geospatial display through gaze input on a 2D user interface overlay.

    1. Using Gaze Input to Navigate a Virtual Geospatial Environment. Mark Hazlewood. Committee: Anne Haake (chair), Reynold Bailey
    2. Defense Outline • Capstone project details • Prior work • Software and user interface design • User testing details and results • Conclusions and future work
    3. Project Details: Capstone project objectives and timeline
    4. Objectives • Primary objective: develop a software application allowing users to navigate in a virtual geospatial environment using their gaze as input • Secondary objectives: attempt using the Kinect sensor as a remote eye tracker; conduct preliminary user evaluations of the developed application
    5. Planned Timeline (15-week schedule) • Activities: Development; Integration+Test; Participant Recruitment; User Testing; Analysis, documentation, writeup
    6. Planned Timeline (phases mapped onto the 15-week schedule) • Phase 1 – Exploratory Development with Kinect • Phase 2 – Selection of an Eye Tracking System • Phase 3 – Development of Geospatial Application • Phase 4 – User Testing
    7. Prior Work: Primary references and inspiration
    8. Stellmach et al. • Designing Gaze-based User Interfaces for Steering in Virtual Environments (ETRA ‘12) • Evaluated several techniques in gaze-based navigation • Proposed a taxonomy of gaze-based UI activation methods for navigation • Environment was a 3D virtual “maze” • My project built on some of the general goals of Stellmach’s research, but applied them specifically to a geospatial context
    9. Stellmach et al. • Stellmach’s proposed taxonomy (Input Technique x Activation Speed):
        – Discrete x Constant (DC): Input activated through fixed UI regions at a constant view change rate. Once activated, a particular movement action remains active until toggled.
        – Continuous x Gradient-based (CG): Input activated through fixed UI regions at a variable view change rate. Movement actions are only active when gaze is within the UI element.
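As a rough illustration of the distinction (a sketch of my own, not code from Stellmach's study): a constant-rate control applies the same view change whenever the action is toggled on, while a gradient-based control scales the rate with how far the gaze point sits inside the region and goes idle the moment gaze leaves it. The rate ceiling below is an assumed value.

```java
/** Illustrative view-change rates for the two activation styles in the taxonomy above. */
final class ActivationRates {
    // Assumed rate ceiling, in degrees of view change per second (illustrative only).
    static final double MAX_RATE_DEG_PER_SEC = 10.0;

    /** Discrete x Constant (DC): a fixed rate that stays active until the action is toggled off. */
    static double constantRate(boolean actionToggledOn) {
        return actionToggledOn ? MAX_RATE_DEG_PER_SEC : 0.0;
    }

    /** Continuous x Gradient-based (CG): rate scales with how deep the gaze point sits in the
     *  UI region (0.0 at the inner edge, 1.0 at the outer edge) and is zero once gaze leaves it. */
    static double gradientRate(boolean gazeInsideRegion, double normalizedDepth) {
        if (!gazeInsideRegion) {
            return 0.0;
        }
        return MAX_RATE_DEG_PER_SEC * Math.max(0.0, Math.min(1.0, normalizedDepth));
    }
}
```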
    10. Stellmach et al.
    11. Adams et al. • The Inspection of Very Large Images by Eye-gaze Control (AVI ‘08) • Proposed multiple methods for gaze-based zooming • Maintained a fixed method for gaze-based panning • Used a geospatial application as a test-bed, but the research focus was on image viewing • My project referenced Adams’ work for the “edge-of-screen” panning UI
    12. Adams et al. • Adams’ zooming techniques:
        – Stare-to-Zoom (STZ): Sustained gaze in the central region of the display causes the image to zoom inwards. Requires an extended stationary gaze (> 420 ms); zooming continues while gaze remains stationary.
        – Head-to-Zoom (HTZ): Zooming is initiated by movements of the user’s head (calculated from eye-to-screen distance). Leaning forward a small amount (~40 mm) initiates zooming in; leaning backward the same amount zooms out.
        – Dual-to-Zoom (DTZ): Zooming is initiated using the mouse; the left mouse button zooms in, the right button zooms out. Panning is still done through gaze.
        – Mouse-to-Zoom (MTZ): Used as a baseline for comparison with the other techniques. Both zoom and pan are accomplished using the mouse.
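The core of a stare-to-zoom style interaction is a dwell test on the gaze stream. Below is a minimal sketch of such a test, assuming screen-pixel gaze coordinates; the 420 ms threshold comes from Adams' description above, while the stationary radius and all names are illustrative assumptions, not Adams' implementation.

```java
/** Illustrative dwell test behind a stare-to-zoom style interaction. */
class DwellZoomDetector {
    private static final long DWELL_THRESHOLD_MS = 420;    // threshold reported by Adams et al.
    private static final double STATIONARY_RADIUS_PX = 40; // assumed "stationary" tolerance

    private double anchorX, anchorY; // where the current dwell started
    private long dwellStartMs = -1;  // -1 means no dwell in progress

    /** Feed each gaze sample; returns true while gaze has stayed put longer than the threshold. */
    boolean update(double gazeX, double gazeY, long nowMs) {
        boolean moved = dwellStartMs < 0
                || Math.hypot(gazeX - anchorX, gazeY - anchorY) > STATIONARY_RADIUS_PX;
        if (moved) {
            // Gaze moved too far: restart the dwell at the new location.
            anchorX = gazeX;
            anchorY = gazeY;
            dwellStartMs = nowMs;
            return false;
        }
        return (nowMs - dwellStartMs) >= DWELL_THRESHOLD_MS;
    }
}
```

While this returns true, the caller would keep zooming in, matching the "zooming continues while gaze remains stationary" behaviour described above.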
    13. Software Design: Design of the geospatial user interface and software
    14. User Interface Design • Hypothesis • When navigating large geospatial areas, the current zoom level can be used as an indicator of the level of detailed information a user wishes to view • Zoomed out → assume the user is interested in navigating over large geographic areas, from one broad region to another • Zoomed in → assume the user is interested in “fine searching” among smaller geographic landmarks • Design goal • Provide an adaptive UI that supports multiple levels of user interest
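A minimal sketch of how that adaptive switch could be expressed, assuming the current eye altitude is available from the map view; the altitude threshold, the mapping of pan styles to zoom levels, and all names here are illustrative assumptions rather than the project's actual values.

```java
/** Illustrative selector that switches the pan UI based on the current zoom level. */
final class AdaptivePanUi {
    enum PanMode { EDGE_PAN, CENTER_PAN }

    // Assumed eye altitude (in metres) separating "zoomed out" from "zoomed in".
    private static final double ZOOMED_IN_BELOW_METRES = 100_000;

    /** Broad region-to-region travel when zoomed out; finer searching when zoomed in. */
    static PanMode selectMode(double eyeAltitudeMetres) {
        return eyeAltitudeMetres > ZOOMED_IN_BELOW_METRES ? PanMode.EDGE_PAN : PanMode.CENTER_PAN;
    }
}
```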
    15. User Interface Design
    16. User Interface Design (Zoom In / Pan / Zoom Out)
    17. User Interface Design
    18. User Interface Design
    19. User Interface Design (Edge Pan / Zoom In / Zoom Out)
    20. User Interface Design
    21. User Interface Design (Center Pan / Zoom In / Zoom Out)
    22. User Interface Design
    23. User Interface Design
    24. User Interface Design • Gaze cursor • Displays the current (filtered) gaze point to the user • Helpful in maintaining orientation when navigating the UI
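In Swing, a cursor like this can be drawn on a transparent component layered above the map. The sketch below is a minimal, assumed implementation; the class name and styling are mine, not the project's.

```java
import java.awt.Color;
import java.awt.Graphics;
import java.awt.geom.Point2D;
import javax.swing.JComponent;

/** Illustrative transparent overlay that paints the filtered gaze point as a ring. */
class GazeCursorOverlay extends JComponent {
    private volatile Point2D.Double gazePoint; // latest filtered gaze point, in component coordinates

    /** Called whenever a new filtered gaze sample arrives. */
    void setGazePoint(Point2D.Double p) {
        gazePoint = p;
        repaint(); // schedules a redraw on the Swing event thread
    }

    @Override
    protected void paintComponent(Graphics g) {
        Point2D.Double p = gazePoint;
        if (p == null) {
            return;
        }
        int r = 12; // cursor radius in pixels
        g.setColor(Color.ORANGE);
        g.drawOval((int) p.x - r, (int) p.y - r, 2 * r, 2 * r);
    }
}
```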
    25. Software Design Details
    26. Gaze Point Filter • Problem • Even when calibrated, raw output from the eye tracker is noisy • Brief (but valid) fixations contribute to the noisiness • This makes UI activation difficult and the gaze cursor very distracting
    27. Gaze Point Filter • Unfiltered output
    28. Gaze Point Filter • Solution • Filter the tracker’s output prior to processing by the client application
    29. Gaze Point Filter • Moving Average Filter • As 2D points are received from the tracker, samples are added to a queue (the “window”) • The average of all samples currently in the window is returned • The window size is configurable; the optimum was found to be 15–20 samples • After the initial charging period, the resulting output is greatly improved
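A minimal sketch of the filter as described above, assuming gaze samples arrive as 2D screen coordinates; the class and method names are illustrative, not the project's actual API.

```java
import java.awt.geom.Point2D;
import java.util.ArrayDeque;
import java.util.Deque;

/** Simple moving-average filter over the most recent N gaze samples. */
public class MovingAverageGazeFilter {
    private final int windowSize;
    private final Deque<Point2D.Double> window = new ArrayDeque<>();

    public MovingAverageGazeFilter(int windowSize) {
        this.windowSize = windowSize; // 15-20 samples worked well in this project
    }

    /** Adds a raw gaze sample and returns the current smoothed gaze point. */
    public Point2D.Double filter(double x, double y) {
        window.addLast(new Point2D.Double(x, y));
        if (window.size() > windowSize) {
            window.removeFirst(); // drop the oldest sample once the window is full
        }
        // Average every sample currently held; until the window fills, this is the "charging" phase.
        double sumX = 0, sumY = 0;
        for (Point2D.Double p : window) {
            sumX += p.x;
            sumY += p.y;
        }
        int n = window.size();
        return new Point2D.Double(sumX / n, sumY / n);
    }
}
```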
    30. Gaze Point Filter • Moving Average Filter – charging
    31. Gaze Point Filter • Moving Average Filter – charging
    32. Gaze Point Filter • Moving Average Filter – output at window size = 10
    33. Gaze Point Filter • Moving Average Filter – output at window size = 25
    34. Detailed Design • Major components: EyeTrackerAPI, WorldWindGazeInput, Swing, WorldWind
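A rough sketch of how these components could be wired together, assuming the EyeTrackerAPI layer pushes filtered gaze samples to a listener and the gaze-input layer translates them into map navigation; the interface and method names are illustrative, not the project's actual design.

```java
import java.awt.geom.Point2D;

/** Illustrative callback interface the eye-tracker layer could expose. */
interface GazeListener {
    void onGazeSample(Point2D.Double screenPoint);
}

/** Illustrative gaze-driven input layer sitting between the tracker and the WorldWind globe. */
class GazeInputController implements GazeListener {
    @Override
    public void onGazeSample(Point2D.Double filteredPoint) {
        // 1. Hit-test the (already filtered) gaze point against the overlay regions:
        //    edge/center pan, zoom in, zoom out.
        // 2. Translate the active region into a view change on the WorldWind globe
        //    (pan across the surface or adjust the eye altitude).
        // 3. Move the on-screen gaze cursor so the user keeps their bearings.
    }
}
```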
    35. Detailed Design - EyeTrackerAPI
    36. Detailed Design - EyeTrackerAPI
    37. Detailed Design - EyeTrackerAPI
    38. Detailed Design - EyeTrackerAPI
    39. Detailed Design - EyeTrackerAPI
    40. Detailed Design - WorldWindGazeInput
    41. Detailed Design - WorldWindGazeInput
    42. Detailed Design - WorldWindGazeInput
    43. Detailed Design - WorldWindGazeInput
    44. User Testing: Preliminary user evaluations of the gaze input application
    45. User Testing • Goals • Evaluate the effectiveness of the proposed designs • Quantitative: Are participants able to use the UI? How effective are they in navigating to geographic regions? • Qualitative: How natural or intuitive is the experience? Do participants feel like the system responds to their intent?
    46. Participants & Recruiting • Planned goal was 5–10 test participants • Recruited via online posts (graduate forum) and email solicitation • Prospective participants completed an online screener • Ended up with eight (8) participants
    47. Participants & Recruiting
    48. Participants & Recruiting
    49. Participants & Recruiting
    50. Participants & Recruiting
    51. Test Procedures
        1. Background questionnaire
        2. Introduction to the eye tracking system, calibration procedure, and tasks
        3. Initial calibration (9-point automatic, using iViewX Experiment Center)
        4. Introduction to the geospatial application and user interface
        5. Navigation to a practice point (with moderator support)
        6. Sequential navigation to test regions (A, B, C, D); for each region: (1) pan to the general area of the region, (2) zoom to the region, (3) activate sub-points in the region, (4) zoom out to the furthest level
    52. Test Procedures – Test regions
    53. Test Procedures – Test regions (sub-points)
    54. Test Procedures – Initial calibration • Targeted < 1° angular error (X and Y)
    55. Test Procedures – Task ordering (region sequence per participant)
        Participant 1: A B C D
        Participant 2: B C D A
        Participant 3: C D A B
        Participant 4: D A B C
        Participant 5: A B C D
        Participant 6: B C D A
        Participant 7: C D A B
        Participant 8: D A B C
    56. Test Results – Quantitative (average task time by region)
        Region A: 138.41 s
        Region B: 141.43 s
        Region C: 153.21 s
        Region D: 151.13 s
        Overall average: 146.05 s
    57. Test Results – Quantitative (average task time by task number)
        Task 1: 179.48 s
        Task 2: 146.93 s
        Task 3: 135.52 s
        Task 4: 122.23 s
        Overall average: 146.05 s
    58. Test Results – Qualitative • Participants were given two surveys after task completion: a qualitative gaze input survey and the System Usability Scale (SUS) • They were then debriefed with directed questions from the moderator
    59. Test Results – Gaze input survey results
    60. Test Results – SUS results
    61. Test Results – SUS results
        Participant 1: 62.5
        Participant 2: 67.5
        Participant 3: 92.5
        Participant 4: 65.0
        Participant 5: 62.5
        Participant 6: 77.5
        Participant 7: 75.0
        Participant 8: 55.0
        Average: 69.7
    62. Conclusions and Future Work
    63. User Testing Observations • Response to the adaptive pan UI • Initially somewhat disruptive • Edge pan provided a larger target surface; preferred when calibration had large angular error • Central pan provided finer control and a better view of the map; generally preferred, except when calibration made activation difficult • Expectation of dwell-based operation • Before initial exposure to the UI, participants expected a dwell-based solution, i.e. that the map would pan/zoom to where they were looking
    64. User Testing Observations • “Opposite Pan Problem” • Many users had a tendency to pan in the exact opposite direction • The error was relatively frequent and consistent between participants • Debriefing revealed an opposite expectation of pan behavior
    65. User Testing Observations • “Opposite Pan Problem” (figure label: Pan Target)
    66. User Testing Observations • “Opposite Pan Problem” (figure labels: Pan Target, Correct Pan, Observed Error)
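This error pattern is consistent with participants expecting the map content to be dragged toward their gaze rather than the viewpoint moving toward it. The sketch below only illustrates that inversion; which mapping the application actually implemented is an assumption here, not something stated on the slides.

```java
/** Illustrative sign difference behind the "opposite pan" confusion. */
final class PanMapping {
    /** Assumed implemented behaviour: the viewpoint moves toward the side the user gazes at. */
    static double implementedPanDelta(double gazeOffsetFromCenter, double rate) {
        return rate * Math.signum(gazeOffsetFromCenter);
    }

    /** What the debriefs suggested many participants expected: the map content is dragged
     *  toward the gaze point, which moves the viewpoint the opposite way. */
    static double expectedPanDelta(double gazeOffsetFromCenter, double rate) {
        return -implementedPanDelta(gazeOffsetFromCenter, rate);
    }
}
```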
    67. Ideas for Future Work • More expansive and rigorous user testing • There was large variation in the qualitative survey responses; a larger sample size could yield more concrete, significant results • Comparative study of the effectiveness of various design alternatives: fixed vs. adaptive pan UI; dwell-based activation vs. UI-based activation; implementation of Adams’ zoom techniques with the adaptive pan UI
