Evaluating Haptic and Auditory Guidance to Assist Blind People in Reading Printed Text Using Finger-Mounted Cameras

Presented at the 18th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2016).

The recent miniaturization of cameras has enabled finger-based reading approaches that provide blind and visually impaired readers with access to printed materials. Compared to handheld text scanners such as mobile phone applications, mounting a tiny camera on the user’s own finger has the potential to mitigate camera framing issues, enable a blind reader to better understand the spatial layout of a document, and provide better control over reading pace. A finger-based approach, however, also introduces the need to guide the reader in physically navigating a document, such as tracing along lines of text. While previous work has proposed audio and haptic directional finger guidance for this purpose, user studies of finger-based reading have not provided an in-depth performance analysis of the finger-based reading process. To further investigate the effectiveness of finger-based sensing and feedback for reading printed text, we conducted a controlled lab experiment with 19 blind participants, comparing audio and haptic directional finger guidance within an iPad-based testbed. As a small follow-up, we asked four of those participants to return and provide feedback on a preliminary wearable prototype called HandSight. Findings from the controlled experiment show similar performance between haptic and audio directional guidance, although audio may offer an accuracy advantage for tracing lines of text. Subjective feedback also highlights tradeoffs between the two types of guidance, such as the interference of audio guidance with speech output and the potential for desensitization to haptic guidance. While several participants appreciated the direct access to layout information provided by finger-based exploration, important concerns also arose about ease of use and the amount of concentration required. We close with a discussion on the effectiveness of finger-based reading for blind users and potential design improvements to the HandSight prototype.


  1. 1. Lee Stearns1, Ruofei Du1, Uran Oh1, Catherine Jou1, Leah Findlater2, David A. Ross3, Jon E. Froehlich1 University of Maryland: Computer Science1, Information Studies2, Atlanta VA R&D Center for Visual & Neurocognitive Rehabilitation3 Evaluating Haptic and Auditory Guidance to Assist Blind People in Reading Printed Text Using Finger-Mounted Cameras TACCESS | ASSETS 2016
  2. 2. What if printed text could be accessed through touch in the same way as braille? *Video Credit: YouTube—Ginny Owens—How I See It (Reading Braille)
  3. 3. What if printed text could be accessed through touch in the same way as braille?
  4. 4. What if printed text could be accessed through touch in the same way as braille? Reading printed materials is still an important but challenging task for people with visual impairments
  5. 5. POPULAR READING DEVICES
  6. 6. POPULAR READING DEVICES Scanner | OCR | Screen Reader
  7. 7. POPULAR READING DEVICES Dedicated devices (e.g., video magnifiers)
  8. 8. POPULAR READING DEVICES Smartphone apps (e.g., KNFB Reader iOS)
  9. 9. POPULAR READING DEVICES Wearable Cameras (e.g., OrCam)
  10. 10. Scanner | OCR | Screen Reader Dedicated Devices (e.g., video magnifiers) Smartphone Apps (e.g., KNFB Reader iOS) Wearable Cameras (e.g., OrCam) POPULAR READING DEVICES
  11. 11. Open Questions (Existing Devices) 1. How to assist with aiming the camera to capture desired content?
  12. 12. Open Questions (Existing Devices) 1. How to assist with aiming the camera to capture desired content? 2. How to handle complex documents and convey layout information?
  13. 13. HANDSIGHT A vision-augmented touch system
  14. 14. HANDSIGHT A vision-augmented touch system Tiny CMOS cameras,
  15. 15. HANDSIGHT Tiny CMOS cameras, haptic actuators mounted on one or more fingers A vision-augmented touch system
  16. 16. HANDSIGHT Tiny CMOS cameras, haptic actuators mounted on one or more fingers Smartwatch for power, processing, speech and audio output A vision-augmented touch system
  17. 17. HANDSIGHT Tiny CMOS cameras, haptic actuators mounted on one or more fingers Smartwatch for power, processing, speech and audio output A vision-augmented touch system * Originally proposed in Stearns et al. 2014
  18. 18. AUGMENTING THE USER’S FINGER Survey: Digital Digits (Shilkrot et al. 2015)
  19. 19. Camera & Optical Mouse Sensor AUGMENTING THE USER’S FINGER Magic Finger (Yang et al. 2012)
  20. 20. Camera and Vibration Motor AUGMENTING THE USER’S FINGER FingerReader (Shilkrot et al. 2014, 2015)
  21. 21. Camera Vibration Motors AUGMENTING THE USER’S FINGER HandSight (Stearns et al. 2014) Processing+Power
  22. 22. Advantages of Finger-Based Reading 1. Does not require framing an overhead camera
  23. 23. Advantages of Finger-Based Reading 1. Does not require framing an overhead camera 2. Allows direct access to spatial information
  24. 24. Advantages of Finger-Based Reading 1. Does not require framing an overhead camera 2. Allows direct access to spatial information 3. Provides better control over pace and rereading
  25. 25. Advantages of Finger-Based Reading 1. Does not require framing an overhead camera 2. Allows direct access to spatial information 3. Provides better control over pace and rereading
  26. 26. Advantages of Finger-Based Reading 1. Does not require framing an overhead camera 2. Allows direct access to spatial information 3. Provides better control over pace and rereading New Challenges 1. How to precisely trace a line of text? 2. How to support physical navigation?
  27. 27. COMPARING TWO TYPES OF DIRECTIONAL FINGER GUIDANCE 2. Audio via built-in or external speakers 1. Finger-mounted vibration motors
  28. 28. 1. Finger-mounted vibration motors COMPARING TWO TYPES OF DIRECTIONAL FINGER GUIDANCE
  29. 29. Move up 1. Finger-mounted vibration motors COMPARING TWO TYPES OF DIRECTIONAL FINGER GUIDANCE
  30. 30. Move down 1. Finger-mounted vibration motors COMPARING TWO TYPES OF DIRECTIONAL FINGER GUIDANCE
  31. 31. 2. Audio via built-in or external speakers COMPARING TWO TYPES OF DIRECTIONAL FINGER GUIDANCE
  32. 32. Higher pitch: move up 2. Audio via built-in or external speakers COMPARING TWO TYPES OF DIRECTIONAL FINGER GUIDANCE
  33. 33. Lower pitch: move down 2. Audio via built-in or external speakers COMPARING TWO TYPES OF DIRECTIONAL FINGER GUIDANCE
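
The two directional mappings compared above can be summarized in a short sketch. This is an illustrative reconstruction only, not the testbed's actual code; the dead zone, pitch step, and intensity scaling are assumed values.

```python
# Illustrative reconstruction of the two directional guidance mappings.
# offset_px is the fingertip's vertical distance from the center of the
# current line: positive when the finger has drifted below the line,
# negative when it has drifted above it. All constants are assumptions.

DEAD_ZONE_PX = 5          # assumed tolerance band around the line center
BASE_PITCH_HZ = 440.0     # assumed neutral pitch while on the line
PITCH_PER_PX = 8.0        # assumed pitch change per pixel of drift

def haptic_cue(offset_px):
    """Finger-mounted vibration motors: vibrate the motor on the side the
    finger should move toward (upper motor = move up, lower = move down)."""
    if abs(offset_px) <= DEAD_ZONE_PX:
        return None                                 # on the line: no vibration
    motor = "upper" if offset_px > 0 else "lower"   # below the line -> move up
    strength = min(1.0, abs(offset_px) / 50.0)      # farther off -> stronger
    return motor, strength

def audio_cue(offset_px):
    """Continuous tone: higher pitch means move up, lower pitch means move down."""
    if abs(offset_px) <= DEAD_ZONE_PX:
        return BASE_PITCH_HZ
    return BASE_PITCH_HZ + PITCH_PER_PX * offset_px  # below the line -> higher pitch
```
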
  34. 34. Research Questions 1. To what extent are finger-based cameras a viable accessibility solution for reading printed text? 2. What design choices can improve this viability?
  35. 35. Study Overview Study I: initial iPad study (19 participants) Study II: physical prototype study (4 participants)
  36. 36. Study I: initial iPad study (19 participants) Study II: physical prototype study (4 participants) Study Overview
  37. 37. Study I: initial iPad study (19 participants) Goals: Compare audio/haptic Explore & interpret spatial layouts Assess reading and comprehension Study Overview
  38. 38. Study I Method Used an iPad to focus on user experience, gather finger trace data
  39. 39. Study I Method Used an iPad to focus on user experience, gather finger trace data 19 participants Median Age 48 (SD=12, Range=26-67) Gender 11 Male, 8 Female Vision Level 10 Totally Blind, 9 Light Sensitive
  40. 40. Study I Method Used an iPad to focus on user experience, gather finger trace data 19 participants Within-subjects, two guidance conditions: audio and haptic Haptic vibrations Audio pitch
  41. 41. Study I Method Used an iPad to focus on user experience, gather finger trace data 19 participants Within-subjects, two guidance conditions: audio and haptic Participants read two documents for each condition plain magazine
  42. 42. Study I Method Used an iPad to focus on user experience, gather finger trace data 19 participants Within-subjects, two guidance conditions: audio and haptic Participants read two documents for each condition Analysis: reading speed and accuracy, comprehension, subjective feedback audio haptic
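
As a rough illustration of how reading speed and line-tracing accuracy could be computed from the logged finger-trace data, consider the sketch below. The trace format, the 10-pixel tolerance, and the metric definitions are illustrative assumptions, not the measures defined in the paper.

```python
# Hypothetical post-hoc analysis of one logged finger trace. A trace is a
# list of (timestamp_s, x, y) samples; line_y is the known vertical center
# of the text line being traced. Constants and metrics are assumptions.

def reading_speed_wpm(words_read, start_s, end_s):
    """Words per minute over the interval in which the passage was read."""
    minutes = (end_s - start_s) / 60.0
    return words_read / minutes if minutes > 0 else 0.0

def line_tracing_accuracy(trace, line_y, tolerance_px=10):
    """Fraction of samples whose vertical position stays within tolerance
    of the line center; samples beyond it count as drift off the line."""
    if not trace:
        return 0.0
    on_line = sum(1 for (_t, _x, y) in trace if abs(y - line_y) <= tolerance_px)
    return on_line / len(trace)
```
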
  43. 43. Exploration Mode Reading Mode System Design: Exploration and Reading Modes
  44. 44. Continuous audio feedback to identify content beneath finger Flute sound: text Cello sound: picture Silence: empty space Same across both conditions System Design: Exploration Mode
  45. 45. Flute sound: text Silence: empty space Cello sound: picture
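
A minimal sketch of this exploration-mode mapping, assuming the content class beneath the fingertip has already been determined by the system; the sound file names are placeholders, not the testbed's actual assets.

```python
# Exploration mode: the sound played depends only on what lies beneath
# the fingertip. File names are placeholders.

EXPLORATION_SOUNDS = {
    "text": "flute_loop.wav",     # flute sound while over text
    "picture": "cello_loop.wav",  # cello sound while over a picture
    "empty": None,                # silence while over empty space
}

def exploration_feedback(content_under_finger):
    """Return the sound to loop for the given content class, or None for silence."""
    return EXPLORATION_SOUNDS.get(content_under_finger)
```
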
  46. 46. Bimanual: right index finger to read, left to anchor start of line Directional guidance: audio or haptic depending on condition Used to stay on the line or find the start of the next line Audio: pitch of continuous audio Haptic: strength and position of vibration Additional audio cues (same for both conditions) Start/end of line or paragraph Synthesized speech System Design: Reading Mode
  47. 47. Above the line: downward guidance (low pitch or lower vibration motor) Below the line: upward guidance (high pitch or upper vibration motor) Start/end of line or paragraph (short but distinctive audio cues)
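
The reading-mode behavior described in the two slides above could be summarized as in the sketch below: short boundary cues when the finger moves past the start or end of the current line, plus a guidance direction that the active condition renders as either a pitch change or a vibration. The line representation, cue names, and tolerance value are assumptions.

```python
# Sketch of the per-sample reading-mode decision. A line is described by
# its vertical center and horizontal extent in page coordinates; cue names
# are placeholders. The 5-pixel tolerance is an assumed value.

def reading_mode_feedback(x, y, line):
    """Return the cues to emit for one fingertip sample while reading.
    line: dict with 'y_center', 'x_start', and 'x_end'."""
    events = []
    if x < line["x_start"]:
        events.append("cue:start_of_line")
    elif x > line["x_end"]:
        events.append("cue:end_of_line")

    offset_px = y - line["y_center"]       # > 0: finger is below the line
    if abs(offset_px) > 5:
        # Rendered as a higher/lower pitch (audio condition) or as the
        # upper/lower vibration motor (haptic condition).
        events.append("guide:up" if offset_px > 0 else "guide:down")
    return events
```
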
  48. 48. Study I Findings Haptic vs. Audio: Quantitative Performance
  49. 49. Study I Findings Haptic vs. Audio: Quantitative Performance Line tracing / magazine documents: Audio significantly more accurate (p = 0.018)
  50. 50. Study I Findings Haptic vs. Audio: Quantitative Performance Line tracing / magazine documents: Audio significantly more accurate (p = 0.018) audio haptic Example finger traces—Dashed red lines mark drift off of the line
  51. 51. Study I Findings Haptic vs. Audio: Quantitative Performance Line tracing / magazine documents: Audio significantly more accurate (p = 0.018) Comprehension high, no significant differences between conditions audio haptic Example finger traces—Dashed red lines mark drift off of the line
  52. 52. Study I Findings Haptic vs. Audio: Subjective Preference Preferences split (11 haptic, 7 audio, 1 equal preference)
  53. 53. Study I Findings Haptic vs. Audio: Subjective Preference Preferences split (11 haptic, 7 audio, 1 equal preference) Preferred Haptic More intuitive Easier to use Faster Less distracting
  54. 54. Study I Findings Haptic vs. Audio: Subjective Preference Preferences split (11 haptic, 7 audio, 1 equal preference) Preferred Haptic Preferred Audio More intuitive Easier to use Faster Less distracting Less confusing More comfortable No desensitization
  55. 55. Study I Findings Haptic vs. Audio: Subjective Preference Preferences split (11 haptic, 7 audio, 1 equal preference) Preferred Haptic Preferred Audio More intuitive Easier to use Faster Less distracting Less confusing More comfortable No desensitization Reflects contradictory findings in Stearns et al. 2014, Shilkrot et al. 2014, 2015
  56. 56. Study I Findings Overall Reading Experience Pros Low learning curve Flexible Direct control over speed
  57. 57. Study I Findings Overall Reading Experience Pros Cons Low learning curve Flexible Direct control over speed Hard to use for reading High cognitive load may affect comprehension
  58. 58. Study I Findings Exploration Mode Participants appreciated direct access to spatial information, and nearly all were able to locate images and count the number of columns.
  59. 59. Study I: initial iPad study (19 participants) Study II: physical prototype study (4 participants) Study Overview
  60. 60. Study I: initial iPad study (19 participants) Study II: physical prototype study (4 participants) Study Overview
  61. 61. Goals: Evaluate HandSight prototype Gather subjective feedback Compare with KNFB Reader iOS Study II: physical prototype study (4 participants) Study Overview
  62. 62. Finger-mounted camera to read physical documents Study II: HandSight Prototype System
  63. 63. Study II Method HandSight: Each participant used their preferred guidance from Study I to explore and read two documents
  64. 64. Study II Method KNFB Reader iOS: Photograph and read 3 physical documents
  65. 65. Study II Findings HandSight: Overall Experience Average reading speed: 45 wpm (SD=19, Range=18-60) Rated somewhat easy to use, but slow and required concentration
  66. 66. Study II Findings HandSight: Overall Experience Average reading speed: 45 wpm (SD=19, Range=18-60) Rated somewhat easy to use, but slow and required concentration Participant Quotes: “I’m very pleased and excited about the system. I think it could make a great difference in my life.” (P19) “It seems like a lot of effort for reading text.” (P12)
  67. 67. Study II Findings HandSight vs. KNFB Reader iOS Participants unanimously preferred KNFB Reader iOS HandSight KNFB Reader iOS
  68. 68. Study II Findings HandSight vs. KNFB Reader iOS Participants unanimously preferred KNFB Reader iOS Faster, easier to concentrate on the content of the text HandSight KNFB Reader iOS
  69. 69. Implications Pros Feasibility of a Finger-Based Reading Approach
  70. 70. Implications Pros Spatial layout information Feasibility of a Finger-Based Reading Approach
  71. 71. Implications Pros Spatial layout information Direct control over reading Feasibility of a Finger-Based Reading Approach
  72. 72. Implications Pros Spatial layout information Direct control over reading Reduced camera framing issues Feasibility of a Finger-Based Reading Approach
  73. 73. Implications Pros Spatial layout information Direct control over reading Reduced camera framing issues Efficient text detection and recognition Feasibility of a Finger-Based Reading Approach
  74. 74. Implications Pros Spatial layout information Direct control over reading Reduced camera framing issues Efficient text detection and recognition * We observed these in our studies Feasibility of a Finger-Based Reading Approach
  75. 75. Implications Feasibility of a Finger-Based Reading Approach Pros Cons Spatial layout information Direct control over reading Reduced camera framing issues Efficient text detection and recognition * We observed these in our studies Slower, requires increased concentration and physical dexterity
  76. 76. Implications Feasibility of a Finger-Based Reading Approach Pros Cons Spatial layout information Direct control over reading Reduced camera framing issues Efficient text detection and recognition * We observed these in our studies Slower, requires increased concentration and physical dexterity * Consistent with Shilkrot et al. 2014, 2015
  77. 77. Implications Feasibility of a Finger-Based Reading Approach Pros Cons Spatial layout information Direct control over reading Reduced camera framing issues Efficient text detection and recognition * We observed these in our studies Slower, requires increased concentration and physical dexterity * Consistent with Shilkrot et al. 2014, 2015 Importance of spatial layout information is unclear
  78. 78. Implications Feasibility of a Finger-Based Reading Approach Pros Cons Spatial layout information Direct control over reading Reduced camera framing issues Efficient text detection and recognition * We observed these in our studies Slower, requires increased concentration and physical dexterity * Consistent with Shilkrot et al. 2014, 2015 Importance of spatial layout information is unclear * Has yet to be investigated in this context
  79. 79. Future Work Study utility of spatial layout information in everyday use (e.g., newspapers, menus, maps, graphs)
  80. 80. Future Work Study utility of spatial layout information Explore possibilities for camera placement
  81. 81. HANDSIGHT a vision-augmented touch system
  82. 82. Questions? Contact: lstearns@umd.edu Thank you to our participants and the Maryland State Library for the Blind and Physically Handicapped. This research was funded by the Department of Defense. Lee Stearns1, Ruofei Du1, Uran Oh1, Catherine Jou1, Leah Findlater2, David A. Ross3, Jon E. Froehlich1 University of Maryland: Computer Science1, Information Studies2, Atlanta VA R&D Center for Visual & Neurocognitive Rehabilitation3 Evaluating Haptic and Auditory Guidance to Assist Blind People in Reading Printed Text Using Finger-Mounted Cameras TACCESS | ASSETS 2016
