
HandSight: The Design and Evaluation of a Finger-Mounted Camera and Feedback System to Enable Blind Persons to Read Printed Text


This talk was presented at the 2015 HCIL Annual Symposium, hosted by the University of Maryland's Human-Computer Interaction Lab.

We introduce the preliminary design of a novel vision-augmented touch system called HandSight intended to support activities of daily living (ADLs) by sensing and feeding back non-tactile information about the physical world as it is touched. Though we are interested in supporting a range of ADL applications, here we focus specifically on reading printed text.

The recent miniaturization of cameras has enabled finger-based reading approaches, which provide blind and visually impaired readers with access to printed materials. Compared to handheld text scanners such as mobile phone applications, mounting a tiny camera on the user’s own finger has the potential to mitigate camera framing issues, enable a blind reader to better understand the spatial layout of a document, and provide better control over reading pace. A finger-based approach, however, also introduces the need to guide the reader in physically navigating a document, such as tracing along lines of text. While previous work has proposed audio and haptic directional finger guidance for this purpose, user studies of finger-based reading have been limited to 3 or 4 participants.

To further investigate the feasibility of a finger-based sensing and feedback system for reading printed text, we conducted two lab studies: (i) a comparison of audio and haptic directional finger guidance with 20 blind participants using an iPad-based testbed, and (ii) a smaller proof-of-concept study with 4 blind participants using a preliminary wearable prototype called HandSight.
Findings from the first study show similar performance between haptic and audio directional guidance, although audio may offer an accuracy advantage for line tracing. Subjective feedback also highlights tradeoffs between the two types of guidance, such as the interference of audio guidance with speech output and the potential for desensitization to haptic guidance. While several participants appreciated the direct access to layout information provided by finger-based exploration, important concerns also arose about ease of use and the amount of concentration required.

We close with a discussion on the advantages of finger-based reading for blind readers, its feasibility, and potential design improvements to the HandSight prototype.


  1. HandSight: The Design and Evaluation of a Finger-Mounted Camera and Feedback System to Enable Blind Persons to Read Printed Text. Lee Stearns1, Ruofei Du1, Uran Oh1, Catherine Jou1, Yumeng Wang2, Leah Findlater3, Rama Chellappa4, David A. Ross5, Jon E. Froehlich1. University of Maryland: Computer Science1, Architecture2, Information Studies3, Electrical Engineering4; Atlanta VA R&D Center of Excellence in Vision and Neurocognitive Rehabilitation (CVNR)5
  2. There are 285 million people with visual impairments worldwide—including 39 million who are blind
  3. There are 285 million people with visual impairments worldwide—including 39 million who are blind. Visual impairments can negatively impact a person’s ability to perform activities of daily living (ADLs)
  4. Previous research has explored using mobile cameras with computer vision for at-a-distance tasks
  5. but these approaches fail to support touch-based interactions
  6. Our Approach: HandSight, a vision-augmented touch system
  7. Our Approach: HandSight, a vision-augmented touch system. Tiny CMOS cameras and micro-haptic actuators mounted on one or more fingers
  8. Our Approach: HandSight, a vision-augmented touch system. Tiny CMOS cameras and micro-haptic actuators mounted on one or more fingers. SmartWatch for power, processing, and speech output
  9. Our Approach: HandSight, a vision-augmented touch system. Tiny CMOS cameras and micro-haptic actuators mounted on one or more fingers. SmartWatch for power, processing, and speech output. Computer vision and machine learning algorithms to support fingertip-based sensing
  10. Our Vision for HandSight: Augment the sense of touch with a device that is unobtrusive, always available, and that allows seamless switching between the physical world and HandSight-enabled applications.
  11. Interacting with the Physical World: Example Applications
  12. 1. Overview of Current Prototype; 2. Study on Finger Guidance for Reading
  13. Finger-mounted camera: Awaiba NanEye 2C CMOS camera, 1×1 mm², 250×250 pixels, 44 fps, self-illuminated with a 4-LED ring
  14. Wrist-mounted controller. Current design: Arduino Pro Micro with Bluetooth 4.0; powers and controls haptic feedback
  15. Output: finger-mounted vibration motors
  16. Output: finger-mounted vibration motors; speech/audio via onboard speakers or a wireless headset
  17. Objectives for HandSight • Extensible platform • Develop algorithms for touch-based interactions • Investigate physical designs and feedback modalities
  18. Objectives for HandSight • Extensible platform • Develop algorithms for touch-based interactions • Investigate physical designs and feedback modalities • Proof-of-concept applications • Reading • Dressing • Technology Access
  19. 1. Overview of Current Prototype; 2. Study on Finger Guidance for Reading
  20. Popular Reading Methods: Braille
  21. Popular Reading Methods: Scanner / Screen Reader
  22. Popular Reading Methods: Mobile Apps (KNFB Reader), Wearables (OrCam)
  23. Touch-based Reading • Potential advantages • Immediate access, no alignment issues
  24. Touch-based Reading • Potential advantages • Immediate access, no alignment issues • Spatial awareness and interactions—may improve comprehension and document understanding
  25. Touch-based Reading • Potential advantages • Immediate access, no alignment issues • Spatial awareness and interactions—may improve comprehension and document understanding (image: FingerReader)
  26. Touch-based Reading • Problem: How can we guide a blind person’s finger? • Following a line of text • Locating the start of the next line • Exploring a document’s layout
  27. System Design: Exploration Mode. Continuous audio cues indicate content beneath the user’s finger
  28. System Design: Exploration Mode. Continuous audio cues indicate content beneath the user’s finger
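In exploration mode, a continuous audio cue tracks whatever lies beneath the fingertip. The talk does not specify the cue design, so the region labels, tone frequencies, and debounce length below are illustrative assumptions: a minimal sketch of how a per-frame cue loop might avoid flickering at region boundaries.

```python
# Sketch of exploration mode: continuous audio keyed to the region under
# the fingertip, with simple debouncing so the tone does not flicker at
# region boundaries. Labels, tones, and the hold length are illustrative
# assumptions, not the talk's actual design.

TONES_HZ = {"text": 440.0, "image": 220.0, "border": 880.0}  # whitespace = silence


class ExplorationAudio:
    def __init__(self, hold_frames: int = 3):
        self.hold_frames = hold_frames   # frames a new label must persist
        self.current = "whitespace"      # starts in silence (0 Hz)
        self._candidate = None
        self._count = 0

    def update(self, label: str) -> float:
        """Feed one per-frame region label; return the tone to play (Hz)."""
        if label == self.current:
            # Still in the same region: discard any pending switch.
            self._candidate, self._count = None, 0
        elif label == self._candidate:
            # Candidate region persists; switch once it has held long enough.
            self._count += 1
            if self._count >= self.hold_frames:
                self.current = label
                self._candidate, self._count = None, 0
        else:
            # New candidate region seen for the first time.
            self._candidate, self._count = label, 1
        return TONES_HZ.get(self.current, 0.0)
```

A camera-tracking front end would call `update()` once per frame; the debounce keeps brief misclassifications from producing audible glitches.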
  29. System Design: Reading Mode. Audio or haptic directional guidance helps users stay on the line or locate the start of a new line. Audio cues indicate the start/end of a line or paragraph.
  30. System Design: Reading Mode. Audio or haptic directional guidance helps users stay on the line or locate the start of a new line. Audio cues indicate the start/end of a line or paragraph.
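The directional guidance in reading mode can be thought of as a mapping from the finger's vertical offset relative to the current text line to a feedback cue. The thresholds and the offset-to-intensity mapping below are illustrative assumptions, not the values used in the HandSight studies; the same output could drive either an audio pitch or a vibration-motor strength.

```python
# Sketch of reading-mode finger guidance: map the finger's vertical
# offset from the current line's center to a directional cue. The
# deadband and saturation values are illustrative assumptions.

DEADBAND_PX = 4       # offsets within this band count as "on the line"
MAX_OFFSET_PX = 40    # beyond this, feedback intensity saturates


def guidance_cue(offset_px: float) -> dict:
    """offset_px > 0 means the finger has drifted below the line center."""
    if abs(offset_px) <= DEADBAND_PX:
        return {"direction": "on-line", "intensity": 0.0}
    magnitude = min(abs(offset_px), MAX_OFFSET_PX) / MAX_OFFSET_PX
    return {
        "direction": "move-up" if offset_px > 0 else "move-down",
        # Intensity in [0, 1] could drive audio pitch or vibration strength.
        "intensity": round(magnitude, 2),
    }
```

The deadband keeps the system silent while tracing is on track, which matters given participants' comments about audio guidance interfering with speech output.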
  31. System Design: Algorithms. We use Tesseract OCR with custom preprocessing and motion-tracking algorithms
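The talk does not publish the custom preprocessing details, but a common first step before feeding a small, unevenly lit camera patch to Tesseract is binarization. As one illustrative assumption, here is a self-contained Otsu-threshold sketch (a standard technique, not the authors' actual pipeline):

```python
# Sketch: binarize a grayscale patch before OCR. Otsu's method picks the
# threshold that maximizes between-class variance of the intensity
# histogram. Shown as an assumption; the HandSight preprocessing is not
# described in this talk.
import numpy as np


def binarize_otsu(gray: np.ndarray) -> np.ndarray:
    """Return a 0/255 binary image using Otsu's threshold."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    total = gray.size
    sum_all = np.dot(np.arange(256), hist)
    sum_b = 0.0          # cumulative intensity sum of the background class
    w_b = 0              # cumulative pixel count of the background class
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_b += hist[t]
        if w_b == 0:
            continue
        w_f = total - w_b
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b                    # background mean
        m_f = (sum_all - sum_b) / w_f        # foreground mean
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return np.where(gray > best_t, 255, 0).astype(np.uint8)
```

The binarized patch would then be handed to Tesseract; OpenCV's `cv2.threshold` with the Otsu flag does the same job in production code.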
  32. Study Overview. Study I: feasibility and directional guidance evaluation (20 participants). Study II: proof of concept (4 participants)
  33. Study I Method: Simulated reading experience using an iPad
  34. Study I Method: Simulated reading experience using an iPad. 20 participants: average age 48.3 (SD=11.7, range=26–67); 12 male, 8 female; 11 totally blind, 9 light sensitive
  35. Study I Method: Simulated reading experience using an iPad. 20 participants. Within-subjects design, 2 conditions: audio and haptic
  36. Study I Method: Simulated reading experience using an iPad. 20 participants. Within-subjects design, 2 conditions: audio and haptic. Participants read 2 documents for each condition: plain and magazine
  37. Study I Method: Simulated reading experience using an iPad. 20 participants. Within-subjects design, 2 conditions: audio and haptic. Participants read 2 documents for each condition. Analysis: reading speed and accuracy, comprehension, subjective feedback
  38. Study I Findings. Audio vs. haptic: similar performance. Audio was significantly more accurate for line tracing on the magazine documents (p = 0.018)
  39. Study I Findings. Audio vs. haptic: similar performance. Audio was significantly more accurate for line tracing on the magazine documents (p = 0.018). Preferences split (12 haptic, 7 audio, 1 equal preference)
  40. Study I Findings. Audio vs. haptic: similar performance. Audio was significantly more accurate for line tracing on the magazine documents (p = 0.018). Preferences split (12 haptic, 7 audio, 1 equal preference). Participant comments: audio interferes with speech; desensitization to haptics over time
  41. Study I Findings. Overall reading experience. Pros: learning curve < braille; access to spatial information
  42. Study I Findings. Overall reading experience. Pros: learning curve < braille; access to spatial information. Cons: hard to use; requires concentration
  43. Study II: Proof of Concept. 4 participants
  44. Future Work: Study utility of spatial layout information
  45. Future Work: Study utility of spatial layout information. Reduce cognitive and physical effort
  46. Future Work: Study utility of spatial layout information. Reduce cognitive and physical effort. Explore possibilities for camera placement
  47. HandSight: a vision-augmented touch system
  48. Questions? Lee Stearns1, Ruofei Du1, Uran Oh1, Catherine Jou1, Yumeng Wang2, Leah Findlater3, Rama Chellappa4, David A. Ross5, Jon E. Froehlich1. University of Maryland: Computer Science1, Architecture2, Information Studies3, Electrical Engineering4; Atlanta VA R&D Center of Excellence in Vision and Neurocognitive Rehabilitation (CVNR)5. Thank you to our participants and the Maryland State Library for the Blind and Physically Handicapped. This research was funded by the Department of Defense. Images from Flickr used under Creative Commons license. Contact: lstearns@umd.edu | jonf@cs.umd.edu
