HandSight: The Design and Evaluation of a Finger-Mounted Camera and Feedback System to Enable Blind Persons to Read Printed Text

This talk was presented at the 2015 HCIL Annual Symposium put on by the University of Maryland's Human Computer Interaction Lab.

We introduce the preliminary design of a novel vision-augmented touch system called HandSight intended to support activities of daily living (ADLs) by sensing and feeding back non-tactile information about the physical world as it is touched. Though we are interested in supporting a range of ADL applications, here we focus specifically on reading printed text.

The recent miniaturization of cameras has enabled finger-based reading approaches, which provide blind and visually impaired readers with access to printed materials. Compared to handheld text scanners such as mobile phone applications, mounting a tiny camera on the user’s own finger has the potential to mitigate camera framing issues, enable a blind reader to better understand the spatial layout of a document, and provide better control over reading pace. A finger-based approach, however, also introduces the need to guide the reader in physically navigating a document, such as tracing along lines of text. While previous work has proposed audio and haptic directional finger guidance for this purpose, user studies of finger-based reading have been limited to 3 or 4 participants.

To further investigate the feasibility of a finger-based sensing and feedback system for reading printed text, we conducted two lab studies: (i) a comparison of audio and haptic directional finger guidance with 20 blind participants using an iPad-based testbed, and (ii) a smaller proof-of-concept study with 4 blind participants using a preliminary wearable prototype called HandSight.
Findings from the first study show similar performance between haptic and audio directional guidance, although audio may offer an accuracy advantage for line tracing. Subjective feedback also highlights tradeoffs between the two types of guidance, such as the interference of audio guidance with speech output and the potential for desensitization to haptic guidance. While several participants appreciated the direct access to layout information provided by finger-based exploration, important concerns also arose about ease of use and the amount of concentration required.

We close with a discussion on the advantages of finger-based reading for blind readers, its feasibility, and potential design improvements to the HandSight prototype.

HandSight: The Design and Evaluation of a Finger-Mounted Camera and Feedback System to Enable Blind Persons to Read Printed Text

  1. HandSight: The Design and Evaluation of a Finger-Mounted Camera and Feedback System to Enable Blind Persons to Read Printed Text. Lee Stearns1, Ruofei Du1, Uran Oh1, Catherine Jou1, Yumeng Wang2, Leah Findlater3, Rama Chellappa4, David A. Ross5, Jon E. Froehlich1. University of Maryland: Computer Science1, Architecture2, Information Studies3, Electrical Engineering4; Atlanta VA R&D Center of Excellence in Vision and Neurocognitive Rehabilitation (CVNR)5
  2. There are 285 million people with visual impairments worldwide—including 39 million who are blind
  3. There are 285 million people with visual impairments worldwide—including 39 million who are blind. Visual impairments can negatively impact a person’s ability to perform activities of daily living (ADLs)
  4. Previous research has explored using mobile cameras with computer vision for at-a-distance tasks
  5. but they fail to support touch-based interactions
  6. Our Approach: HandSight, a vision-augmented touch system
  7. Our Approach: HandSight, a vision-augmented touch system. Tiny CMOS cameras and micro-haptic actuators mounted on one or more fingers
  8. Our Approach: HandSight, a vision-augmented touch system. Tiny CMOS cameras and micro-haptic actuators mounted on one or more fingers; smartwatch for power, processing, and speech output
  9. Our Approach: HandSight, a vision-augmented touch system. Tiny CMOS cameras and micro-haptic actuators mounted on one or more fingers; smartwatch for power, processing, and speech output; computer vision and machine learning algorithms to support fingertip-based sensing
  10. Our Vision for HandSight: Augment the sense of touch with a device that is unobtrusive, always available, and that allows seamless switching between the physical world and HandSight-enabled applications.
  11. Interacting with the Physical World: Example Applications
  12. Outline: (1) Overview of Current Prototype, (2) Study on Finger Guidance for Reading
  13. Finger-mounted camera: Awaiba NanEye 2C CMOS camera, 1×1 mm², 250×250 pixels, 44 fps, self-illuminated with a 4-LED ring
  14. Wrist-mounted controller. Current design: Arduino Pro Micro with Bluetooth 4.0; powers and controls haptic feedback. (A hypothetical firmware sketch follows the transcript.)
  15. Output: finger-mounted vibration motors
  16. Output: finger-mounted vibration motors; speech/audio via onboard speakers or a wireless headset
  17. Objectives for HandSight • Extensible platform • Develop algorithms for touch-based interactions • Investigate physical designs and feedback modalities
  18. Objectives for HandSight • Extensible platform • Develop algorithms for touch-based interactions • Investigate physical designs and feedback modalities • Proof-of-concept applications • Reading • Dressing • Technology access
  19. Outline: (1) Overview of Current Prototype, (2) Study on Finger Guidance for Reading
  20. Popular Reading Methods: Braille
  21. Popular Reading Methods: Scanner / screen reader
  22. Popular Reading Methods: Mobile apps (KNFB Reader), wearables (OrCam)
  23. Touch-based Reading • Potential advantages • Immediate access, no alignment issues
  24. Touch-based Reading • Potential advantages • Immediate access, no alignment issues • Spatial awareness and interactions—may improve comprehension and document understanding
  25. Touch-based Reading • Potential advantages • Immediate access, no alignment issues • Spatial awareness and interactions—may improve comprehension and document understanding (FingerReader)
  26. Touch-based Reading • Problem: How can we guide a blind person’s finger? • Following a line of text • Locating the start of the next line • Exploring a document’s layout
  27. System Design: Exploration Mode. Continuous audio cues indicate content beneath the user’s finger
  28. System Design: Exploration Mode. Continuous audio cues indicate content beneath the user’s finger
  29. System Design: Reading Mode. Audio or haptic directional guidance helps users stay on the line or locate the start of a new line; audio cues indicate the start/end of a line or paragraph.
  30. System Design: Reading Mode. Audio or haptic directional guidance helps users stay on the line or locate the start of a new line; audio cues indicate the start/end of a line or paragraph. (A sketch of one possible guidance mapping follows the transcript.)
  31. System Design: Algorithms. We use Tesseract OCR with custom preprocessing and motion-tracking algorithms. (A minimal OCR sketch follows the transcript.)
  32. Study Overview. Study I: feasibility and directional guidance evaluation (20 participants). Study II: proof of concept (4 participants)
  33. Study I Method: simulated reading experience using an iPad
  34. Study I Method: simulated reading experience using an iPad; 20 participants. Average age 48.3 (SD=11.7, range 26-67); gender: 12 male, 8 female; vision level: 11 totally blind, 9 light sensitive
  35. Study I Method: simulated reading experience using an iPad; 20 participants. Within-subjects, 2 conditions: audio and haptic
  36. Study I Method: simulated reading experience using an iPad; 20 participants. Within-subjects, 2 conditions: audio and haptic. Participants read 2 documents for each condition (plain, magazine)
  37. Study I Method: simulated reading experience using an iPad; 20 participants. Within-subjects, 2 conditions: audio and haptic. Participants read 2 documents for each condition. Analysis: reading speed and accuracy, comprehension, subjective feedback
  38. Study I Findings. Audio vs. haptic: similar performance. Audio was significantly more accurate for line tracing on the magazine documents (p = 0.018)
  39. Study I Findings. Audio vs. haptic: similar performance. Audio was significantly more accurate for line tracing on the magazine documents (p = 0.018). Preferences split (12 haptic, 7 audio, 1 equal preference)
  40. Study I Findings. Audio vs. haptic: similar performance. Audio was significantly more accurate for line tracing on the magazine documents (p = 0.018). Preferences split (12 haptic, 7 audio, 1 equal preference). Participant comments: audio interferes with speech; desensitization to haptics over time
  41. Study I Findings. Overall reading experience. Pros: learning curve < braille; access to spatial information
  42. Study I Findings. Overall reading experience. Pros: learning curve < braille; access to spatial information. Cons: hard to use; requires concentration
  43. Study II: Proof of Concept. 4 participants
  44. Future Work: study the utility of spatial layout information
  45. Future Work: study the utility of spatial layout information; reduce cognitive and physical effort
  46. Future Work: study the utility of spatial layout information; reduce cognitive and physical effort; explore possibilities for camera placement
  47. HandSight, a vision-augmented touch system
  48. Questions? Lee Stearns1, Ruofei Du1, Uran Oh1, Catherine Jou1, Yumeng Wang2, Leah Findlater3, Rama Chellappa4, David A. Ross5, Jon E. Froehlich1. University of Maryland: Computer Science1, Architecture2, Information Studies3, Electrical Engineering4; Atlanta VA R&D Center of Excellence in Vision and Neurocognitive Rehabilitation (CVNR)5. Thank you to our participants and the Maryland State Library for the Blind and Physically Handicapped. This research was funded by the Department of Defense. Images from Flickr used under Creative Commons license. Contact: lstearns@umd.edu | jonf@cs.umd.edu
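
The wrist-mounted controller on slide 14 pairs an Arduino Pro Micro with a Bluetooth 4.0 module and drives the finger-mounted vibration motors. The sketch below is only a minimal illustration of that role: the pin assignments, the use of the Pro Micro's hardware UART (Serial1) for the Bluetooth module, and the two-byte motor/intensity command protocol are all assumptions, not the actual HandSight firmware.

```cpp
// Hypothetical firmware sketch for a wrist-mounted haptic controller.
// Pin numbers and the two-byte command protocol are illustrative
// assumptions; they are not taken from the HandSight prototype.

const int MOTOR_LEFT_PIN  = 9;   // PWM-capable pins on the Pro Micro (assumed wiring)
const int MOTOR_RIGHT_PIN = 10;

void setup() {
  pinMode(MOTOR_LEFT_PIN, OUTPUT);
  pinMode(MOTOR_RIGHT_PIN, OUTPUT);
  Serial1.begin(9600);  // Bluetooth module assumed on the hardware UART
}

void loop() {
  // Expect two bytes per command: [motor index][intensity 0-255].
  if (Serial1.available() >= 2) {
    int motor     = Serial1.read();   // 0 = left-side motor, 1 = right-side motor
    int intensity = Serial1.read();   // vibration strength
    int pin = (motor == 0) ? MOTOR_LEFT_PIN : MOTOR_RIGHT_PIN;
    analogWrite(pin, constrain(intensity, 0, 255));
  }
}
```

On the host side, the paired device would simply write these two bytes over the Bluetooth serial link whenever the guidance logic changes the desired vibration level.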
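
Slides 29 and 30 describe directional guidance that keeps the reader's finger on the current line of text. One way to realize this is to map the finger's vertical offset from the tracked line to an audio pitch and a haptic drive level; the sketch below does exactly that, but the tolerance band, saturation distance, and pitch/vibration mappings are illustrative assumptions rather than the parameters used in the study.

```cpp
// Minimal sketch of one possible directional-guidance mapping.
// All thresholds and mappings are illustrative assumptions; they are not
// the parameters used in the HandSight studies.
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdio>

struct Guidance {
  float   audioPitchHz;  // frequency of a continuous guidance tone
  uint8_t hapticPwm;     // 0-255 drive level for the vibration motors
  bool    onLine;        // true when the finger is within tolerance
};

// verticalOffsetPx: finger position above (+) or below (-) the tracked line's center.
Guidance computeGuidance(float verticalOffsetPx) {
  const float tolerancePx = 8.0f;   // assumed "on line" band
  const float maxOffsetPx = 60.0f;  // larger offsets saturate the cues

  Guidance g;
  g.onLine = std::fabs(verticalOffsetPx) <= tolerancePx;

  // Normalize the offset magnitude to [0, 1] for both feedback channels.
  float t = std::min(std::fabs(verticalOffsetPx) / maxOffsetPx, 1.0f);

  // Audio: a neutral 440 Hz tone that bends up to one octave in the
  // direction the finger has drifted.
  g.audioPitchHz = 440.0f * std::pow(2.0f, verticalOffsetPx >= 0.0f ? t : -t);

  // Haptic: silent on the line, stronger vibration the farther off it.
  g.hapticPwm = g.onLine ? 0 : static_cast<uint8_t>(t * 255.0f);
  return g;
}

int main() {
  Guidance g = computeGuidance(-20.0f);  // finger 20 px below the line
  std::printf("pitch=%.1f Hz, haptic=%d, onLine=%d\n",
              g.audioPitchHz, static_cast<int>(g.hapticPwm), g.onLine);
  return 0;
}
```

The same offset could drive either feedback channel depending on the study condition (audio or haptic), which is why both outputs are computed from one normalized value.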
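
Slide 31 notes that the system uses Tesseract OCR with custom preprocessing and motion-tracking algorithms. The snippet below is a minimal sketch of only the OCR step on a single camera frame, using a generic OpenCV preprocessing pass (grayscale plus adaptive thresholding) as a stand-in for HandSight's custom pipeline; the motion-tracking stage is omitted.

```cpp
// Minimal sketch: preprocess one frame with OpenCV, then run Tesseract.
// The grayscale + adaptive-threshold pass is a generic stand-in for
// HandSight's custom preprocessing; motion tracking is not shown.
#include <opencv2/opencv.hpp>
#include <tesseract/baseapi.h>
#include <iostream>
#include <memory>

int main(int argc, char** argv) {
  cv::Mat frame = cv::imread(argc > 1 ? argv[1] : "frame.png");
  if (frame.empty()) {
    std::cerr << "Could not read input frame\n";
    return 1;
  }

  // Grayscale, then adaptive threshold to cope with uneven illumination
  // such as the hotspot from a close-range LED ring.
  cv::Mat gray, binary;
  cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
  cv::adaptiveThreshold(gray, binary, 255, cv::ADAPTIVE_THRESH_GAUSSIAN_C,
                        cv::THRESH_BINARY, 25, 10);

  // Run Tesseract on the binarized, single-channel image.
  tesseract::TessBaseAPI ocr;
  if (ocr.Init(nullptr, "eng") != 0) {
    std::cerr << "Could not initialize Tesseract\n";
    return 1;
  }
  ocr.SetImage(binary.data, binary.cols, binary.rows,
               /*bytes_per_pixel=*/1, static_cast<int>(binary.step));
  std::unique_ptr<char[]> text(ocr.GetUTF8Text());
  std::cout << (text ? text.get() : "") << std::endl;
  ocr.End();
  return 0;
}
```

In the actual system each 250×250 frame presumably covers only a small patch of text, so per-frame OCR results would need to be combined across frames using the motion tracking mentioned on the slide.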
