A Multimodal Approach To Accessible Web Content On Smartphones


Portable and Mobile Systems in Assistive Technology: A Multimodal Approach To Accessible Web Content On Smartphones. Knudsen, Lars Emil.

Published in: Technology, Health & Medicine

  1. A Multimodal Approach To Accessible Web Content On Smartphones. Lars Emil Knudsen, Harald Holone. Østfold University College, 11.07.2012.
  2. Introduction. Master's student at Østfold University College, Norway (Applied Computer Science). Motivation: programming, the reward of contributing, research value, and the exciting combination of multimodality and smartphones.
  3. Introduction. I will present our current work in the area of multimodal interfaces on smartphones: (1) our implementation of W3C's Multimodal Interaction Framework on smartphones running the Android OS, and (2) the results from user tests and interviews with blind and visually impaired users.
  4. Background. The SMUDI project: Norwegian speech recognition and a multimodal interface, aiming to achieve universal design; run by MediaLT, a Norwegian research company. This work was created in relation to the SMUDI project. Rapid development in the mobile market gives smartphones new capabilities and new opportunities for interface design, multimodal interaction being one of them.
  5. Designing Robust Multimodal Systems for Universal Access, Oviatt 2001: "Given the right context, temporal disability applies to everyone."
  6. Multimodality. What is multimodality? Modalities describe the different paths of communication between a human and the computer.
  7. Multimodality and universal access. Oviatt claims that multimodality can greatly expand the accessibility of computing for diverse and non-specialist users, and that it can promote new forms of computing not previously available.
  8. Multimodality, universal access, and visually impaired and blind users. Spoken and multimodal bus timetable systems: design, development and evaluation (Turunen et al., 2005): multimodality generally improves performance, but users need training to be able to use the system.
  9. Voice recognition. Recognition runs on an external server. Several commercial services: Google Voice, (Siri) Nuance, Vlingo.
  10. Frameworks. MONA (Niklfield et al.), Miranda (Paay et al.), others (Reithinger et al.; Nardelli et al.), and the W3C Multimodal Interaction Framework.
  11. Framework implementation. Based on W3C's specification of their Multimodal Interaction Framework. Used EMMA, the Extensible MultiModal Annotation markup language: an XML markup language that represents the semantics and meaning of input data. Implemented on a need-to-have basis. Consists of a set of components: a recognition component, an interpretation component, and an interaction manager.
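     As an illustration of what the interpretation component produces, a minimal EMMA document for a recognized spoken weather query might look like the following sketch. The `emma:` elements and attributes come from the W3C EMMA specification; the application payload (`location`, `day`), token values, and confidence are hypothetical:

     ```xml
     <emma:emma version="1.0"
                xmlns:emma="http://www.w3.org/2003/04/emma">
       <!-- One candidate interpretation of a spoken utterance -->
       <emma:interpretation id="interp1"
                            emma:medium="acoustic"
                            emma:mode="voice"
                            emma:confidence="0.85"
                            emma:tokens="weather oslo tomorrow">
         <!-- Application-specific semantics (illustrative) -->
         <location>Oslo</location>
         <day>tomorrow</day>
       </emma:interpretation>
     </emma:emma>
     ```

     The interaction manager can then act on the structured semantics rather than on raw recognizer output.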
  12. Method. Test the framework implementation and evaluate multimodal mobile interfaces. User test with four blind and visually impaired users, in two phases: (1) practical, where users were given tasks to carry out, and (2) a semi-structured interview.
  13. Prototype. Developed on top of, and in parallel with, the multimodal framework, for Android. Used a Norwegian weather service (yr.no) and Norwegian voice recognition from Nuance. Input modalities: speech, touch, touch gestures, orientation, acceleration gestures, touch keyboard, and navigation keys.
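     To make the role of the interaction manager concrete, here is a minimal sketch in plain Java (no Android dependencies) of how interpretations arriving from several modalities could be merged into a single action. All names (`InteractionManager`, `Interpretation`, `resolveIntent`) are illustrative, not the prototype's actual API; the conflict-resolution rule (highest confidence wins) is one simple assumption, not necessarily what the prototype used:

     ```java
     import java.util.ArrayList;
     import java.util.List;

     // Hypothetical sketch in the spirit of the W3C Multimodal Interaction
     // Framework: recognition/interpretation components submit results, and
     // the interaction manager decides which interpretation to act on.
     public class InteractionManager {

         // A single interpreted input from one modality (names illustrative).
         static class Interpretation {
             final String mode;        // e.g. "voice", "touch"
             final String intent;      // e.g. "show_forecast"
             final double confidence;  // recognizer confidence in [0, 1]

             Interpretation(String mode, String intent, double confidence) {
                 this.mode = mode;
                 this.intent = intent;
                 this.confidence = confidence;
             }
         }

         private final List<Interpretation> pending = new ArrayList<>();

         // Interpretation components feed their candidate results here.
         void submit(Interpretation i) {
             pending.add(i);
         }

         // Resolve conflicts by picking the highest-confidence candidate,
         // then clear the pending set for the next interaction turn.
         String resolveIntent() {
             Interpretation best = null;
             for (Interpretation i : pending) {
                 if (best == null || i.confidence > best.confidence) {
                     best = i;
                 }
             }
             pending.clear();
             return best == null ? null : best.intent;
         }

         public static void main(String[] args) {
             InteractionManager im = new InteractionManager();
             im.submit(new Interpretation("voice", "show_forecast", 0.85));
             im.submit(new Interpretation("touch", "open_menu", 0.60));
             System.out.println(im.resolveIntent()); // prints "show_forecast"
         }
     }
     ```

     In a real multimodal system the manager would also fuse complementary inputs (e.g. speech plus a touch gesture forming one command) rather than only choosing between competing ones.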
  14. Results. Speech input was preferred. Touch gestures and orientation were perceived as more fun than useful. Acceleration gestures were seen as simple to use. The touch keyboard was nice but slow. Navigation keys were an OK way of navigating.
  15. Results. It is feasible to implement W3C's Multimodal Interaction Framework on Android. The users were positive towards a multimodal interface. A multimodal interface can help support universal access.
  16. Thank you! Questions?