Enabling non-visual Interaction by Stephen Brewster, University of Glasgow



This talk was given at the InfinIT event "Temadag om Interaktionsdesign" (Theme Day on Interaction Design), held on 20 June 2013. Read more about the event here: http://www.infinit.dk/dk/hvad_kan_vi_goere_for_dig/viden/reportager/styr_din_mobiltelefon_med_et_nik.htm



  1. Enabling non-visual interaction. Stephen Brewster, Glasgow Interactive Systems Group, School of Computing Science, University of Glasgow. stephen@dcs.gla.ac.uk. June 2013
  3. Multimodal interaction
     • Multimodality is a key area of work: a more human way to interact
     • Not everyone has all senses / control capabilities
     • Senses may not always be available all of the time
     • No one sense can do everything on its own
     • Other senses/control capabilities can make up for the lack of a visual display
  4. Research areas
     • Novel multimodal interaction techniques
        • Touchscreen and mobile user interfaces: improving usability and user experience
        • Tabletop interaction with phones
        • Interaction with 3D TV, phone + TV
        • User interfaces for cameraphones and digital cameras
     • Accessibility
        • Blind users and visualisation; older adults, navigation, mobility
     • Multimodal home care
        • Mobile health apps / sports performance apps
  5. Modalities
     • Non-speech audio: Earcons, 3D sound, sonification, Musicons
     • Computer haptics: force feedback, pressure input, temperature output, tactile (vibrotactile and pin arrays), ultrasound
     • Gestural interaction: on-screen, in-air, multi-touch, capacitive sensing
     • Smell
  6. Display-less interaction design?
     • No display at all? Then only input, with no feedback: a difficult design problem
     • Visual display-less? Make up for the lack of a visual display by using alternative input and output techniques
  7. Overview of talk
     • Motivation: issues in mobile interaction; 'eyes free' and 'hands free'
     • Alternative multimodal solutions
        • Gestures/pressure for input
        • Haptic and audio displays for output
     • Tools that you might use to replace visual displays
     • Might need new forms of input as well as output
  8. Mobile interaction problems
     • Mobile interaction takes place in the world: users are involved in other tasks, on the move
     • Contexts are very varied; hands and eyes may be busy
     • The visual display may not be accessible or appropriate
     • New forms of interaction are needed if keyboard and screen are not easily available
  9. Touchscreens
     • Wide application of touchscreens: phones, tablets, TV remotes, ...
     • Benefits: larger display area, direct interaction with the finger, more flexible use of the device, no need for a physical keyboard
     • Touchscreen problems:
        • No tactile feedback: 'feel' is poor
        • Input is difficult and error prone
        • Requires much visual attention
        • Needs two hands
        • The 'fat finger' problem
  10. Solutions?
     • Need a set of tools to use when visual displays are not available
     • Multimodality: gestures/pressure for input; haptic and audio displays for output
  11. GESTURE INPUT
  12. Why gestures for input?
     • Kinaesthetic perception makes gestures 'eyes free'
     • Types: on the screen of the device, device in hand, different body locations
     • Self-contained: no screen or surface needed; can be one-handed or no-handed
     • Good if users are involved in something else, e.g. carrying, operating machinery
     • Many sensors are included in devices already; others are easily added via Bluetooth
  13. Multi-touch gestures
     • On-screen gestures
     • Tactile guidance for gestures: T-bars
     • Dynamic feedback: keep the finger on target
     • File-o-feel, Touch-n-twist
  14. Head gesture interaction
     • A non-visual interface where users could nod at audio sources to select them: hands-free
     • 3D audio for output (more on the audio design later)
     • Worked well when users were mobile: people could easily nod and walk
     • Backward nods were not ideal
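The nod-to-select idea above can be sketched as a simple detector over head-pitch samples: a forward nod is a pitch excursion past a threshold and back. The threshold values and function name are assumptions for illustration; the talk only reports that forward nods worked well.

```python
# Illustrative nod detector (not the system's actual algorithm).
# A nod = head pitch exceeds a threshold, then returns near neutral.

def detect_nod(pitch_samples_deg: list[float], threshold: float = 15.0) -> bool:
    """True if the head pitched forward past the threshold and returned."""
    went_down = False
    for pitch in pitch_samples_deg:
        if pitch > threshold:
            went_down = True                      # forward phase of the nod
        elif went_down and abs(pitch) < threshold / 3:
            return True                           # returned near neutral
    return False
```

A real implementation would also need to reject the slow pitch drift that occurs while walking, e.g. by requiring the excursion to complete within a short time window.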
  15. Wrist gestures
     • Rotating the wrist can control a remote cursor
     • Investigated whether users could select targets using the wrist: very effective, 90% accuracy for 9° targets
     • Other interactions: shoulder click, foot tap, head nod, body tap, ...
     • More info: www.gaime-project.org
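The 9° targets reported above suggest a straightforward mapping from wrist angle to target index. A minimal sketch, assuming targets laid out as contiguous 9°-wide sectors (the layout and names are my assumptions, not from the talk):

```python
# Hypothetical wrist-angle-to-target mapping with 9-degree-wide targets.

TARGET_WIDTH_DEG = 9.0   # angular width of each target, from the study above
NUM_TARGETS = 9          # illustrative number of targets

def select_target(wrist_angle_deg: float):
    """Map a wrist rotation angle to a target index, or None if out of range."""
    if wrist_angle_deg < 0 or wrist_angle_deg >= TARGET_WIDTH_DEG * NUM_TARGETS:
        return None
    return int(wrist_angle_deg // TARGET_WIDTH_DEG)
```

With this layout, 81° of comfortable wrist rotation yields nine eyes-free targets.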
  16. Do gesture systems work in the wild?
     • Gesture RSS system: allows browsing of news feeds
     • 7 participants used the system on their morning commute for a week
     • Interaction:
        • Menu up/down/select: rotate/shake the right wrist
        • Back up a level: shake the left wrist
        • Gating the gestures: rotate the left wrist upside down
     • Non-visual interaction: speech and non-speech audio
  18. PRESSURE INPUT
  19. Pressure input
     • Little studied in HCI, but a rich source of input and control: musical instruments, drawing, holding/grasping
     • Can we use pressure as another input mechanism?
     • Avoids the 'fat finger' problem by doing gestures in the z dimension
     • No need for (x, y) positioning of the finger, so easy to do eyes free
  20. Pressure menus
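One plausible reading of a pressure menu is that a continuous pressure reading selects one of several stacked items in the z dimension. A sketch under that assumption, with a small hysteresis band so sensor noise near a boundary does not flicker between items (all names and thresholds are invented for illustration):

```python
# Illustrative pressure menu: normalised pressure in [0, 1] selects an item.

class PressureMenu:
    def __init__(self, n_items: int, hysteresis: float = 0.03):
        self.n_items = n_items
        self.hysteresis = hysteresis  # extra margin before switching items
        self.current = 0

    def update(self, pressure: float) -> int:
        """Return the selected item index for the latest pressure reading."""
        band = 1.0 / self.n_items
        lo = self.current * band - self.hysteresis
        hi = (self.current + 1) * band + self.hysteresis
        # Only change selection once pressure clearly leaves the current band.
        if pressure < lo or pressure > hi:
            self.current = max(0, min(int(pressure / band), self.n_items - 1))
        return self.current
```

Because selection happens in z, the finger never needs (x, y) targeting, which is what makes the technique eyes-free.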
  21. Grip and grasp
     • Can we use the way we grip a device to control it? Can we use this for interaction?
     • Turn a two-handed interaction into a one-handed version
  22. (video slide: 52 sec)
  23. Grip results
     • Compared rotate and zoom: pinch/rotate using multitouch and two hands vs. grip
     • One-handed grip was equal to or better than the traditional method
        • Less time hunting for small buttons, no finger occlusion, no 'fat finger' problem
     • Also works well when walking
     • Squeezing devices is very effective for input
  24. HAPTIC FEEDBACK
  25. Haptic feedback
     • Haptics: to do with the sense of touch; display to the skin
     • Many different components: pressure, temperature, vibration, ...
     • Has benefits over a visual display: eyes-free
     • Tactons: tactile icons
  26. Design of Tactons
     • Tactons are tactile messages that can be used to communicate non-visually
     • Encode information using parameters of cutaneous perception:
        • Waveform
        • Duration/rhythm
        • Body location
  27. Tactile button feedback
     • Touchscreen phones have no tactile feedback for buttons: more errors typing text and numbers
     • Compared performance of real buttons vs. touchscreen vs. touchscreen + tactile, in the lab and on the Glasgow subway
     • Touchscreen + tactile was as good as real buttons; touchscreen alone was poor
     • Also combined tactile + audio feedback
  28. Tactile feedback for typing
     • Previous studies showed that adding tactile feedback to touchscreen typing increases performance
     • Can we use the tactile feedback to communicate more? An ambient display:
        • Change the feel of buttons based on an external factor, e.g. arrival of email or proximity of a friend
        • Duration indicated proximity; roughness indicated friend or family
     • Users could identify the meaning very accurately while typing
  29. Artex: artificial textures from everyday surfaces
     • Addresses the lack of tactile feedback on touchscreens
     • Goal: multiple texture patches for texturing the screen, aiming to feel like a familiar texture rather than a tactile effect
     • Texturing with everyday textures:
        • Record the texture using a contact mic attached to a stylus
        • Process it into a loopable audio file
        • Vary amplitude and playback rate with the user's finger speed over the screen
  31. Temperature-based interaction
     • Temperature is an unused part of touch: can we use it for communication?
     • Humans have a very strong emotional response to temperature and are very sensitive to it
     • A key technique for determining material properties
     • Children's hotter/colder game
  32. Temperature
     • Peltier device: 4 heat pumps (2 pairs of hot and cold); can be mobile or desk based
     • Ran a detailed series of psychophysical studies to investigate the ranges of temperatures that should be used
     • Also tested these while mobile to see more real-world effects
  34. Indoor mobile thermal study
  35. Effects of changing environment (front of School vs. back of School)
  36. Design recommendations
     • The palm is most sensitive, but the wrist and arm are acceptable
     • Stimulus intensities should be at least 3°C to guarantee detection, but at most 6°C for cooling and below 6°C for warming to ensure comfort
     • Both warm and cool stimuli are detectable and comfortable, but cool stimuli are preferred; cool is detected fastest
     • A moderate rate of change (2-3°C/sec) provides good saliency, but a lower rate of change is required for high-intensity stimuli
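The intensity and rate recommendations above can be encoded as a simple validity check for a thermal stimulus. The thresholds come straight from the slide; the function name and structure are illustrative.

```python
# Check a thermal stimulus against the slide's design recommendations.

def thermal_stimulus_ok(delta_c: float, rate_c_per_s: float) -> bool:
    """delta_c: change in deg C (positive = warming, negative = cooling);
    rate_c_per_s: rate of change in deg C per second."""
    magnitude = abs(delta_c)
    if magnitude < 3.0:                    # below 3 deg C, detection not guaranteed
        return False
    if delta_c < 0 and magnitude > 6.0:    # cooling beyond 6 deg C: uncomfortable
        return False
    if delta_c > 0 and magnitude >= 6.0:   # warming must stay below 6 deg C
        return False
    if rate_c_per_s > 3.0:                 # beyond the moderate 2-3 deg C/s range
        return False
    return True
```

A fuller version would also lower the allowed rate for high-intensity stimuli, per the last recommendation.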
  37. Applications
     • Thermal icons
     • Enhancing emotional experiences: thermal feedback can enhance the experience of consuming media (images, music)
     • Notifications and warnings
  39. Ultrasound haptics
     • New project with the University of Bristol
     • Using phased-array ultrasound beams, we can create pressure waves in the air: 'feelable' forces above the emitters
     • Non-contact haptics
     • Can also be used to support (light) objects in the air
  43. Levitation!
  44. Ultrasound haptics: challenges
     • Position the array around the edges of a device to create feedback
     • Combine with a Kinect depth camera for non-contact input and output
     • Texture design, ...
     • Just beginning to see the applications
  45. AUDIO FEEDBACK
  46. Non-speech audio feedback
     • Music, structured sound, sound effects, natural sound
     • Why non-speech sound? Icons vs. text, non-speech vs. speech; good for rapid non-visual feedback, trends, highly structured information
     • Earcons: structured non-speech sounds
     • Musicons: short snippets of well-known music used for interaction; people are very good at recognising music
  47. 3D audio interaction
     • Need to increase the audio display space to deliver more information; it is quick to use up display space
     • 3D audio provides a larger display area and lets users monitor more sound sources
     • Non-individualised HRTFs, headphones; planar sound (2.5D)
     • 'Audio windows': each application gets its own part of the audio space
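The 'audio windows' idea can be sketched as splitting the planar (2.5D) space into one azimuth slice per application. The frontal 180° span and equal-slice layout below are my assumptions; the talk does not specify exact angles.

```python
# Sketch of audio windows: each app gets the centre of an equal azimuth slice.

def audio_window_azimuths(apps: list[str], span_deg: float = 180.0) -> dict:
    """Place each app at the centre of an equal slice of the frontal span.

    Azimuths are in degrees, 0 = straight ahead, negative = listener's left."""
    slice_deg = span_deg / len(apps)
    start = -span_deg / 2                      # begin at the far left
    return {app: start + slice_deg * (i + 0.5) for i, app in enumerate(apps)}
```

Each app's sounds would then be rendered through an HRTF panner at its assigned azimuth, so the listener can separate streams by direction alone.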
  48. How do we use spatial audio?
     • Applications: progress indicator, diary, pie menus, non-visual navigation
     • Combines well with gesture: both are spatial, and pointing/orienting towards a sound is natural
  49. AudioFeeds
     • Mobile application for monitoring activity in social media: monitoring the state of feeds, spotting peaks of activity in one feed
     • Twitter, FaceBook, RSS
     • Spatialised sound: each type of activity placed in a different location
     • Each type had a different sound; within that, different actions had related sounds
  50. AudioFeeds
     • Users were able to monitor feeds and maintain an overview, even with complex soundscapes and when mobile
     • Sound mappings:
        • FaceBook (water): inbox msg (splash), news feed (bubbles), notification (pouring), friend request (drops)
        • Twitter (birds): friend feed (chirp), direct msg (crow), reference (junglefowl), hashtag (canary)
        • RSS (abstract instruments): CNN (didgeridoo), BBC (zither), TechCrunch (wind chime), Uni News (pan flute)
  51. Pulse: an auditory display to present a social vibe
     • Presents the 'vibe' or 'pulse' of an area while you move through it, by 'playing' geo-located tweets
     • Sonification, presented around the user in 3D sound:
        • Message volume (water splashes)
        • Message density (flow rate of river)
        • Topic diversity (bubbling sound)
     • Tested in the lab and in Edinburgh during the festival: effective at giving awareness
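The three Pulse mappings above can be sketched as a function from tweet-stream statistics to normalised sonification parameters. The normalisation ranges are invented for illustration; only the parameter-to-sound pairings come from the slide.

```python
# Sketch of the Pulse sonification: tweet activity -> sound parameters in [0, 1].

def pulse_soundscape(msgs_per_min: float, n_topics: int) -> dict:
    """Map geo-located tweet activity to the three Pulse sound parameters."""
    splash = min(msgs_per_min / 30.0, 1.0)     # message volume -> water splashes
    flow_rate = min(msgs_per_min / 60.0, 1.0)  # message density -> river flow
    bubbling = min(n_topics / 10.0, 1.0)       # topic diversity -> bubbling
    return {"splash": splash, "flow_rate": flow_rate, "bubbling": bubbling}
```

Because all three parameters vary continuously, the soundscape conveys the area's activity level without demanding attention to any individual tweet.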
  52. Conclusions
     • In situations where eyes (and hands) are not available, current display/interaction techniques are impossible to use
     • Multimodal interaction can provide new tools for designers; new input techniques are needed along with non-visual outputs
     • Gestures are good as input and can be 'hands free'; sound and tactile feedback are 'eyes free'
     • It is hard to overcome the lack of a visual display, but multimodal interaction techniques provide new opportunities and applications
  53. Enabling non-visual interaction. Stephen Brewster, Glasgow Interactive Systems Group, University of Glasgow. stephen@dcs.gla.ac.uk, www.dcs.gla.ac.uk/~stephen. June 2013