Multimodal, Crossmedia, Multi-Platform

  1. UXD minor theme ‘Multimodal, Crossmedia and Multi-Platform’: INPUT MODALITIES
  2. Theme program
     March 23: ‘Input modalities’ (Hans), workshop and assignment kick-off
     March 30: ‘Output modalities’ (Rolf) and assignment progress
     April 6: Workshop with Pieter Jongerius (Fabrique)
     April 13: No class, Easter
     April 20: Final presentations of the assignment
  3. Theme in the scheme of things
     Media, modalities and platforms provide us with the nuts and bolts of the user experience. The quality of the user experience is determined by our ability to utilize the media, modalities and platforms at our disposal.
  4. Crossmedia
     ‘Crossmedia (also known as Cross-Media, Cross-Media Entertainment, Cross-Media Communication) is a media property, service, story or experience distributed across media platforms using a variety of media forms.’
     http://en.wikipedia.org/wiki/Crossmedia
  5. Multi-platform
     ‘In computing, cross-platform (also known as multi-platform) is a term used to refer to computer software or computing methods and concepts that are implemented and interoperate on multiple computer platforms.’
     http://en.wikipedia.org/wiki/Multiplatform
  6. Multimodal
     ‘Multimodal interaction provides the user with multiple modes of interfacing with a system beyond the traditional keyboard and mouse input/output.’
     http://en.wikipedia.org/wiki/Multimodal_interaction
  7. Modality
     ‘A modality is a path of communication between the human and the computer.’
     http://en.wikipedia.org/wiki/Modality_(human-computer_interaction)
  8. Input modalities and output modalities
     ‘In human-computer interaction, a modality is the general class of:
      • a sense through which the human can receive the output of the computer (for example, vision modality)
      • a sensor or device through which the computer can receive the input from the human’
     http://en.wikipedia.org/wiki/Modality_(human-computer_interaction)
  9. Output modalities (computer-to-human)
     ‘Any human sense can be translated to a modality:
      • Major modalities
        • Seeing or vision modality
        • Hearing or audition modality
      • Haptic modalities
        • Touch, tactile or tactition modality — the sense of pressure
        • Proprioception modality — the perception of body awareness
      • Other modalities
        • Taste or gustation modality
        • Smell or olfaction modality
        • Thermoception modality — the sense of heat and the cold
        • Nociception modality — the perception of pain
        • Equilibrioception modality — the perception of balance’
     http://en.wikipedia.org/wiki/Modality_(human-computer_interaction)
  10. Input modalities (human-to-computer)
     An input device is any peripheral (piece of computer hardware equipment) used to provide data and control signals to an information processing system (such as a computer).
     http://en.wikipedia.org/wiki/Input_devices
  11. Pointing devices
     Ivan Sutherland (MIT) demoing Sketchpad (1962) (introduced by Alan Kay in 1987)
  12. Pointing devices
     ‘Pointing devices are input devices used to specify a position in space.
      • Direct/indirect
      • Absolute/relative’
     http://en.wikipedia.org/wiki/Input_devices
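     A touchscreen or graphics tablet reports absolute positions (a point on the sensor corresponds directly to a point on the screen), while a mouse or trackpad reports relative displacements that are added to the current cursor position. A minimal Python sketch of the difference, using made-up screen and sensor dimensions rather than any particular device:

        # Sketch: absolute vs. relative pointer mapping (illustrative values only)
        SCREEN_W, SCREEN_H = 1920, 1080      # display resolution in pixels
        SENSOR_W, SENSOR_H = 100.0, 56.0     # tablet/touch sensor size in mm

        def absolute_to_screen(sensor_x_mm, sensor_y_mm):
            """Absolute device (touchscreen, tablet): sensor position maps 1:1 to a screen pixel."""
            return (sensor_x_mm / SENSOR_W * SCREEN_W,
                    sensor_y_mm / SENSOR_H * SCREEN_H)

        def relative_to_screen(cursor_x, cursor_y, dx, dy, gain=2.0):
            """Relative device (mouse, trackpad): the displacement is scaled by a gain
            ('pointer speed') and added to the current cursor, clamped to the screen."""
            x = min(max(cursor_x + dx * gain, 0), SCREEN_W - 1)
            y = min(max(cursor_y + dy * gain, 0), SCREEN_H - 1)
            return (x, y)
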
  13. Fitts’ law
     ‘The time it takes to move from a starting position to a final target is determined by the distance to the target and the size of the object.’ (Saffer, 2007)
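     Saffer’s one-sentence version corresponds to the common Shannon formulation of Fitts’ law, MT = a + b * log2(D/W + 1), where D is the distance to the target, W is the target’s width along the direction of movement, and a and b are empirically fitted constants. A small Python sketch (the constants below are arbitrary placeholders, not measured values):

        import math

        def fitts_movement_time(distance, width, a=0.1, b=0.15):
            """Predicted movement time in seconds (Shannon formulation of Fitts' law).
            a and b are device- and user-dependent; the defaults here are made up."""
            index_of_difficulty = math.log2(distance / width + 1)   # in bits
            return a + b * index_of_difficulty

        # Doubling a target's width at the same distance lowers the predicted time:
        print(fitts_movement_time(distance=500, width=20))   # small, far-away target
        print(fitts_movement_time(distance=500, width=40))   # same distance, larger target
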
  14. Pointing devices
     And you can point at more than merely pixels on a screen…
  15. Alphanumeric input: keyboards
  16. Alphanumeric input: keyboards
  17. Alphanumeric input: keyboards
  18. Alphanumeric input: speech recognition
     • Speaker dependent/independent
     • Discrete-word/connected-word input
     • Limited/large vocabulary
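     As a rough illustration of what a speaker-independent, large-vocabulary, connected-word recognizer looks like from the prototyping side, here is a sketch built on the third-party SpeechRecognition package (with PyAudio for microphone access); these libraries are an assumption for the example, not part of the course setup:

        # Sketch: capture one utterance from the microphone and print the recognized text.
        # Requires: pip install SpeechRecognition pyaudio
        import speech_recognition as sr

        recognizer = sr.Recognizer()
        with sr.Microphone() as source:
            recognizer.adjust_for_ambient_noise(source)   # calibrate for background noise
            print("Say something...")
            audio = recognizer.listen(source)

        try:
            # Google's web API: speaker-independent, large vocabulary, connected words
            print("You said:", recognizer.recognize_google(audio))
        except sr.UnknownValueError:
            print("Could not understand the audio")
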
  19. Alphanumeric input: handwriting recognition
     ‘Recognition’ patents as early as 1914
     ‘Electronic ink’ and recognition in Vista
     http://www.freepatentsonline.com/1117184.pdf
  20. Pen Computing
     ‘The return of the pen’
     Switching modes: ‘pointing’ vs. ‘ink’
  21. Tap is the New Click
     “One of the things our grandchildren will find quaintest about us is that we distinguish the digital from the real.”
     William Gibson (from: Saffer, 2009)
  22. Ubiquitous computing
     ‘Ubiquitous computing (ubicomp) is a post-desktop model of human-computer interaction in which information processing has been thoroughly integrated into everyday objects and activities.’
     http://en.wikipedia.org/wiki/Ubiquitous_computing
  23. Wearable computing
     ‘Wearable computers are computers that are worn on the body.’
     http://en.wikipedia.org/wiki/Wearable_computer
  24. Tangible user interfaces
     Hiroshi Ishii (MIT)
  25. Sketching Mobile Experiences
     Workshop in ‘Design This!’
  26. Gestural Interfaces
     Touchscreen vs. Free-form
  27. Ergonomics of Interactive Gestures
     “Hands are underrated. Eyes are in charge, mind gets all the study, and heads do all the talking. Hands type letters, push mice around, and grip steering wheels, so they are not idle, just underemployed.”
     Malcolm McCullough, Abstracting Craft (from: Saffer, 2009)
  28. Ergonomics of Interactive Gestures
     • Limitations due to anatomy, physiology and mechanics of the human body (kinesiology)
     • Left-handedness (7-10%)
     • Fingernails
     • Screen Coverage
  29. Designing Touch Targets
     No smaller than 1 x 1 cm in an ideal world
     In a not-so-ideal world: Iceberg Tips, Adaptive Targets
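     The 1 x 1 cm guideline is a physical size, so its pixel equivalent depends on the display’s pixel density. A quick illustrative conversion in Python:

        # Sketch: minimum touch-target size in pixels for a given screen density
        def min_target_px(dpi, target_cm=1.0):
            """1 inch = 2.54 cm, so pixels = dpi * cm / 2.54."""
            return round(dpi * target_cm / 2.54)

        print(min_target_px(dpi=163))   # about 64 px on a 163 dpi phone screen
        print(min_target_px(dpi=96))    # about 38 px on a 96 dpi desktop monitor
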
  30. Designing Touch Targets
     But even spaciously sized targets can be tricky…
  31. Patterns for Touchscreens and Interactive Surfaces
     Tap to open/activate
  32. Patterns for Touchscreens and Interactive Surfaces
     Tap to select
  33. Patterns for Touchscreens and Interactive Surfaces
     Drag to move object
  34. Patterns for Touchscreens and Interactive Surfaces
     Slide to scroll
  35. Patterns for Touchscreens and Interactive Surfaces
     Spin to scroll
  36. Patterns for Touchscreens and Interactive Surfaces
     Pinch to shrink and spread to enlarge
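     These touchscreen patterns come down to tracking one or two contact points over time; pinch and spread, for example, are usually implemented by comparing the distance between two touches from one frame to the next. A minimal Python sketch with hypothetical touch tuples:

        import math

        def pinch_scale(prev_touches, curr_touches):
            """Scale factor for a two-finger pinch/spread gesture.
            Each argument is a pair of (x, y) touch positions; a result > 1 means
            spread (enlarge), a result < 1 means pinch (shrink)."""
            (ax, ay), (bx, by) = prev_touches
            (cx, cy), (dx, dy) = curr_touches
            prev_dist = math.hypot(bx - ax, by - ay)
            curr_dist = math.hypot(dx - cx, dy - cy)
            return curr_dist / prev_dist if prev_dist else 1.0

        # Fingers moving from 100 px apart to 150 px apart -> scale the object by 1.5
        print(pinch_scale([(0, 0), (100, 0)], [(0, 0), (150, 0)]))
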
  37. Patterns for Free-Form Interactive Gestures
     Proximity activates/deactivates
  38. Patterns for Free-Form Interactive Gestures
     Point to select/activate
  39. Patterns for Free-Form Interactive Gestures
     Rotate to change state
  40. Patterns for Free-Form Interactive Gestures
     Shake to change
  41. Patterns for Free-Form Interactive Gestures
     Tilt to move
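     Free-form patterns such as ‘tilt to move’ are typically driven by accelerometer or gyroscope readings rather than touch points. A rough Python sketch mapping device tilt to on-screen movement; the sensor values and ranges are assumptions for illustration, not a specific device API:

        import math

        def tilt_to_velocity(accel_x, accel_y, accel_z, max_speed=300.0):
            """Map device tilt (3-axis accelerometer readings in g) to a 2D velocity in px/s.
            Holding the device flat gives (0, 0); tilting it moves the object that way."""
            roll = math.atan2(accel_x, accel_z)     # tilt left/right, in radians
            pitch = math.atan2(accel_y, accel_z)    # tilt forward/back, in radians
            clamp = lambda angle: max(-math.pi / 4, min(math.pi / 4, angle))
            return (max_speed * clamp(roll) / (math.pi / 4),
                    max_speed * clamp(pitch) / (math.pi / 4))

        # Device tilted about 23 degrees to the right, held level front-to-back:
        print(tilt_to_velocity(accel_x=0.4, accel_y=0.0, accel_z=0.92))
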
  42. Interesting demos
  43. Reader
     Wearable computers:
       Steve Mann. Eyetap.org. http://about.eyetap.org/
     Ubiquitous computing:
       Mark Weiser (1991). The Computer for the 21st Century. http://www.ubiq.com/hypertext/weiser/SciAmDraft3.html
       Adam Greenfield (2006). Everyware: The Dawning Age of Ubiquitous Computing. New Riders, Berkeley, CA.
       Donald Norman (1998). The Invisible Computer: Why Good Products Can Fail, The Personal Computer Is so Complex, and Information Appliances Are the Solution. The MIT Press, Cambridge, MA.
  44. Reader
     Input devices:
       Doug Engelbart (1968). The mother of all demos. Google video stream.
       Wikipedia. http://en.wikipedia.org/wiki/The_Mother_of_All_Demos
  45. Reader
     Fitts’ Law:
       Dan Saffer (2007). Designing for Interaction: Creating Smart Applications and Clever Devices. New Riders, Berkeley, CA. (page 53)
     Speech recognition:
       Microsoft. Microsoft Speech Technologies. http://www.microsoft.com/speech/speech2007/default.mspx
  46. Reader
     Handwriting recognition:
       Wacom. Unleash Windows Vista With A Pen. http://www.wacom.com/vista/index.php
     Gestural Interfaces:
       Dan Saffer (2009). Designing Gestural Interfaces. O’Reilly Media, Sebastopol, CA.
     Ergonomics:
       Henry Dreyfuss (1955). Designing for People. Allworth Press, New York, NY.
  47. Theme assignment
  48. Today’s workshop assignment
     • Work together in teams of 2-3 students on one input device
     • Each team will be investigating the following:
       • What’s the typical application of this device?
       • What are typical patterns applied with this device?
       • How can this device connect to a computer?
       • What driver or other software is available for this device?
       • How can I adjust the parameters of this device?
       • How can I create application prototypes with this device? (a minimal starting sketch follows the device list below)
     • Build a simple demonstrator for the device, using your laptop computer
     • Analyze the user experience with your demonstrator
     • Present your demonstrator at the end of the afternoon
     • Document your findings in a pdf document
     • Link the document to a post on your blog
  49. Today’s workshop assignment
     Available devices:
       • Touch screen (2)
       • Wii mote (4)
       • Xbox USB controller (2)
       • Wacom (3)
       • Web cam (5)
       • SpaceNavigator (1)
       • Presenter (3)
       • Smartboard (1)
       • iPhone (?)
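     For the devices that show up as a standard game controller (the Xbox USB controller, and a Wiimote paired through a suitable driver), one possible starting point is pygame’s joystick module; this sketch assumes pygame is installed and is only one of many ways to build the demonstrator:

        # Sketch: polling a game controller's axes and buttons with pygame
        # Requires: pip install pygame
        import pygame

        pygame.init()
        pygame.joystick.init()
        if pygame.joystick.get_count() == 0:
            raise SystemExit("No controller found")

        stick = pygame.joystick.Joystick(0)
        stick.init()
        print("Using:", stick.get_name())

        while True:
            pygame.event.pump()                       # let pygame refresh device state
            x = stick.get_axis(0)                     # left stick, horizontal (-1..1)
            y = stick.get_axis(1)                     # left stick, vertical (-1..1)
            pressed = [b for b in range(stick.get_numbuttons()) if stick.get_button(b)]
            print(f"x={x:+.2f} y={y:+.2f} buttons={pressed}")
            pygame.time.wait(100)                     # poll about 10 times per second
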