Tangible Interfaces


    1. 4. Tangible Interfaces
    2. Main Source
       • Kenneth P. Fishkin, A Taxonomy for and Analysis of Tangible Interfaces. Personal and Ubiquitous Computing, 8 (5), September 2004, 347-358.
    3. Basics
       • CHI 1997, Hiroshi Ishii & Brett Ullmer: “Tangible User Interfaces” (TUIs) = user interfaces that “augment the real physical world by coupling digital information to everyday physical objects and environments”.
       • Later refined: No distinction between input and output (Ullmer & Ishii, 2001)
       • Related terms: graspable, embodied, manipulative, haptic
    4. Definitions
       • hap-tic
         - 1. relating to or based on the sense of touch
         - 2. characterized by a predilection for the sense of touch <a haptic person>
       • tac-tile
         - 1. perceptible by touch: TANGIBLE
         - 2. of or relating to the sense of touch
       • tan-gi-ble
         - 1.a: capable of being perceived especially by the sense of touch: PALPABLE
       Source: Merriam-Webster
    5. Example of a Tangible Interface 1
       • Hug over a distance
       Source: Müller et al., 2005
    6. Already Commercialized: Hug Shirt!
       • Shirts with sensors and actuators
       • Works via cell phones
         - Java software
         - Bluetooth
       • Participatory design
         - Taxonomy of hugs
         - Bodystorming (people hugging for a long time)
       Source: http://www.cutecircuit.com/now/projects/wearables/fr-hugs/
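       A minimal sketch of the sensor-to-actuator path described above, in Java since the slide mentions Java software on the phones. HugSensorArray, BluetoothLink and HugActuators are hypothetical interfaces; CuteCircuit's actual software is not public, so every name and the byte-per-pad encoding here are invented for illustration.

       // Hypothetical sketch: shirt sensors -> phone (Bluetooth) -> partner's shirt actuators.
       interface HugSensorArray { int[] readPressures(); }      // one reading per sensor pad
       interface BluetoothLink  { void send(byte[] payload); }  // the phone relays this to the partner
       interface HugActuators   { void replay(int[] pressures); }

       class HugShirtSketch {
           private final HugSensorArray sensors;
           private final BluetoothLink phone;

           HugShirtSketch(HugSensorArray sensors, BluetoothLink phone) {
               this.sensors = sensors;
               this.phone = phone;
           }

           // Sender side: sample the pads and ship the "hug" to the paired phone.
           void captureAndSendHug() {
               int[] pressures = sensors.readPressures();
               byte[] payload = new byte[pressures.length];
               for (int i = 0; i < pressures.length; i++) {
                   payload[i] = (byte) Math.min(255, Math.max(0, pressures[i]));
               }
               phone.send(payload);
           }

           // Receiver side: the partner's shirt replays the pressure pattern as warmth/vibration.
           static void onHugReceived(byte[] payload, HugActuators actuators) {
               int[] pressures = new int[payload.length];
               for (int i = 0; i < payload.length; i++) pressures[i] = payload[i] & 0xFF;
               actuators.replay(pressures);
           }
       }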
    7. Example of a Tangible Interface 2
       • I/O Brush: the world as the palette
       Source: Ryokai et al., 2004
    8. Development of the I/O Brush
       • Robust design
       • Pressure sensing
       • Orientation sensing
       • Smudging support
       • Paint history
       • Paint palette
       Source: Ryokai et al., 2005
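       A rough sketch of the pressure-sensed "pick up" step behind the world-as-palette idea: while the bristles are pressed against a surface, sample the camera hidden in the brush and keep the result as the current ink. BrushCamera, PressureSensor and Canvas are invented names, and averaging the frame to one colour is a simplification of the published brush, which can also pick up texture and movement.

       // Hypothetical sketch of picking up "ink" with the I/O Brush.
       interface BrushCamera    { int[][] captureRgb(); }    // small frame behind the bristles, packed RGB
       interface PressureSensor { double read(); }           // 0.0 = not touching .. 1.0 = firm press
       interface Canvas         { void dab(int x, int y, int rgb); }

       class IoBrushSketch {
           private int currentPaint = 0xFF000000;            // last colour picked up from the world

           // Picking up ink: only sample while the bristles are pressed against something.
           void pickUp(BrushCamera camera, PressureSensor pressure) {
               if (pressure.read() < 0.2) return;            // threshold is an assumption
               long r = 0, g = 0, b = 0, n = 0;
               for (int[] row : camera.captureRgb()) {
                   for (int px : row) {
                       r += (px >> 16) & 0xFF; g += (px >> 8) & 0xFF; b += px & 0xFF; n++;
                   }
               }
               if (n > 0) {
                   currentPaint = 0xFF000000 | (int) (r / n) << 16 | (int) (g / n) << 8 | (int) (b / n);
               }
           }

           // Painting: dab the picked-up colour wherever the brush touches the display.
           void paintAt(Canvas canvas, int x, int y) {
               canvas.dab(x, y, currentPaint);
           }
       }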
    9. Sample Art Work
       Source: Ryokai et al., 2005
    10. Fishkin’s Framework
       • Some input event occurs. This input event is typically a physical manipulation performed by a user with their hands on some “everyday physical object”, such as tilting, shaking, squeezing, pushing, or most often moving.
       • A computer system senses this input event, and alters its state.
       • The system provides feedback. This output event is via a change in the physical nature of some object – it alters its display surface, grows, shrinks, makes a sound, gives haptic feedback, etc.
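       Read as code, Fishkin's three steps are just an event loop: poll for a manipulation, update internal state, push the change back out through something physical. The sketch below uses invented type names (PhysicalInput, PhysicalOutput, SystemState, Manipulation); the paper itself prescribes no implementation.

       // Fishkin's three steps as a minimal event loop. All type names are invented.
       enum Manipulation { TILT, SHAKE, SQUEEZE, PUSH, MOVE, NONE }

       interface PhysicalInput  { Manipulation poll(); }             // manipulation of an everyday object
       interface PhysicalOutput { void render(SystemState state); }  // display change, growth, sound, haptics...

       class SystemState {
           int value;
           void apply(Manipulation m) {
               if (m == Manipulation.SHAKE) value = 0;   // example rule only
               else if (m != Manipulation.NONE) value++;
           }
       }

       class TangibleLoop {
           void run(PhysicalInput in, PhysicalOutput out, SystemState state) {
               while (true) {
                   Manipulation m = in.poll();   // 1. an input event: a physical manipulation
                   state.apply(m);               // 2. the system senses it and alters its state
                   out.render(state);            // 3. feedback via a physical change in some object
               }
           }
       }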
    11. Example 3: metaDESK
       • “The Great Dome”
       Source: Ishii & Ullmer, 1997b
    12. Typical Components
       Source: Ishii & Ullmer, 1997b
    13. Great Dome: Input and Output
       • Input: rotation and translation
       • Input object: indicative of a building
       • Output: altered display of the workspace
       • Output object: augmented desktop
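       One way to picture “rotation and translation alter the display”: the desk keeps the campus map registered to the phicon, so moving or turning the model moves or turns the map. PhiconTracker and MapView are hypothetical names; the actual metaDESK implementation is not described on these slides.

       // Hypothetical sketch: keep the map locked under the Great Dome phicon.
       interface PhiconTracker { double[] pose(); }   // {x, y, angleRadians} on the desk surface
       interface MapView       { void setTransform(double x, double y, double angleRadians); }

       class GreatDomeBinding {
           private final PhiconTracker tracker;
           private final MapView map;

           GreatDomeBinding(PhiconTracker tracker, MapView map) {
               this.tracker = tracker;
               this.map = map;
           }

           // Called every frame: translating or rotating the model translates or rotates the map.
           void update() {
               double[] p = tracker.pose();
               map.setTransform(p[0], p[1], p[2]);
           }
       }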
    14. Example 4: Shakepad
       • Inspiration: many keychain devices, all with lots of buttons
       • Shaking erases the screen
       Source: Levin & Yarin, 1999
    15. Shakepad: Input and Output
       • Input: shake
       • Input object: “non-everyday” object
       • Output: clear the screen
       • Output object: same as input object
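       A hedged sketch of how “shake clears the screen” could be detected: treat a few consecutive high-magnitude accelerometer samples as a shake. Accelerometer and Screen are invented interfaces, and the threshold is a guess rather than Levin & Yarin's implementation.

       // Hypothetical shake detector: sustained high acceleration wipes the drawing surface.
       interface Accelerometer { double[] read(); }   // {ax, ay, az} in units of g
       interface Screen        { void clear(); }

       class ShakepadSketch {
           private static final double SHAKE_THRESHOLD_G = 1.8;  // assumed value
           private static final int    SHAKE_SAMPLES     = 3;    // consecutive samples required
           private int energeticSamples = 0;

           // Call at the sensor's sampling rate.
           void onSample(Accelerometer accel, Screen screen) {
               double[] a = accel.read();
               double magnitude = Math.sqrt(a[0] * a[0] + a[1] * a[1] + a[2] * a[2]);
               if (magnitude > SHAKE_THRESHOLD_G) {
                   if (++energeticSamples >= SHAKE_SAMPLES) {
                       screen.clear();          // output object is the input object (full embodiment)
                       energeticSamples = 0;
                   }
               } else {
                   energeticSamples = 0;
               }
           }
       }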
    16. Example 5: ToonTown
       • Shared audio system
       • Location on tray controls volume
       Source: Singer et al., 1999
    17. ToonTown: Input and Output
       • Input: change of position
       • Input objects: two representative objects
       • Output: audio changes
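       One plausible reading of “location on tray controls volume”: the closer a participant's figurine sits to a focus spot on the tray, the louder that participant's stream in the shared audio space. The AudioMixer interface, the tray radius and the linear falloff are all assumptions made for this sketch.

       // Hypothetical sketch: figurine position -> per-participant volume.
       import java.util.Map;

       interface AudioMixer { void setVolume(String participant, double volume); } // 0.0 .. 1.0

       class ToonTownSketch {
           private static final double TRAY_RADIUS_M = 0.30;   // assumed tray size
           private final AudioMixer mixer;

           ToonTownSketch(AudioMixer mixer) { this.mixer = mixer; }

           // positions: participant id -> {x, y} of their figurine relative to the focus spot.
           void update(Map<String, double[]> positions) {
               for (Map.Entry<String, double[]> e : positions.entrySet()) {
                   double[] p = e.getValue();
                   double distance = Math.sqrt(p[0] * p[0] + p[1] * p[1]);
                   double volume = Math.max(0.0, 1.0 - distance / TRAY_RADIUS_M);  // linear falloff
                   mixer.setVolume(e.getKey(), volume);
               }
           }
       }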
    18. Example 6: Photo Cubes
       Source: Want et al., 1999
    19. Photo Cubes: Input and Output
       • Input: spatial motion with orientation
       • Input objects: one everyday object
       • Output: altered display
       • Output objects: non-everyday object
    20. What About These?
       • Wellner, 1993: DigitalDesk
       • Ishii, 1995: Tangible Bricks
    21. Sorting Out the Design Space
       • Taxonomy: “orderly classification of plants and animals according to their presumed natural relationships” (Merriam-Webster)
       • Taxonomy of Tangible User Interfaces
       • Two-dimensional
         - Embodiment
         - Metaphor
       • The higher an interface is in both dimensions, the more tangible it is
    22. Embodiment
       • Reflects how closely output is tied to input
       • Does the user have the illusion of the system state being “inside” the object that is manipulated?
       • Fishkin proposes a four-level scale
         - Full
         - Nearby
         - Environmental
         - Distant
    23. Full Embodiment
       • The output device is the input device
       • Analogy: clay sculpting
       • Examples:
         - Rekimoto, 1996: Tilting interface
    24. Gummi
       Source: Schwesig et al., 2004
       Commercial version by Sony: http://www.sony.net/Fun/SonyDesign/2003/Gummi/
    25. Platypus Amoeba
       • As users approach the Platypus Amoeba they see a strange glowing creature. A translucent white blob with two glowing eyes, the Platy looks like the helpless baby of an alien creature. Platypus Amoeba emits small noises and a faint glow until a hand is moved over the soft skin and Platy gives a pleasant coo. Petting Platy has made it happy. What will more petting do? Perhaps you excite Platy and are rewarded with a display of colors, or maybe it is displeased and truculent. Platy’s reactions give you information as you use petting to interface with Platy and understand how Platy wants to be petted.
       Source: Churi & Lin, 2003
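       A small sketch of the interaction described above, assuming a “mood” value that rises with stroking and decays when the creature is left alone. SkinSensor, Speaker and Glow are invented names and the thresholds are guesses, not Churi & Lin's implementation.

       // Hypothetical petting loop: stroking raises a mood value that picks the response.
       interface SkinSensor { double strokeIntensity(); }  // 0.0 = untouched .. 1.0 = firm stroke
       interface Speaker    { void coo(); }
       interface Glow       { void setColor(int rgb); }

       class PlatypusSketch {
           private double mood = 0.0;                      // assumed internal state, 0..1

           // Call a few times per second.
           void tick(SkinSensor skin, Speaker speaker, Glow glow) {
               mood = Math.min(1.0, mood * 0.95 + 0.2 * skin.strokeIntensity());
               if (mood > 0.7) {                           // excited: colours and a pleasant coo
                   glow.setColor(0x44FF88);
                   speaker.coo();
               } else if (mood > 0.2) {                    // content: steady glow
                   glow.setColor(0xFFFFFF);
               } else {                                    // left alone: faint glow and small noises
                   glow.setColor(0x222222);
               }
           }
       }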
    26. Nearby Embodiment
       • The output takes place near the input object
       • Early example in HCI: the light pen
       • Examples:
         - Bricks
         - Great Dome (metaDESK)
         - Photo Cube
    27. Environmental Embodiment
       • The output is around the user, often in audio
       • “Non-graspable” interfaces
       • Examples:
         - ToonTown
    28. Distant Embodiment
       • The output is “over there”
         - On another screen
         - In another room
         - …
       • Examples:
         - Doll’s Head
       Source: Hinckley, 1994
    29. Applications May Span Several Levels
       • Platypus Amoeba
         - Makes sounds → Environmental
         - Responds to petting → Full
       • PingPongPlus
         - Environmental and Nearby (Ishii et al., 1999)
    30. Metaphor
       • “figure of speech in which a word or phrase literally denoting one kind of object or idea is used in place of another to suggest a likeness or analogy between them” (Merriam-Webster)
         - “drown in money”
       • Can use a variety of attributes: shape, size, color, weight, smell, texture, …
       • Here: Is the system effect of a user action analogous to the real-world effect of similar actions?
       • Fishkin proposes two sub-dimensions:
         - Shape of object → “metaphor of noun”
         - Motion of object → “metaphor of verb”
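       Taken together with the embodiment scale from slide 22, the taxonomy is a pair of ordered values per interface, which can be written down directly (a convenience for this transcript, not something from Fishkin's paper). The two example ratings are the ones the slides themselves give: Shakepad is the verb-metaphor example whose output object is its input object, and ToonTown is the environmental, noun-and-verb example.

       // Fishkin's two dimensions as enums, plus the ratings the slides give for two examples.
       import java.util.Map;

       enum Embodiment { FULL, NEARBY, ENVIRONMENTAL, DISTANT }    // most to least embodied
       enum Metaphor   { NONE, NOUN, VERB, NOUN_AND_VERB, FULL }   // least to most metaphorical

       class TuiRating {
           final Embodiment embodiment;
           final Metaphor metaphor;
           TuiRating(Embodiment e, Metaphor m) { embodiment = e; metaphor = m; }

           // "The higher an interface is in both dimensions, the more tangible it is."
           int tangibilityScore() {
               int embodimentScore = Embodiment.values().length - 1 - embodiment.ordinal();
               return embodimentScore + metaphor.ordinal();
           }
       }

       class Examples {
           static final Map<String, TuiRating> RATINGS = Map.of(
               "Shakepad", new TuiRating(Embodiment.FULL, Metaphor.VERB),
               "ToonTown", new TuiRating(Embodiment.ENVIRONMENTAL, Metaphor.NOUN_AND_VERB));
       }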
    31. No Metaphor
       • Analogy: command-line interface
       • Examples:
         - The BitBall
         - Programmable Beads
       Source: http://web.media.mit.edu/~mres/papers/chi-98/digital-manip.html
    32. eTextiles
       Source: http://www.cs.colorado.edu/~buechley/projects/LED_clothing/tank.html
    33. Noun Metaphor
       • Object looks like the real thing
       • Actions are at most weakly related to real-world actions
       • Examples:
         - Navigational blocks
         - Tagged objects
       Source for blocks: http://dmg.caup.washington.edu/xmlSiteEngine/browsers/static/project3.html
    34. Tagged Objects
       Source: Want et al., 1999
    35. Verb Metaphor
       • Object acts like the real thing
       • Shape of object irrelevant
       • Example:
         - Shakepad
    36. Noun and Verb Metaphor
       • Object looks and acts like the real thing – but they are still different
       • Analogy: desktop metaphor (to some extent)
       • Examples:
         - ToonTown
         - Urp
       Source for Urp: http://tangible.media.mit.edu/projects/urbansim/
    37. “Full” Metaphor
       • “Really Direct Manipulation”
       • Example: Illuminating Clay
       Source: http://tangible.media.mit.edu/projects/IlluminatingClay/IlluminatingClay.htm
    38. Applications May Span Several Levels
       • Example: Barney
         - Peek-A-Boo: full metaphor
         - Singing: noun metaphor
    39. Let’s Try It!
       • [Exercise grid: 14 numbered cells arranged along the Metaphor levels (None, Noun, Verb, Noun and verb, Full) and the Embodiment levels (Full, Nearby, Environmental, Distant); fill in example systems by name.]
