
Components Printed Onto Textiles



1. Wearable Computing Research at The University of Birmingham
   Dr. Chris Baber, EECE, Human Interface Technologies
2. Why wear a computer?
   • Mobility
     – Access to communication / processing / information / data / people while mobile
     – Use while hands / eyes busy
   • Attachment
     – Computer worn on the body
   • Attention
     – Maintain attention on the primary task
   • Aware / Adaptive
     – Respond to changes in environment / context
     – Modify information to suit the person
   • (Ultra) Personal Computing
     – Adapts to the wearer
3. Four primary approaches
   • High-specification processor
     – PC104, Pentium-grade processors
     – Multiple sensors
     – Context-aware
     – Worn in bags, rucksacks, etc.
   • "Boxes in Pockets"
     – Lower-specification processors
     – Limited sensing
     – Integrated into jackets, belts, vests, etc.
   • "Smart Textiles"
     – Components printed onto textiles
     – Conductive threads woven or knitted into fabric
     – Components created from weaving
   • "Embedded Processors"
     – Miniature components in clothing or in the person
     – Wireless transmission to external processors
   [Images: University of Wollongong; Philips / Levi-Strauss]
4. Wearing computing devices
   • Carriage
     – Person-as-mule
     – Person-as-transducer
     – Person-as-audience
     – Person-as-intelligence
   • Interaction
     – Usable controls
     – Readable displays
     – Understandable responses
5. Person-as-mule
   • The device represents a load on the body
   • Some regions are better sites than others for mounting devices on the body:
     – Range of mobility
     – Surface area
     – Attachment
   [Gemperle et al., 1998, ISWC]
6. Ergonomics
   • Posture
     – Affected by load carriage and distribution
     – Affected by interaction
   • Movement
     – Affected by load
     – Smaller is not always better
   • Comfort
     – Social factors as well as physical
7. Interaction
   • Connection established
     – Presence of connection
     – Strength of connection
   • Action performed
     – Command sent / received
     – Response made
   • Ergonomics
     – Posture
   [Image: Seiko Bluetooth watch]
8. Person-as-transducer
   • The person's 'activity' leads to a change in device state:
     – Physiology
     – Movement & location
     – Sequences of actions
       · Hidden Markov models to recognise activity
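The slide's mention of hidden Markov models for activity recognition can be sketched as Viterbi decoding over a tiny hand-made model. All states, observations and probabilities below are invented for illustration; a real wearable system would learn these parameters from sensor data.

```python
# Hypothetical HMM sketch: infer activities ("walking"/"sitting") from
# coarse accelerometer readings. All probabilities are assumed values.

STATES = ["walking", "sitting"]

start = {"walking": 0.5, "sitting": 0.5}
trans = {
    "walking": {"walking": 0.8, "sitting": 0.2},
    "sitting": {"walking": 0.3, "sitting": 0.7},
}
emit = {
    "walking": {"high_accel": 0.9, "low_accel": 0.1},
    "sitting": {"high_accel": 0.2, "low_accel": 0.8},
}

def viterbi(observations):
    """Return the most likely state sequence for a list of observations."""
    # v[t][s] = (probability of best path ending in s at time t, backpointer)
    v = [{s: (start[s] * emit[s][observations[0]], None) for s in STATES}]
    for t in range(1, len(observations)):
        v.append({})
        for s in STATES:
            prob, prev = max(
                (v[t - 1][p][0] * trans[p][s] * emit[s][observations[t]], p)
                for p in STATES
            )
            v[t][s] = (prob, prev)
    # backtrack from the best final state
    state = max(v[-1], key=lambda s: v[-1][s][0])
    path = [state]
    for t in range(len(observations) - 1, 0, -1):
        state = v[t][state][1]
        path.append(state)
    return list(reversed(path))

print(viterbi(["high_accel", "high_accel", "low_accel", "low_accel"]))
# → ['walking', 'walking', 'sitting', 'sitting']
```

The transition probabilities encode the assumption that activities persist over time, which is what lets the model smooth over noisy individual readings.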
9. Interaction
   • Action performed by person
   • Action sensed by device
   • Feedback provided to person
   • Managing concurrent activity
   [Image: Enlightened Designs]
10. Person-as-audience
    • Delivery of media
    • Tailoring of media
    • Sharing of media
      – File / data exchange
      – Agent-based exchange
    • Manipulation of media
      – User interaction
    [Images: Sony Ericsson W710 Walkman phone; Microsoft Zune]
11. Interaction
    • Manage media
      – Selection
      – Quality
      – Recommender systems
      – Context-awareness
      – Modify by user action
12. Person-as-actor
    • Action in the physical world creates a digital record
    • Annotation of photographs by user activity
    • What you point at is what you record
13. Interaction
    • Connection established
      – Presence of connection
    • Action performed
      – Response made
      – Message sent
    • Goal achieved
      – Outcome achieved
    [Images: Karlsruhe 'MediaCup'; NTT DoCoMo FOMA phone]
14. Forms of engagement: environmental, morphological, motor, perceptual, cognitive, cultural
15. Forms of engagement in buying a rail pass:

    Buying a rail pass     Form of Engagement
    Payment surface        Environmental
    Grasp phone            Morphological
    Act on surface         Motor
    Confirm purchase       Perceptual
    Purchase by action     Cognitive
    Paying for travel      Cultural
16. • Activity in the physical domain becomes embedded in the digital domain
    • The activity of entering the platform merges with the act of purchasing a ticket
    • The 'seamlessness' comes not from invisibility but from the complete visibility of this merging (cf. Matthew Chalmers's notion of 'seamfulness')
17. From Real vs. Virtual to Physical vs. Digital
    • DigitalDesk [Wellner]
    • GeoSpace [Ishii]
    • DataTiles [Rekimoto]
18. SmartScope
    • Plotting of objects of interest
    • Navigation, identification & recording
    • Mapping and status display at the control room
    • Navigation: trials show users can follow trails of targets much faster than with paper maps
    [Slide labels: Early work — able to show relative position of objects; investigating other cues]
19. Human activity as 'tuning' of ubiquitous computing
    • Perceiving and responding to 'affordance'
      – Environmental engagement
    • Modifying behaviour to compensate for objects
      – Morphological and motor engagement
    • Interpreting and gauging appropriateness of response
      – Perceptual engagement
    • Predicting consequences of action and response
      – Cognitive engagement
    • Social acceptance of behaviours
      – Cultural engagement
20. Minimal interaction
    Using a person's location, movement and posture to define the information that can be pushed to the wearer...
    ...or information that can be pulled from the wearer...
    ...with the additional ability to comment on these data.
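The push side of this idea can be sketched as a rule table keyed on context. The contexts, messages, and the rule-table design below are my own illustration, not the system described in the talk.

```python
# Minimal sketch, assuming context is reduced to (location, posture)
# pairs: information is pushed to the wearer only when a rule matches,
# so no explicit interaction is required. Rules are invented examples.
RULES = {
    ("platform", "standing"): "next train in 4 minutes",
    ("office", "sitting"): "meeting at 10:00",
}

def push_for(location, posture):
    """Return information to push for the given context, or None."""
    return RULES.get((location, posture))

print(push_for("platform", "standing"))  # → next train in 4 minutes
print(push_for("kitchen", "standing"))   # → None (no rule fires)
```

A real system would replace the exact-match lookup with sensed, probabilistic context (e.g. the activity recognition above), but the contract is the same: context in, pushed information out.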
21. Tuning through physical activity
    • Merging activity and digitisation
    • Marking points in space to map bodies
    • Photography as reporting
22. Crime scene investigation
    • Support annotation of images & sketching
    • Reduce wiring & auto-detect devices
    • Support different platforms
23. User interface for tablet and wearable platforms
    • Commands on 'buttons'
    • Select by tapping or by speaking the button's name
    • Large area for photograph and annotation
    • Automatic generation and transmission of crime scene reports with photographs and evidence identification
24. Portable encoding of data
    • Standard XML format
    • JPEG images / .wav files encoded in base-64
    • e.g. 1 image and 10 s of audio come to around 200 KB
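The encoding described on the slide can be sketched as embedding binary media as base-64 text inside an XML record. The element names and the sample payload below are invented for illustration; the talk's actual schema is not given.

```python
# Hypothetical sketch of a portable XML record with a base-64 image.
# Base-64 turns binary into ASCII-safe text at ~4/3 size growth, which
# is consistent with one image plus 10 s of audio fitting in ~200 KB.
import base64
import xml.etree.ElementTree as ET

def encode_record(image_bytes, note):
    """Build an XML record embedding the image as base-64 text."""
    record = ET.Element("record")
    ET.SubElement(record, "note").text = note
    ET.SubElement(record, "image", encoding="base64").text = \
        base64.b64encode(image_bytes).decode("ascii")
    return ET.tostring(record, encoding="unicode")

def decode_record(xml_text):
    """Recover the original image bytes from the XML record."""
    record = ET.fromstring(xml_text)
    return base64.b64decode(record.find("image").text)

payload = b"\xff\xd8\xff\xe0fake-jpeg-bytes"  # placeholder, not a real image
xml_text = encode_record(payload, "evidence item 3")
assert decode_record(xml_text) == payload     # round-trip succeeds
```

Keeping everything as text is what makes the record portable: it can be transmitted, logged, or diffed by any platform that can read XML, at the cost of the base-64 size overhead.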
26. Collaborative annotation
    • Capture is slower with paper
    • Transmission is slower with paper
    • No difference in time to annotate
    • Perceived quality is higher with paper
    • Perceived workload is lower with the computer
27. Wearable Computing Research at The University of Birmingham
    Dr. Chris Baber [email_address]