Google Glass and the Future of Wearable Computing



Google will release a wearable heads-up display this fall, and it may help usher in a new era of augmented reality and wearable computing. What does this mean for us as designers and developers? How do we build for the next generation of computers? Who came before us, and what can we learn from them?

From its birthplace at MIT and Xerox PARC, the field of wearable computing has focused on augmenting the human ability to compute freely. As wearable computing pioneer Steve Mann and calm technology pioneer Mark Weiser put it, the goal is “to free the human to not act as a machine”. Mann didn’t like the idea of crouching over a desktop computer. He felt instead that the computer should contort naturally to the human, so he began his own wearable computing mission.

This talk will focus on trends in wearable computing from the 1970s to the 2010s. I’ll cover various HUDs (heads-up displays), new tech from Motorola and Google, various invasive and non-invasive tech, and how mobile interfaces should take advantage of location, proximity and haptics to improve our lives instead of getting in the way. These are the machines that will be part of our lives only a few years from now, and the best way to learn about the future is to dig into the past.

Speech given at OSBridge 2012 by Amber Case.



  1. Future of Wearable Computing: Constraint, Context and Location. OSBridge 2012, Amber Case (@caseorganic)
  2. Flickr: cybertoad
  3. Flickr: soylentgreen23
  4. Univac I (~1950s) vs. iPhone (~2000s)
  5. I. Problems
  6. Persistent Paleontology
  7. Navigation
  8. II. History
  9. Self-Portrait of Steve Mann with Wearable Computing Apparatus, 1981.
  10. Diminished Reality vs. Augmented Reality •  Adding your own layer onto reality •  Replacing public messages with your own
  11. Adblock: Image recognition, processing and replacement.
  12. Collaborative Reality
  13. Remember the Milk, Contextual Notification Systems, Virtual Post-It Notes with Image Processing (1995) @caseorganic
  14. Computer-Mediated Reality: Face Recognition and History
  15. Sousveillance: MaybeCam
  16. Evolution of Prosthesis
  17. Present-Day Steve Mann •  Extremely lightweight equipment •  Most people have this in their pocket
  18. Construction
  20. Present-Day Steve Mann •  Extremely lightweight equipment •  Most people have this in their pocket
  22. III. Persistent Architecture
  23. Engelbart’s Cyborg Glove
  25. Twiddler by HandyKey Corporation: One-Handed Chording USB Keyboard @caseorganic
  26. Inputs
  27. Thad Starner
  28. Borg Group – MIT Media Lab
  29. Sandy Pentland
  30. GA Tech
  31. IV: Next?
  32. V. Non-Visual Wearable Computing
  33. Haptic Compass Belt
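A haptic compass belt rings the waist with vibration motors and buzzes the one facing north, turning direction into a constant ambient sense. A minimal sketch of the heading-to-motor mapping, assuming an 8-motor belt with motor 0 at the wearer's front and indices increasing clockwise (the motor count and layout are illustrative, not any real device's):

```python
def motor_for_heading(heading_deg, n_motors=8):
    """Return the index of the belt motor currently facing north.

    heading_deg: compass heading the wearer faces (0 = north, 90 = east).
    If the wearer faces `heading_deg`, north sits at (360 - heading)
    degrees clockwise from their front, so we snap that angle to the
    nearest motor position.
    """
    relative = (360 - heading_deg) % 360
    return round(relative / (360 / n_motors)) % n_motors
```

Facing north buzzes the front motor (index 0); facing east puts north at the wearer's left, so motor 6 fires.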
  34. Heat Sink
  35. EEG
  36. Mann’s EEG Orchestra
  37. Mann’s EEG Orchestra
  38. Future
  39. IV. The Future
  40. context
  41. Your phone is a remote control for reality.
  42. Calm Technology •  Actions as buttons •  Invisible interfaces •  Trigger-based interactions
  43. The Invisible Interface: 80m, 900m
  44. A Collection of Invisible Buttons
  45. Types of Interactions •  Entering •  Exiting •  Dwelling at a place
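An "invisible button" is just a geofenced region that triggers on the three interaction types above: entering, exiting, and dwelling. A minimal sketch, assuming a circular region and periodic location fixes; the class and method names are illustrative, not from any real location SDK:

```python
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

class InvisibleButton:
    """A circular geofence that reports enter/exit/dwell transitions."""

    def __init__(self, lat, lon, radius_m, dwell_s=30):
        self.lat, self.lon, self.radius_m = lat, lon, radius_m
        self.dwell_s = dwell_s          # seconds inside before "dwell" fires
        self.inside = False
        self.entered_at = None
        self.dwell_fired = False

    def update(self, lat, lon, t):
        """Feed a location fix at time t; return 'enter', 'exit', 'dwell', or None."""
        inside = haversine_m(lat, lon, self.lat, self.lon) <= self.radius_m
        if inside and not self.inside:
            self.inside, self.entered_at, self.dwell_fired = True, t, False
            return "enter"
        if not inside and self.inside:
            self.inside = False
            return "exit"
        if inside and not self.dwell_fired and t - self.entered_at >= self.dwell_s:
            self.dwell_fired = True
            return "dwell"
        return None
```

Attach an action to each transition and the interface disappears: walking into the region is the button press, no screen or query required.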
  46. Proximal Notification
  47. Ambient Notification
  48. Location-Based AR (Spotmetrix)
  49. Geonotes
  50. Real-Life Gaming
  52. Real-Time Hyperlocal Weather
  53. Bringing Wikipedia to Life
  54. Monitor significant events, not all events. “Dave, who is normally in New York, is in town. Both of your schedules are free at noon.”
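Notifying only on significant events means comparing each observation against a learned baseline and staying quiet when it matches the routine. A minimal sketch of that filter, assuming a simple frequency-based notion of "usual" (the class name, threshold, and data are illustrative):

```python
from collections import Counter

class SignificanceFilter:
    """Surface an observation only when it breaks from the usual pattern."""

    def __init__(self, min_share=0.8):
        self.min_share = min_share   # how dominant a value must be to count as "usual"
        self.history = Counter()

    def observe(self, value):
        """Record a routine observation, e.g. which city a contact is in."""
        self.history[value] += 1

    def is_significant(self, value):
        """True if `value` differs from a clearly dominant baseline."""
        total = sum(self.history.values())
        if total == 0:
            return False             # no baseline yet, so stay quiet
        usual, count = self.history.most_common(1)[0]
        return value != usual and count / total >= self.min_share

f = SignificanceFilter()
for _ in range(9):
    f.observe("New York")            # Dave is normally in New York
f.observe("Portland")                # today he shows up in Portland
```

Dave appearing in Portland breaks the pattern and warrants a notification; yet another day in New York does not.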
  55. The Interface Disappears •  Actions are Reduced •  Queries are Eliminated
  56. Thank you. Amber Case, @caseorganic, case@caseorganic.com. Slides: