
The Glass Class Lecture 7: Future Research


Lecture 7 in the Glass Class course. Presented on February 21st 2014 by Mark Billinghurst. This lecture discusses directions for future research using Google Glass.



  1. The Glass Class: Lecture 7 – Future Research. Feb 17th – 21st 2014. Mark Billinghurst, Gun Lee. HIT Lab NZ, University of Canterbury
  2. THE GLASS CLASS
  3. THE GLASS CLASS “The best way to predict the future is to invent it.” – Alan Kay, Computer Scientist (1940– )
  4. THE GLASS CLASS Directions for Research   New devices   Input methods   User experience   Scaling up   Social Consequences
  5. New Devices
  6. THE GLASS CLASS Kopin Pupil   Eyeglass display   428x240 resolution   Voice interactivity
  7. THE GLASS CLASS GlassUp - http://www.glassup.net/   Glasses form factor – 320x240 pixel resolution   Secondary mobile display
  8. THE GLASS CLASS Telepathy One - http://tele-pathy.org/   Minimal display
  9. THE GLASS CLASS
  10. THE GLASS CLASS Samsung Galaxy Gear   Watch-based wearable
  11. THE GLASS CLASS Samsung Galaxy Gear
  12. THE GLASS CLASS Nike Fuelband   Activity/sleep tracking
  13. THE GLASS CLASS Device Ecosystem
  14. THE GLASS CLASS Wearable Attributes
  15. Input Techniques/User Experience
  16. THE GLASS CLASS The Vision of AR
  17. THE GLASS CLASS To Make the Vision Real…   Hardware/software requirements:  Intelligent systems  Contact lens displays  Free-space hand/body tracking  Speech/gesture recognition  etc.   Most importantly:  Usability
  18. THE GLASS CLASS Environment Sensing   Create a virtual mesh over the real world   Update at 10 fps – real objects can be moved   Used by the physics engine for collision detection (virtual/real)   Used by OpenSceneGraph for occlusion and shadows
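To make the environment-sensing pipeline on this slide concrete, here is a minimal Python sketch of the loop it implies: rebuild a world mesh from depth data at roughly 10 fps, then hand it to physics (collision) and rendering (occlusion/shadows) stages. The sensor, physics, and rendering functions below are stand-in stubs, not the actual HIT Lab NZ or OpenSceneGraph code.

```python
# Minimal sketch (not the lecture's actual pipeline): a ~10 fps loop that
# rebuilds a world mesh from depth data and hands it to hypothetical
# physics and rendering stages.
import time
from dataclasses import dataclass, field

@dataclass
class WorldMesh:
    vertices: list = field(default_factory=list)   # reconstructed surface points
    timestamp: float = 0.0

def capture_depth_frame():
    # Stand-in for a depth sensor read (e.g. a Kinect-style device).
    return [(0.0, 0.0, 1.2), (0.1, 0.0, 1.3)]      # fake 3D points

def rebuild_mesh(points):
    # Stand-in for surface reconstruction over the captured points.
    return WorldMesh(vertices=points, timestamp=time.time())

def update_physics_collider(mesh):
    # A physics engine would use the mesh for virtual/real collision tests.
    print(f"physics: collider updated with {len(mesh.vertices)} vertices")

def update_occlusion_geometry(mesh):
    # A renderer (e.g. a scene graph) would use the mesh for occlusion/shadows.
    print(f"render: occlusion mesh updated at t={mesh.timestamp:.2f}")

def run(frames=3, fps=10):
    for _ in range(frames):
        mesh = rebuild_mesh(capture_depth_frame())
        update_physics_collider(mesh)
        update_occlusion_geometry(mesh)
        time.sleep(1.0 / fps)   # ~10 fps, so moved real objects are picked up

if __name__ == "__main__":
    run()
```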
  19. THE GLASS CLASS Natural Hand Interaction   Using bare hands to interact with AR content   MS Kinect depth sensing   Real-time hand tracking   Physics-based simulation model
  20. THE GLASS CLASS Meta Gesture Interaction   Depth sensor + stereo see-through
  21. THE GLASS CLASS Meta Video
  22. THE GLASS CLASS Gesture-Based Interaction   3Gear Systems   Kinect/PrimeSense sensor   Two-hand tracking   http://www.threegear.com
  23. THE GLASS CLASS Gesture Interaction + AR   HMD AR view   Viewpoint tracking   Two-hand input   Skeleton interaction, occlusion
  24. THE GLASS CLASS Multimodal Interaction   Combined speech and gesture input   Free-hand gesture tracking   Semantic fusion engine (speech + gesture input history)
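As an illustration of the "semantic fusion engine" bullet above, here is a minimal sketch that pairs a speech command with the gesture closest to it in time. The event types and the 1.5-second window are assumptions made for the example, not details of the lecture's system.

```python
# Minimal sketch of time-window fusion for speech + gesture, assuming both
# recognisers emit timestamped events.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SpeechEvent:
    text: str          # e.g. "make that red"
    time: float

@dataclass
class GestureEvent:
    target_id: str     # object currently pointed at
    time: float

def fuse(speech: SpeechEvent, gestures: list, window: float = 1.5) -> Optional[dict]:
    """Pair a speech command with the gesture closest in time, if any."""
    candidates = [g for g in gestures if abs(g.time - speech.time) <= window]
    if not candidates:
        return None    # no gesture context: fall back to speech-only handling
    nearest = min(candidates, key=lambda g: abs(g.time - speech.time))
    return {"command": speech.text, "object": nearest.target_id}

# Usage: "make that red" spoken at t=10.2 while pointing at cube_3 at t=10.0
print(fuse(SpeechEvent("make that red", 10.2),
           [GestureEvent("cube_3", 10.0), GestureEvent("sphere_1", 14.0)]))
```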
  25. THE GLASS CLASS User Evaluation   Change object shape, colour and position   Results   MMI significantly faster (11.8s) than gesture alone (12.4s)   70% of users preferred MMI (vs. 25% speech only)   Billinghurst, M., & Lee, M. (2012). Multimodal Interfaces for Augmented Reality. In Expanding the Frontiers of Visual Analytics and Visualization (pp. 449-465). Springer London.
  26. THE GLASS CLASS Contact Lens Display   Babak Parviz   University of Washington   MEMS components   Transparent elements   Micro-sensors   Challenges   Miniaturization   Assembly   Eye safety
  27. THE GLASS CLASS Contact Lens Prototype
  28. THE GLASS CLASS Intelligent Feedback   Actively monitors user behaviour   Implicit vs. explicit interaction   Provides corrective feedback
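A small sketch of the implicit vs. explicit distinction on this slide, under assumed thresholds and messages: the implicit channel watches behaviour and only intervenes when a (hypothetical) limit is crossed, while explicit requests always get a response.

```python
# Illustrative only: thresholds, signals, and messages are assumptions,
# not the lecture's system.
from typing import Optional

def implicit_feedback(head_tilt_deg: float, threshold: float = 25.0) -> Optional[str]:
    # Implicit interaction: the system observes behaviour without being asked.
    if abs(head_tilt_deg) > threshold:
        return "Try levelling your head to keep the display readable."
    return None  # stay quiet while behaviour looks fine

def explicit_feedback(question: str) -> str:
    # Explicit interaction: the user directly asks for guidance.
    return f"Answering request: {question}"

for tilt in (5.0, 32.0):
    message = implicit_feedback(tilt)
    print(message or f"tilt {tilt} deg: no correction needed")
print(explicit_feedback("How do I frame this photo?"))
```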
  29. Scaling Up
  30. THE GLASS CLASS Ego-Vision Collaboration   Google Glass   Camera + processing + display + connectivity
  31. THE GLASS CLASS Ego-Vision Research   System   How do you capture the user's environment?   How do you provide good quality of service?   Interface   What visual and audio cues provide the best experience?   How do you interact with the remote user?   Evaluation   How do you measure the quality of collaboration?
  32. THE GLASS CLASS AR + Human Computation   Human Computation   Real people solving problems that are difficult for computers   Web-based, non-real-time   Little work on AR + HC   AR attributes   Shared point of view   Real-world overlay   Location sensing   “What does this say?”
  33. THE GLASS CLASS Human Computation Architecture   Add an AR front end to a typical HC platform
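To show what an AR front end on a generic human-computation platform might look like, here is a minimal sketch: the Glass-side client posts its shared point of view as a task, a human worker answers it, and the answer is overlaid back in the AR view. The queue-based platform, function names, and file name are illustrative stand-ins, not a real crowdsourcing API.

```python
# Minimal sketch of an AR client talking to a stand-in human-computation
# back end; everything here is illustrative.
import queue
import uuid

task_queue: "queue.Queue[dict]" = queue.Queue()   # tasks awaiting human workers
answers: dict = {}                                # task_id -> crowd answer

def post_task(image_ref: str, question: str) -> str:
    """AR client: package the shared point of view as a task."""
    task_id = str(uuid.uuid4())
    task_queue.put({"id": task_id, "image": image_ref, "question": question})
    return task_id

def worker_step():
    """Platform side: a human worker answers one pending task."""
    task = task_queue.get()
    answers[task["id"]] = f"transcription of {task['image']}"

def overlay_answer(task_id: str):
    """AR client: draw the answer over the live view once it arrives."""
    if task_id in answers:
        print(f"AR overlay: {answers[task_id]}")

tid = post_task("frame_0042.jpg", "What does this sign say?")
worker_step()
overlay_answer(tid)
```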
  34. THE GLASS CLASS AR + HC Research Questions   System   What architecture provides the best performance?   What data needs to be shared?   Interface   What cues are needed by the human computers?   What benefits does AR provide compared to web-based systems?   Evaluation   How can the system be evaluated?
  35. THE GLASS CLASS Scaling Up   Seeing the actions of millions of users in the world   Augmentation at the city/country level
  36. THE GLASS CLASS AR + Smart Sensors + Social Networks   Track population at city scale (mobile networks)   Match population data to external sensor data   Medical, environmental, etc.   Mine the data to improve social services
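As a toy illustration of matching population data to external sensor data, the sketch below normalises a made-up medical feed by mobile-network population counts per district; all names and numbers are invented for the example.

```python
# Illustrative only: districts, counts, and the simple per-capita rate are
# assumptions, not data from the lecture.
population = {"district_a": 120_000, "district_b": 45_000}   # active handsets
clinic_visits = {"district_a": 950, "district_b": 620}       # external (medical) feed

def visits_per_10k(pop: dict, visits: dict) -> dict:
    """Normalise the sensor data by population so districts are comparable."""
    return {d: 10_000 * visits[d] / pop[d] for d in pop if d in visits}

rates = visits_per_10k(population, clinic_visits)
for district, rate in sorted(rates.items(), key=lambda kv: -kv[1]):
    print(f"{district}: {rate:.0f} clinic visits per 10,000 people")
# A planner might flag the highest-rate district for extra services.
```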
  37. THE GLASS CLASS
  38. THE GLASS CLASS
  39. THE GLASS CLASS Orange Data for Development   Orange made 2.5 billion phone records available   5 months of calls from Ivory Coast   > 80 sample projects using the data   e.g. monitoring human mobility for disease modeling
  40. THE GLASS CLASS Research Questions   System   How can you capture the data reliably?   How can you aggregate and correlate the information?   Interface   What data provides the most value?   How can you visualize the information?   Evaluation   How do you measure the accuracy of the model?
  41. Social Consequences
  42. THE GLASS CLASS The Future of Wearables
  43. THE GLASS CLASS Sight Video Demo
  44. THE GLASS CLASS More Information   Mark Billinghurst   Email: mark.billinghurst@hitlabnz.org   Twitter: @marknb00   HIT Lab NZ   http://www.hitlabnz.org/
