Siggraph 2014: The Glass Class - Designing Wearable Interfaces

Course on Designing Wearable Interfaces taught by Mark Billinghurst at Siggraph 2014. Presented on August 10th, 2014, from 10:45am - 12:15pm. The course focuses mainly on design guidelines and tools for rapid prototyping for Google Glass.


  1. 1. The Glass Class: Designing Wearable Interfaces Mark Billinghurst The HIT Lab NZ, University of Canterbury The 41st International Conference and Exhibition on Computer Graphics and Interactive Techniques
  2. 2. INTRODUCTION
  3. 3. Mark Billinghurst ▪  Director of The HIT Lab NZ, University of Canterbury ▪  PhD Univ. Washington ▪  Research on AR, mobile HCI, Collaborative Interfaces, Wearables ▪  Joined Glass team at Google [x] in 2013
  4. 4. How do you Design for this?
  5. 5. Course Goals In this course you will learn ▪  Introduction to head mounted wearable computers ▪  Understanding of current wearable technology ▪  Key design principles/interface metaphors ▪  Rapid prototyping tools ▪  Areas for future research
  6. 6. What You Won’t Learn ▪  Which companies/universities are in this space ▪  see the Siggraph exhibit floor ▪  Designing for non-HMD based interfaces ▪  Watches, fitness bands, etc ▪  How to develop wearable hardware ▪  optics, sensor assembly, etc ▪  Evaluation methods ▪  Experimental design, statistics, etc
  7. 7. Schedule •  10:45 am Introduction •  10:55 am Technology Overview •  11:05 am Design Guidelines •  11:25 am Prototyping Tools •  11:55 am Example Applications •  12:05 pm Research Directions/Resources
  8. 8. A Brief History of Computing Trend ▪  Smaller, cheaper, faster, more intimate ▪  Moving from fixed to handheld and onto body 1950’s 1980’s 1990’s
  9. 9. Wearable Computing ▪  Computer on the body that is: ▪  Always on ▪  Always accessible ▪  Always connected ▪  Other attributes ▪  Augmenting user actions ▪  Aware of user and surroundings
  10. 10. Desk Lap Hand Head
  11. 11. The Ideal Wearable ▪  Persists and Provides Constant Access: Designed for everyday and continuous use over a lifetime. ▪  Senses and Models Context: Observes and models the user’s environment, mental state, and its own state. ▪  Augments and Mediates: Information support for the user in both the physical and virtual realities. ▪  Interacts Seamlessly: Adapts its input and output modalities to those most appropriate at the time. Starner, T. E. (1999). Wearable computing and contextual awareness (Doctoral dissertation, Massachusetts Institute of Technology).
  12. 12. History of Wearables ▪  1960-90: Early Exploration ▪  Gamblers and custom-built devices ▪  1990 - 2000: Academic, Military Research ▪  MIT, CMU, Georgia Tech, EPFL, etc ▪  1997: ISWC conference starts ▪  1995 – 2005+: First Commercial Uses ▪  Niche industry applications, Military ▪  2010 - : Second Wave of Wearables
  13. 13. Origins - The Gamblers •  Thorp and Shannon (1961) –  Wearable timing device for roulette prediction •  Keith Taft (1972) –  Wearable computer for blackjack card counting (belt computer, shoe input, glasses display)
  14. 14. Steve Mann (1980s - ) http://wearcomp.org/
  15. 15. Thad Starner (1993 - )
  16. 16. MIT Wearable Computing (1993-) http://www.media.mit.edu/wearables/
  17. 17. CMU Wearables (1991–2000) ▪  Industry focused wearables ▪  Maintenance, repair ▪  Custom designed interface ▪  Dial/button input ▪  Rapid prototyping approach ▪  Industrial designed, ergonomic http://www.cs.cmu.edu/afs/cs/project/vuman/www/frontpage.html
  18. 18. Prototype Applications ▪  Remembrance Agent ▪  Rhodes (97) ▪  Augmented Reality ▪  Feiner (97), Thomas (98) ▪  Remote Collaboration ▪  Garner (97), Kraut (96) ■  Maintenance ■  Feiner (93), Caudell (92)
  19. 19. Mobile AR: Touring Machine (1997) ▪  Columbia University ▪  Feiner, MacIntyre, Höllerer, Webster ▪  Combined ▪  See through head mounted display ▪  GPS tracking, Orientation sensor ▪  Backpack PC (custom) ▪  Tablet input Feiner, S., MacIntyre, B., Höllerer, T., & Webster, A. (1997). A touring machine: Prototyping 3D mobile augmented reality systems for exploring the urban environment. Personal Technologies, 1(4), 208-217.
  20. 20. Touring Machine View ▪  Virtual tags overlaid on the real world ▪  “Information in place”
  21. 21. Early Commercial Systems ▪  Xybernaut (1996 - 2007) ▪  Belt worn, HMD, 200 MHz ▪  ViA (1996 – 2001) ▪  Belt worn, Audio Interface ▪  700 MHz Crusoe ■  Symbol (1998 – 2006) ■  Wrist worn computer ■  Finger scanner
  22. 22. Google Glass (2011 - )
  23. 23. The Second Wave of Wearables ▪  Vuzix M-100 ▪  $999, professional ▪  Recon Jet ▪  $600, more sensors, sports ▪  Optinvent ▪  500 Euro, multi-view mode ▪  Motorola Golden-i ▪  Rugged, remote assistance
  24. 24. Projected Market
  25. 25. Summary ▪  Wearables are a new class of computing ▪  Intimate, persistent, aware, accessible ▪  Evolution over a 50-year history ▪  Backpack to head worn ▪  Custom developed to consumer ready device ▪  Enables new applications ▪  Collaboration, memory, AR, industry, etc ▪  Many head worn wearables are coming
  26. 26. TECHNOLOGY
  27. 27. (image slide)
  28. 28. Enabling Technologies (1989-) ▪  Private Eye Display (Reflection Technologies) ▪  720 x 280 display ▪  Vibrating mirror ▪  Twiddler (Handykey) ▪  Chording keypad ▪  Mouse emulation
  29. 29. Tin Lizzy (Platt, Starner, 1993) ▪  General Purpose Wearable ▪  150 MHz Pentium CPU ▪  32-64 MB RAM, 6 GB HDD ▪  VGA display ▪  2 PCMCIA slots ▪  Cellular modem http://www.media.mit.edu/wearables/lizzy/lizzy/index.html
  30. 30. (image slide)
  31. 31. Google Glass Specs ▪  Hardware ▪  CPU TI OMAP 4430 – 1 GHz ▪  16 GB SanDisk Flash, 2 GB RAM ▪  Input ▪  5 MP camera, 720p recording, microphone ▪  InvenSense MPU-9150 inertial sensor ▪  Output ▪  Bone conducting speaker ▪  640x360 micro-projector display
  32. 32. Glass Display
  33. 33. View Through Google Glass ▪  Always available peripheral information display ▪  Combining computing, communications and content capture
  34. 34. Google Glass Demo
  35. 35. Google Glass User Interface
  36. 36. Timeline Metaphor
  37. 37. User Experience •  Truly Wearable Computing –  About 42 grams •  Hands-free Information Access –  Voice interaction, Ego-vision camera •  Intuitive User Interface –  Touch, Gesture, Speech, Head Motion •  Access to all Google Services –  Map, Search, Location, Messaging, Email, etc
  38. 38. Types of Head Mounted Displays Occluded See-thru Multiplexed
  39. 39. Multiplexed Displays ▪  Above or below line of sight ▪  Strengths ▪  User has unobstructed view of real world ▪  Simple optics/cheap ▪  Weaknesses ▪  Direct information overlay difficult ▪  Display/camera offset from eyeline ▪  Wide FOV difficult
  40. 40. Vuzix M-100 ▪  Monocular multiplexed display ($1000) ▪  852 x 480 LCD display, 15 deg. FOV ▪  5 MP camera, HD video ▪  GPS, gyro, accelerometer
  41. 41. Optical see-through HMD (virtual images from monitors merged with the real world through optical combiners)
  42. 42. Epson Moverio BT-200 ▪  Stereo see-through display ($700) ▪  960 x 540 pixels, 23 degree FOV, 60Hz, 88g ▪  Android Powered, separate controller ▪  VGA camera, GPS, gyro, accelerometer
  43. 43. Strengths of optical see-through ▪  Simpler (cheaper) ▪  Direct view of real world ▪  Full resolution, no time delay (for real world) ▪  Safety ▪  Lower distortion ▪  No eye displacement ▪  see directly through display
  44. 44. Video see-through HMD (video cameras capture the real world; graphics are combined with the video and shown on the monitors)
  45. 45. Vuzix Wrap 1200DXAR ▪  Stereo video see-through display ($1500) ■ Twin 852 x 480 LCD displays, 35 deg. FOV ■ Stereo VGA cameras ■ 3 DOF head tracking
  46. 46. Strengths of Video See-Through ▪  True occlusion ▪  Block image of real world ▪  Digitized image of real world ▪  Flexibility in composition, match time delays ▪  More registration, calibration strategies ▪  Wide FOV is easier to support ▪  wide FOV camera
  47. 47. Input Options ▪  Physical Devices ▪  Keyboard, Pointer, Stylus ▪  Natural Input ▪  Speech, Gesture ▪  Other ▪  Physiological sensors
  48. 48. Twiddler Input ▪  Chording or multi-tap input ▪  Possible to achieve 40 - 60 wpm after 30+ hours ▪  cf 20 wpm on T9, or 60+ wpm for QWERTY Lyons, K., Starner, T., Plaisted, D., Fusia, J., Lyons, A., Drew, A., & Looney, E. W. (2004, April). Twiddler typing: One-handed chording text entry for mobile phones. In Proceedings of the SIGCHI conference on Human factors in computing systems (pp. 671-678). ACM.
  49. 49. Virtual Keyboards ▪  In air text input ▪  Virtual QWERTY keyboard up to 20 wpm ▪  Word Gesture up to 28 wpm ▪  Handwriting around 20-30 wpm Markussen, A., et al. (2014). Vulture: A Mid-Air Word-Gesture Keyboard. In Proceedings of CHI 2014. ACM.
  50. 50. Unobtrusive Input Devices ▪  GestureWrist ▪  Capacitive sensing, changes with hand shape Rekimoto, J. (2001). Gesturewrist and gesturepad: Unobtrusive wearable interaction devices. In Wearable Computers, 2001. Proceedings. Fifth International Symposium on (pp. 21-27). IEEE.
  51. 51. Unobtrusive Input Devices ▪  GesturePad ▪  Capacitive multilayered touchpads ▪  Supports interactive clothing
  52. 52. Skinput ▪  Appropriating the skin as an on-body input surface (bio-acoustic tap sensing; related work uses EMG to detect muscle activity) Tan, D., Morris, D., & Saponas, T. S. (2010). Interfaces on the go. XRDS: Crossroads, The ACM Magazine for Students, 16(4), 30-34.
  53. 53. Issues to Consider ▪  Fatigue ▪  “Gorilla arm” from free-hand input ▪  Comfort ▪  People want to make small gestures at the waist ▪  Interaction on the go ▪  Can input be done while moving?
  54. 54. DESIGN GUIDELINES
  55. 55. INTERACTION DESIGN
  56. 56. Design For the Device •  Simple, relevant information •  Complement existing devices
  57. 57. The Now Machine ▪  Focus on location, contextual and timely information, and communication.
  58. 58. Don’t design an app Glass OS is time-based model, not an app model.
  59. 59. The world is the experience. Get the interface and interactions out of the way.
  60. 60. It's like a rear view mirror. Don't overload the user. Stick to the absolutely essential, avoid long interactions. Be explicit.
  61. 61. Micro Interactions ▪  The position of the display and limited input ability make longer interactions less comfortable. ▪  Using it shouldn’t take longer than taking out your phone.
  62. 62. Micro-Interactions On mobiles people split attention between display and real world
  63. 63. Time Looking at Screen Oulasvirta, A. (2005). The fragmentation of attention in mobile interaction, and what to do with it. interactions, 12(6), 16-18.
  64. 64. Design for MicroInteractions ▪  Design interaction less than a few seconds –  Tiny bursts of interaction –  One task per interaction –  One input per interaction ▪  Benefits –  Use limited input –  Minimize interruptions –  Reduce attention fragmentation
  65. 65. Make it Glanceable •  Seek to rigorously reduce information density. •  Design for recognition, not reading. Bad Good
  66. 66. Reduce the Number of Info Chunks •  You are designing for recognition, not reading. •  Reducing the total # of information chunks will greatly increase the glanceability of your design.
  67. 67. Design single interactions < 4 s ▪  3-chunk design: ~4 eye movements ≈ 920 ms ▪  6-chunk design: ~8 eye movements ≈ 1,840 ms (assuming ~230 ms per eye movement)
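The arithmetic behind these timing estimates can be sketched with a minimal model (an illustrative assumption drawn from the slide: one eye fixation per information chunk, at roughly 230 ms each):

```java
// Back-of-envelope glanceability model: total glance time is
// approximately (number of eye fixations) x (~230 ms per fixation).
public class GlanceTime {
    static final int MS_PER_FIXATION = 230; // rough figure from the slide

    static int glanceMs(int fixations) {
        return fixations * MS_PER_FIXATION;
    }

    public static void main(String[] args) {
        // The 3-chunk layout needs ~4 fixations; the 6-chunk layout ~8.
        System.out.println(glanceMs(4)); // ~920 ms, well under a 4 s budget
        System.out.println(glanceMs(8)); // ~1840 ms, twice the cost
    }
}
```

Halving the number of chunks roughly halves the glance time, which is why reducing information density matters more than polishing any single element.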
  68. 68. Test the glanceability of your design
  69. 69. Don’t Get in the Way •  Enhance, not replace, real world interaction
  70. 70. Design for Interruptions ▪  Gradually increase engagement and attention load ▪  Respond to user engagement Receiving SMS on Glass “Bing” Tap Swipe Glass Show Message Start Reply User Look Up Say Reply
  71. 71. Do one thing at a time
  72. 72. Keep it Relevant •  Information at the right time and place
  73. 73. Design for Context
  74. 74. Avoid the Unexpected •  Don’t send unexpected content at wrong times •  Make it clear to users what your application does
  75. 75. Build for People •  Use imagery, voice interaction, natural gestures •  Focus on fire and forget interaction model
  76. 76. VISUAL DESIGN
  77. 77. Transparent displays are tricky. Colors are funny and inconsistent. You can only add light to a scene, not cover anything up. Motion can be disorienting. Clarity, contrast, brightness, visual field and attention are important.
  78. 78. White is your new black
  79. 79. Establish hierarchy with color White is your <h1> and grey is your <h2> or <h3>. Footer text - establishing time, attribution, or distance - is the only place with smaller font size.
  80. 80. Use brand-specific typography
  81. 81. Test your design indoors + outdoors
  82. 82. EXAMPLE APPLICATIONS
  83. 83. Glassware Applications •  https://glass.google.com/glassware
  84. 84. Virtual Exercise Companion •  GlassFitGames –  http://www.glassfitgames.com
  85. 85. Vipaar Telemedicine •  Vipaar + UAB - http://www.vipaar.com •  Endoscopic view streamed remotely •  Remote expert adds hands – viewed in Glass
  86. 86. CityViewAR •  Using AR to visualize Christchurch city buildings – 3D models of buildings, 2D images, text, panoramas – ARView, Map view, List view – Available on Android/iOS market
  87. 87. CityViewAR on Glass •  AR overlay of virtual buildings in Christchurch
  88. 88. PROTOTYPING TOOLS
  89. 89. How can we quickly prototype Wearable experiences with little or no coding?
  90. 90. Why Prototype? ▪  Quick visual design ▪  Capture key interactions ▪  Focus on user experience ▪  Communicate design ideas ▪  “Learn by doing/experiencing”
  91. 91. Prototyping Tools ▪  Static/Low fidelity ▪  Sketching ▪  User interface templates ▪  Storyboards/Application flows ▪  Screen sharing ▪  Interactive/High fidelity ▪  Wireframing tools ▪  Mobile prototyping ▪  Native Coding
  92. 92. Important Note ▪  Most current wearables run Android OS ▪  eg Glass, Vuzix, Atheer, Epson, etc ▪  So many tools for prototyping on Android mobile devices will work for wearables ▪  If you want to learn to code, learn ▪  Java, Android, Javascript/PHP
  93. 93. Typical Development Steps ▪  Sketching ▪  Storyboards ▪  UI Mockups ▪  Interaction Flows ▪  Video Prototypes ▪  Interactive Prototypes ▪  Final Native Application Increased Fidelity & Interactivity
  94. 94. Low Fidelity Tools •  Sketching •  GlassSim •  UI Templates •  Storyboards •  GlassWare flow designer •  Android Design Preview •  Video sketches
  95. 95. High Fidelity Tools •  UXPin/Proto.io •  JustinMind •  Processing •  WearScript •  Unity3D •  Native Coding
  96. 96. Sketched Interfaces ▪  Sketch + Powerpoint/Photoshop/Illustrator
  97. 97. GlassSim – http://glasssim.com/ ▪  Simulate the view through Google Glass ▪  Multiple card templates
  98. 98. GlassSim Card Builder ▪  Use HTML for card details ▪  Multiple templates ▪  Change background ▪  Own image ▪  Camera view
  99. 99. GlassSim Samples
  100. 100. Glass UI Templates ▪  Google Glass Photoshop Templates ▪  http://glass-ui.com/ ▪  http://dsky9.com/glassfaq/the-google-glass-psd-template/
  101. 101. Application Storyboard ▪  http://dsky9.com/glassfaq/google-glass-storyboard-template-download/
  102. 102. Glassware Flow Designer •  Features –  Design using common patterns and layouts –  Specify interactions and card flow –  Share with other designers •  Available from: –  https://developers.google.com/glass/tools-downloads/glassware-flow-designer
  103. 103. Example Flow
  104. 104. Screen Sharing ▪  Android Design Preview –  Tool for sharing screen content onto Glass –  https://github.com/romannurik/AndroidDesignPreview/releases
  105. 105. Video Sketching ▪ Series of still photos in a movie format. ▪ Demonstrates the experience of the product ▪ Discover where the concept needs fleshing out. ▪ Communicate experience and interface ▪ You can use whatever tools, from Flash to iMovie.
  106. 106. Example: Glass Vine UI ▪  See https://vine.co/v/bgIaLHIpFTB
  107. 107. Limitations ▪  Positives ▪  Good for documenting screens ▪  Can show application flow ▪  Negatives ▪  No interactivity/transitions ▪  Can’t be used for testing ▪  Can’t deploy on wearable ▪  Can be time consuming to create
  108. 108. Interactive Wireframing ▪  Developing interactive interfaces/wireframes ▪  Transitions, user feedback, interface design ▪  Web based tools ▪  UXpin - http://www.uxpin.com/ ▪  proto.io - http://www.proto.io/ ▪  Native tools ▪  Justinmind - http://www.justinmind.com/ ▪  Axure - http://www.axure.com/
  109. 109. UXpin - www.uxpin.com ▪  Web based wireframing tool ▪  Mobile/Desktop applications ▪  Glass templates, run in browser
  110. 110. Proto.io - http://www.proto.io/ ▪  Web based mobile prototyping tool ▪  Features ▪  Prototype for multiple devices ▪  Gesture input, touch events, animations ▪  Share with collaborators ▪  Test on device
  111. 111. Proto.io - Interface
  112. 112. Demo: Building a Simple Flow
  113. 113. Gesture Flow (six screens, Scr1-Scr6, linked by Tap and Swipe transitions)
  114. 114. Start Transitions
  115. 115. Justinmind ▪  Native wireframing tool ▪  Build mobile apps without programming ▪  drag and drop, interface templates ▪  web based simulation ▪  test on mobile devices ▪  collaborative project sharing ▪  Templates for Glass, custom templates
  116. 116. User Interface - Glass Templates
  117. 117. Web Simulation Tool
  118. 118. Wireframe Limitations ▪  Can’t deploy on Glass ▪  No access to sensor data ▪  Camera, orientation sensor ▪  No multimedia playback ▪  Audio, video ▪  Simple transitions ▪  No conditional logic
  119. 119. Processing ▪  Programming tool for Artists/Designers ▪  http://processing.org ▪  Easy to code, Free, Open source, Java based ▪  2D, 3D, audio/video support ▪  Processing For Android ▪  http://wiki.processing.org/w/Android ▪  Strong Android support, builds .apk file
  120. 120. Basic Processing Sketch
      /* Notes comment */
      // set up global variables
      float moveX = 50;
      // initialize the sketch
      void setup() {
      }
      // draw every frame
      void draw() {
      }
  121. 121. Importing Libraries ▪  Can add functionality by Importing Libraries ▪  java archives - .jar files ▪  Include import code import processing.opengl.*; ▪  Popular Libraries ▪  Minim - audio library, OCD - 3D camera views ▪  bluetoothDesktop - bluetooth networking
  122. 122. Processing and Glass ▪  One of the easiest ways to build rich interactive wearable applications ▪  focus on interactivity, not coding ▪  Collects all sensor input ▪  camera, accelerometer, touch ▪  Can build native Android .apk files ▪  Side load onto Glass
  123. 123. Hello World Image
      PImage img; // create an image variable
      void setup() {
        size(640, 360);
        // load the "ok glass" home screen image into the program
        img = loadImage("okGlass.jpg");
      }
      void draw() {
        // display the image at its actual size at point (0,0)
        image(img, 0, 0);
      }
  124. 124. Demo
  125. 125. Touch Pad Input ▪  Tap recognized as DPAD input
      void keyPressed() {
        if (key == CODED) {
          if (keyCode == DPAD) {
            // Do something ..
          }
        }
      }
      ▪  Java code to capture rich motion events ▪  import android.view.MotionEvent;
  126. 126. Motion Event
      // Glass touch events - reads from the touch pad
      public boolean dispatchGenericMotionEvent(MotionEvent event) {
        float x = event.getX(); // get x/y coords
        float y = event.getY();
        int action = event.getActionMasked(); // get code for action
        switch (action) { // let us know which action code shows up
          case MotionEvent.ACTION_MOVE:
            touchEvent = "MOVE";
            xpos = myScreenWidth - x * touchPadScaleX;
            ypos = y * touchPadScaleY;
            break;
        }
        return super.dispatchGenericMotionEvent(event);
      }
  127. 127. Demo
  128. 128. Sensors ▪  Ketai Library for Processing ▪  https://code.google.com/p/ketai/ ▪  Support all phone sensors ▪  GPS, Compass, Light, Camera, etc ▪  Include Ketai Library ▪  import ketai.sensors.*; ▪  KetaiSensor sensor;
  129. 129. Using Sensors ▪  Set up in the setup() function ▪  sensor = new KetaiSensor(this); ▪  sensor.start(); ▪  sensor.list(); ▪  Event based sensor reading void onAccelerometerEvent(…){ accelerometer.set(x, y, z); }
  130. 130. Sensor Demo
  131. 131. Using the Camera ▪  Import camera library ▪  import ketai.camera.*; ▪  KetaiCamera cam; ▪  Set up in the setup() function cam = new KetaiCamera(this, 640, 480, 15); ▪  Draw camera image void draw() { image(cam, width/2, height/2); }
  132. 132. Camera Demo
  133. 133. Native Coding ▪  For best performance need native coding ▪  Low level algorithms etc ▪  Most current wearables based on Android OS ▪  Need Java/Android skills ▪  Many devices have custom API/SDK ▪  Vuzix M-100: Vuzix SDK ▪  Glass: Mirror API, Glass Developer Kit (GDK)
  134. 134. Glassware Development ▪  Mirror API ▪  Server programming, online/web application ▪  Static cards / timeline management ▪  GDK ▪  Android programming, Java (+ C/C++) ▪  Live cards ▪  See: https://developers.google.com/glass/
  135. 135. Mirror API ▪  REST API ▪  Java servlet, PHP, Go, Python, Ruby, .NET ▪  Timeline based apps ▪  Static cards -  Text, HTML, media attachment (image & video) ▪  Manage timeline -  Subscribe to timeline notifications, contacts -  Location based services
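On the wire, a static Mirror API card reduces to a JSON timeline item POSTed to the timeline endpoint. The sketch below only builds and prints the request; OAuth 2.0 authorization and the actual HTTPS call (which real Glassware must perform) are omitted, and the helper names are illustrative:

```java
// Hypothetical sketch of the REST request behind inserting a static
// text card. Real Glassware attaches an OAuth 2.0 access token and
// sends this body over HTTPS.
public class MirrorCardSketch {
    static final String TIMELINE_ENDPOINT =
        "https://www.googleapis.com/mirror/v1/timeline";

    // Build the JSON body for a simple text-only timeline card.
    static String textCardJson(String text) {
        return "{\"text\": \"" + text + "\"}";
    }

    public static void main(String[] args) {
        System.out.println("POST " + TIMELINE_ENDPOINT);
        System.out.println(textCardJson("Hello from the Glass Class"));
    }
}
```

Because the server does all the work, Mirror API Glassware can be written in any of the languages listed above; only the JSON shape and the endpoint stay the same.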
  136. 136. GDK ▪  Glass Development Kit ▪  Android 4.0.3 ICS + Glass specific APIs ▪  Use standard Android Development Tools
  137. 137. GDK ▪  GDK add-on features ▪  Timeline and cards ▪  Menu and UI ▪  Touch pad and gesture ▪  Media (sound, camera and voice input)
  138. 138. Glass Summary ▪  Use Mirror API if you need ... ▪  Use GDK if you need ... ▪  Or use both
  139. 139. Hardware Prototyping
  140. 140. Build Your Own Wearable ▪  MyVu display + phone + sensors
  141. 141. Beady-i ▪  http://www.instructables.com/id/DIY-Google-Glasses-AKA-the-Beady-i/
  142. 142. Raspberry Pi Glasses ▪  Modify video glasses, connect to a Raspberry Pi ▪  $200 - $300 in parts, simple assembly ▪  https://learn.adafruit.com/diy-wearable-pi-near-eye-kopin-video-glasses
  143. 143. Physical Input Devices ▪  Can we develop unobtrusive input devices? ▪  Reduce need for speech, touch pad input ▪  Socially more acceptable ▪  Examples ▪  Ring, pendant, bracelet, gloves, etc
  144. 144. Prototyping Platform Arduino Kit Bluetooth Shield Google Glass
  145. 145. Example: Glove Input ▪  Buttons on fingertips ▪  Map touches to commands
  146. 146. Example: Ring Input ▪  Touch strip, button, accelerometer ▪  Tap, swipe, flick actions
  147. 147. How it works (bracelet / armband / gloves send values 1-4 as output)
  148. 148. Other Tools ▪  Wireframing ▪  Pidoco, FluidUI ▪  Rapid Development ▪  Phone Gap, AppMachine ▪  Interactive ▪  App Inventor, Unity3D, WearScript
  149. 149. WearScript ▪  JavaScript development for Glass ▪  http://www.wearscript.com/en/ ▪  Script directory ▪  http://weariverse.com/
  150. 150. WearScript Features •  Community of Developers •  Easy development of Glass Applications –  GDK card format –  Support for all sensor input •  Support for advanced features –  Augmented Reality –  Eye tracking –  Arduino input
  151. 151. WearScript Playground •  Test code and run on Glass –  https://api.wearscript.com/
  152. 152. Summary ▪  Prototyping for wearables is similar to mobiles ▪  Tools for UI design, storyboarding, wireframing ▪  Android tools to create interactive prototypes ▪  App Inventor, Processing, etc ▪  Arduino can be used for hardware prototypes ▪  Once prototyped Native Apps can be built
  153. 153. RESEARCH DIRECTIONS
  154. 154. Challenges for the Future (2001) ▪  Privacy ▪  Power use ▪  Networking ▪  Collaboration ▪  Heat dissipation ▪  Interface design ▪  Intellectual tools ▪  Augmented Reality systems Starner, T. (2001). The challenges of wearable computing: Part 1. IEEE Micro, 21(4), 44-52. Starner, T. (2001). The challenges of wearable computing: Part 2. IEEE Micro, 21(4), 54-67.
  155. 155. Interface Design
  156. 156. Gesture Interaction
  157. 157. Gesture Interaction With Glass ▪  3 Gear Systems ▪  Hand tracking ▪  Hand data sent to Glass ▪  Wifi networking ▪  Hand joint position ▪  AR application rendering ▪  Vuforia tracking
  158. 158. Capturing Behaviours ▪  3 Gear Systems ▪  Kinect/Primesense Sensor ▪  Two hand tracking ▪  http://www.threegear.com
  159. 159. Performance ▪  Full 3D hand model input ▪  10 - 15 fps tracking, 1 cm fingertip resolution
  160. 160. Meta Gesture Interaction ▪ Depth sensor + Stereo see-through ▪ https://www.spaceglasses.com/
  161. 161. Collaboration
  162. 162. Social Panoramas
  163. 163. Ego-Vision Collaboration ▪  Wearable computer ▪  camera + processing + display + connectivity
  164. 164. Current Collaboration ▪  First person remote conferencing/hangouts ▪  Limitations -  Single POV, no spatial cues, no annotations, etc
  165. 165. Social Panoramas ▪  Capture and share social spaces in real time ▪  Enable remote people to feel like they’re with you
  166. 166. Key Technology ▪  Google Glass ▪  Capture live panorama (compass + camera) ▪  Capture spatial audio, live video ▪  Remote device (desktop, tablet) ▪  Immersive viewing, live annotation
  167. 167. Awareness Cues ▪  Where is my partner looking? ▪  Enhanced radar display, Context compass
  168. 168. Interaction ▪  Glass Touchpad Input/Tablet Input ▪  Shared pointers, Shared drawing
  169. 169. Cognitive Models
  170. 170. Modeling Cognitive Processes •  Model cognitive processes – Based on cognitive psychology •  Use model to: – Identify opportunity for wearable – Predict user’s cognitive load
  171. 171. Typical Cognitive Model 1.  Functional Modularity: cognitive system divided into functionally separate systems 2.  Parallel Module Operation: cognitive modules operate in parallel, independent of each other 3. Limited Capacity: cognitive modules are limited in capacity with respect to time or content 4. Serial Central Operation: central coordination of modules (eg monitoring) is serial
  172. 172. Cognitive Interference ▪  Structural interference ▪  Two or more tasks compete for limited resources of a peripheral system -  eg two cognitive processes needing vision ▪  Capacity interference ▪  Total available central processing overwhelmed by multiple concurrent tasks -  eg trying to add and count at same time
  173. 173. Example: Going to work .. Which is the most cognitively demanding?
  174. 174. Cognitive Resources & Limitations
  175. 175. Application of Cognitive Model Busy street > Escalator > Café > Laboratory. But if you made Wayfinding, Path Planning, Estimating Time to Target, Collision Avoidance easier?
  176. 176. Social Perception
  177. 177. How is the User Perceived?
  178. 178. GlassHoles
  179. 179. TAT Augmented ID
  180. 180. The Future of Wearables
  181. 181. RESOURCES
  182. 182. Online Wearables Exhibit Online at http://wcc.gatech.edu/exhibition
  183. 183. Glass Developer Resources ▪  Main Developer Website ▪  https://developers.google.com/glass/ ▪  Glass Apps Developer Site ▪  http://glass-apps.org/glass-developer ▪  Google Design Guidelines Site ▪  https://developers.google.com/glass/design/index
  184. 184. Other Resources ▪  AR for Glass Website ▪  http://www.arforglass.org/ ▪  Vandrico Database of wearable devices ▪  http://vandrico.com/database
  185. 185. Glass UI Design Guidelines •  More guidelines –  https://developers.google.com/glass/design/index
  186. 186. Books ▪  Programming Google Glass ▪  Eric Redmond ▪  Rapid Android Development: Build Rich, Sensor-Based Applications with Processing ▪  Daniel Sauter
  187. 187. •  Beginning Google Glass Development by Jeff Tang
  188. 188. •  Microinteractions: Designing with Details – Dan Saffer – http://microinteractions.com/
  189. 189. Conclusions •  Wearable computing represents a fourth generation of computing devices •  Google Glass is the first consumer wearable –  Lightweight, usable, etc •  A range of wearables will appear in 2014 –  Ecosystem of devices •  Significant research opportunities exist –  User interaction, displays, social impact
  190. 190. Contact Details Mark Billinghurst ▪  email: mark.billinghurst@hitlabnz.org ▪  twitter: @marknb00 Feedback + followup form ▪  goo.gl/6SdgzA
