The document discusses the design of wearable interfaces and head mounted displays. It provides biographies of the two speakers, Mark Billinghurst and Rob Lindeman, and an agenda for their presentation on the history, principles, and future of wearable design. The presentation will cover wearable technologies, prototyping tools, application examples, and research directions. Attendees will have the opportunity to try various head mounted displays.
Designing Wearable Interfaces: A History and Overview
1. The Glass Class
Designing Wearable Interfaces
May 27th, AWE 2014
Mark Billinghurst
HIT Lab NZ
University of Canterbury
mark.billinghurst@canterbury.ac.nz
Rob Lindeman
HIVE Lab
Worcester Polytechnic Institute
gogo@wpi.edu
3. Mark Billinghurst
▪ Director of HIT Lab NZ, University
of Canterbury
▪ PhD Univ. Washington
▪ Research on AR, mobile HCI,
Collaborative Interfaces
▪ More than 250 papers in AR, VR,
interface design
▪ Sabbatical in Glass team at
Google [x] in 2013
4. Rob Lindeman
▪ Director of HIVE Lab, Worcester
Polytechnic Institute
▪ PhD The George Washington Univ.
▪ Research on 3DUI, VR, Gaming,
HRI since 1993
▪ Been wearing Glass non-stop
(mostly, anyway) since Sept. 2013
▪ Sabbatical at HIT Lab NZ in 2011-12
▪ Program Co-Chair, ISMAR 2014
▪ Love geocaching, soccer, skiing
8. Course Goals
In this course you will learn
▪ Introduction to head mounted wearable computers
▪ Understanding of current wearable technology
▪ Key design principles/interface metaphors
▪ Relevant human cognition/perception principles
▪ Rapid prototyping tools
▪ Overview of native coding/application development
▪ Areas for future research
▪ Hands on experience with the technology
9. What You Won’t Learn
▪ Who the companies/universities in this space are
▪ See the AWE exhibit floor
▪ Designing for non-HMD based interfaces
▪ Watches, fitness bands, etc
▪ How to develop wearable hardware
▪ optics, sensor assembly, etc
▪ Evaluation methods
▪ Experimental design, statistics, etc
10. Schedule
1:30 1. Introduction (Mark + Rob)
1:35 2. History and Technology (Mark)
1:55 3. User Experience and Design Principles (Mark)
2:15 4. Prototyping Tools (Mark)
2:50 Break/Demo
3:15 5. Native Programming (Rob)
4:00 6. Application Case studies (Det)
4:30 7. Technical Q & A (Everyone)
5:00 8. Research Directions (Mark + Rob)
5:30 Finish
11. Display Demos You Can Try
Google Glass Display
Glass UI, AR demos, Games, multimedia capture
Vuzix M-100 Display
Monocular display
Epson BT-100, Epson BT-200
See through displays
AR Rift
Oculus Rift for AR
Recon Snow
Micro-display integrated into ski goggles
More at the AWE Exhibits
13. A Brief History of Time
▪ Trend
▪ smaller, cheaper, more functions, more intimate
▪ Time pieces moved from public space onto the body
13th Century, 18th Century, 20th Century [images of timepieces]
14. A Brief History of Computing
Trend
▪ Smaller, cheaper, faster, more intimate
▪ Moving from fixed to handheld and onto body
1950’s
1980’s
1990’s
16. What is a Wearable Computer ?
▪ A computer that is:
▪ Portable while operational
▪ Enables hands-free/hands-limited use
▪ Able to get the user’s attention
▪ Is always on, acting on behalf of the user
▪ Able to sense the user’s current context
Rhodes, B. J. (1997). The wearable remembrance agent: A system for
augmented memory. Personal Technologies, 1(4), 218-224.
17. In Other Words ..
▪ A computer that is ..
▪ Eudaemonic: User considers it part of him/herself
▪ Existential: User has complete control of the system
▪ Ephemeral: System always operating at some level
Mann, S. (1997). Wearable computing: A first step toward personal
imaging. Computer, 30(2), 25-32.
18. Wearable Computing
▪ Computer on the body that is:
▪ Always on
▪ Always accessible
▪ Always connected
▪ Other attributes
▪ Augmenting user actions
▪ Aware of user and surroundings
19. The Ideal Wearable
▪ Persists and Provides Constant Access: Designed
for everyday and continuous use over a lifetime.
▪ Senses and Models Context: Observes and models
the user's environment, mental state, and its own state.
▪ Augments and Mediates: Information support for
the user in both the physical and virtual realities.
▪ Interacts Seamlessly: Adapts its input and output
modalities to those most appropriate at the time.
Starner, T. E. (1999). Wearable computing and contextual awareness
(Doctoral dissertation, Massachusetts Institute of Technology).
21. Augmented Interaction
Rekimoto, J., & Nagao, K. (1995, December). The world through the computer:
Computer augmented interaction with real world environments. In Proceedings of the
8th annual ACM symposium on User interface and software technology (pp. 29-36).
22. ● Mixed Reality Continuum
Milgram, P., & Kishino, F. (1994). A taxonomy of mixed reality visual displays.
IEICE TRANSACTIONS on Information and Systems, 77(12), 1321-1329.
25. History of Wearables
▪ 1960-90: Early Exploration
▪ Custom-built devices
▪ 1990 - 2000: Academic, Military Research
▪ MIT, CMU, Georgia Tech, EPFL, etc
▪ 1997: ISWC conference starts
▪ 1995 – 2005+: First Commercial Uses
▪ Niche industry applications, Military
▪ 2010 - : Second Wave of Wearables
▪ Consumer applications, Head Worn
26. Thorp and Shannon (1961)
▪ Wearable timing device for roulette prediction
▪ Audio feedback, four button input
Ed Thorp
Thorp, E. O. (1998, October). The invention of the first wearable computer. In
Wearable Computers, 1998. Second International Symposium on (pp. 4-8). IEEE.
27. Keith Taft (1972)
▪ Wearable computer for blackjack card
counting
▪ Toe input, LED in Glasses for feedback
[Images: belt computer, shoe input, glasses display]
34. Early Technology
▪ Computing
▪ Belt or Backpack
▪ Displays
▪ Head Mounted, LCD Panel, Audio
▪ Input Devices
▪ Chording Keyboard, Speech, Camera
▪ Networking
▪ Wireless LAN, Infra-Red, Cellular
35. US Military Wearables (1989- )
▪ Early experimentation
▪ 386 computer, VGA display
▪ GPS, mapping software
▪ Land Warrior (1991-)
▪ Integrated wearable system
▪ Camera, colour display, radio
▪ Navigation, reports, photos
Zieniewicz, M. J., Johnson, D. C., Wong, C., & Flatt, J. D. (2002). The evolution of
army wearable computers. IEEE Pervasive Computing, 1(4), 30-40.
40. Mobile AR: Touring Machine (1997)
▪ Columbia University
▪ Feiner, MacIntyre, Höllerer, Webster
▪ Combines
▪ See through head mounted display
▪ GPS tracking
▪ Orientation sensor
▪ Backpack PC (custom)
▪ Tablet input
Feiner, S., MacIntyre, B., Höllerer, T., & Webster, A. (1997). A touring machine: Prototyping 3D mobile
augmented reality systems for exploring the urban environment. Personal Technologies, 1(4), 208-217.
41. MARS View
▪ Virtual tags overlaid on the real world
▪ “Information in place”
42. Backpack/Wearable Systems
1997 Backpack Wearables
▪ Feiner’s Touring Machine
▪ AR Quake (Thomas)
▪ Tinmith (Piekarski)
▪ MCAR (Reitmayr)
▪ Bulky, HMD based
Piekarski, W., & Thomas, B. (2002). ARQuake: the outdoor
augmented reality gaming system. Communications of the
ACM, 45(1), 36-38.
43. Mobile AR - Hardware
▪ Columbia Touring Machine: example self-built working solution with PCI-based 3D graphics
▪ [Diagram labels: PCI 3D graphics board, hard drive, serial ports, CPU, PC104 sound card, PC104 PCMCIA, GPS antenna, RTK correction antenna, HMD controller, tracker controller, DC-to-DC converter, battery, wearable computer, GPS RTK correction radio]
44. HIT Lab NZ Wearable AR (2004)
▪ Highly accurate outdoor AR
tracking system
▪ GPS, Inertial, RTK system
▪ HMD
▪ First prototype
▪ Laptop based
▪ Video see-through HMD
▪ 2-3 cm tracking accuracy
46. 2009 - Layar (www.layar.com)
• Location based data
– GPS + compass location
– Map + camera view
• AR Layers on real world
– Customized data
– Audio, 3D, 2D content
• Easy authoring
• Android, iPhone
47. Wearable Evolution
Backpack+HMD:
…10+ kg
Handheld + HMD
… Separate sensors
.... UMPC 1.1GHz
…1.5kg
…still >$5K
Scale it down more:
Smartphone…$500
…Integrated
…0.1kg
…billions of units
1997 2003 2007
60. Summary
Wearables are a new class of computing
Intimate, persistent, aware, accessible, connected
Evolution over 50 year history
Backpack to head worn
Custom developed to consumer ready device
Enables new applications
Collaboration, memory, AR, industry, etc
Many head worn wearables are coming
Android based, sensor package, micro-display
64. Key Properties of HMD
▪ Field of View
▪ Human eye 95 deg. H, 60/70 deg. V
▪ Resolution
▪ > 320x240 pixel
▪ Refresh Rate
▪ Focus
▪ Fixed/manual
▪ Size, Weight
▪ < 350g for long term
▪ Power
65. Types of Head Mounted
Displays
Occluded
See-thru
Multiplexed
69. Strengths of optical see-through
▪ Simpler (cheaper)
▪ Direct view of real world
▪ Full resolution, no time delay (for real world)
▪ Safety
▪ Lower distortion
▪ No eye displacement
▪ see directly through display
72. Vuzix Wrap 1200DXAR
▪ Stereo video see-through display ($1500)
■ Twin 852 x 480 LCD displays, 35 deg. FOV
■ Stereo VGA cameras
■ 3 DOF head tracking
73. Strengths of Video See-Through
▪ True occlusion
▪ Block image of real world
▪ Digitized image of real world
▪ Flexibility in composition
▪ Matchable time delays
▪ More registration, calibration strategies
▪ Wide FOV is easier to support
▪ wide FOV camera
74. Multiplexed Displays
▪ Above or below line of sight
▪ Strengths
▪ User has unobstructed view of real world
▪ Simple optics/cheap
▪ Weaknesses
▪ Direct information overlay difficult
• Display/camera offset from eyeline
▪ Wide FOV difficult
75. Vuzix M-100
▪ Monocular multiplexed display ($1000)
■ 852 x 480 LCD display, 15 deg. FOV
■ 5 MP camera, HD video
■ GPS, gyro, accelerometer
76. Display Types
▪ Curved Mirror
▪ off-axis projection
▪ curved mirrors in front of eye
▪ high distortion, small eye-box
▪ Waveguide
▪ use internal reflection
▪ unobstructed view of world
▪ large eye-box
77. See-through thin displays
▪ Waveguide techniques for thin see-through displays
▪ Wider FOV, enable AR applications
▪ Social acceptability
Optinvent ORA
82. Twiddler Input
▪ Chording or multi-tap input
▪ Possible to achieve 40 - 60 wpm after 30+ hours
▪ Chording input about 50% faster than multi-tap
▪ cf 20 wpm on T9, or 60+ wpm for QWERTY
Lyons, K., Starner, T., Plaisted, D., Fusia, J., Lyons, A., Drew, A., & Looney, E. W. (2004, April).
Twiddler typing: One-handed chording text entry for mobile phones. In Proceedings of the SIGCHI
conference on Human factors in computing systems (pp. 671-678). ACM.
83. Virtual Keyboards
▪ In air text input
▪ Virtual QWERTY keyboard up to 20 wpm
- On real keyboard around 45-60+ wpm
▪ Word Gesture up to 28 wpm
- On tablet/phone Word Gesture up to 47 wpm
▪ Handwriting around 20-30 wpm
A. Markussen, et al. Vulture: A Mid-Air Word-Gesture Keyboard (CHI 2014)
84. Unobtrusive Input Devices
▪ GestureWrist
▪ Capacitive sensing
▪ Change signal depending on hand shape
Rekimoto, J. (2001). Gesturewrist and gesturepad: Unobtrusive wearable interaction devices. In
Wearable Computers, 2001. Proceedings. Fifth International Symposium on (pp. 21-27). IEEE.
86. Skinput
Bio-acoustic sensing of taps on the skin; related work uses EMG to detect muscle activity
Tan, D., Morris, D., & Saponas, T. S. (2010). Interfaces on the go. XRDS:
Crossroads, The ACM Magazine for Students, 16(4), 30-34.
87. Issues to Consider
▪ Fatigue
▪ "Gorilla arm" from free-hand input
▪ Comfort
▪ People prefer to make small gestures near the waist
▪ Interaction on the go
▪ Can input be done while moving?
88. Interaction on the Go
▪ Fitts' law still applies while interacting on the go
▪ e.g. tapping while walking reduces speed by > 35%
▪ Increased errors while walking
Lin, M., Goldman, R., Price, K. J., Sears, A., & Jacko, J. (2007). How do people tap when walking? An
empirical investigation of nomadic data entry.International Journal of Human-Computer Studies, 65(9),
759-769.
95. User Experience
• Truly Wearable Computing
– Less than 46 grams
• Hands-free Information Access
– Voice interaction, Ego-vision camera
• Intuitive User Interface
– Touch, Gesture, Speech, Head Motion
• Access to all Google Services
– Map, Search, Location, Messaging, Email, etc
105. ● CityViewAR
▪ Using AR to visualize Christchurch city buildings
▪ 3D models of buildings, 2D images, text, panoramas
▪ AR View, Map view, List view
▪ Available on Android/iOS market
111. ● Time Looking at Screen
Oulasvirta, A. (2005). The fragmentation of attention in mobile
interaction, and what to do with it. interactions, 12(6), 16-18.
113. Design for MicroInteractions
▪ Design interaction less than a few seconds
▪ Tiny bursts of interaction
▪ One task per interaction
▪ One input per interaction
▪ Benefits
▪ Use limited input
▪ Minimize interruptions
▪ Reduce attention fragmentation
114. ● Design for Cognitive Load
[Figure: cognitive continuums for (a) input and (b) output; cognitive load increases from left to right]
115. Design for Interruptions
▪ Gradually increase engagement and attention load
▪ Respond to user engagement
Example: receiving an SMS on Glass
▪ Glass: "Bing" notification, show message, start reply
▪ User: look up, tap, swipe, say reply
116. Nomadic Radio (2000)
▪ Spatial audio wearable interface
Sawhney, N., & Schmandt, C. (2000). Nomadic radio: speech and audio interaction for contextual
messaging in nomadic environments. ACM transactions on Computer-Human interaction (TOCHI),
7(3), 353-383.
120. ● 1. Design For the Device
▪ Simple, relevant information
▪ Complement existing devices
121. ● 2. Don’t Get in the Way
▪ Enhance, not replace, real world interaction
122. ● 3. Keep it Relevant
▪ Information at the right time and place
123. ● 4. Avoid the Unexpected
▪ Don’t send unexpected content at wrong times
▪ Make it clear to users what your application does
124. ● 5. Build for People
▪ Use imagery, voice interaction, natural gestures
▪ Focus on a fire-and-forget interaction model
125. Other Guidelines
▪ Don’t design a mobile app
▪ Design for emotion
▪ Make it glanceable
▪ Do one thing at a time
▪ Reduce number of information chunks
▪ Design for indoor and outdoor use
126. ● As technology becomes more personal and immediate, it can start to disappear.
[Continuum: distant → intimate]
131. Important Note
▪ Most current wearables run Android OS
▪ eg Glass, Vuzix, Atheer, Epson, etc
▪ So many tools for prototyping on Android
mobile devices will work for wearables
▪ If you want to learn to code, learn
▪ Java, Android, Javascript/PHP
132. Typical Development Steps
▪ Sketching
▪ Storyboards
▪ UI Mockups
▪ Interaction Flows
▪ Video Prototypes
▪ Interactive Prototypes
▪ Final Native Application
Increased
Fidelity &
Interactivity
143. Limitations
▪ Positives
▪ Good for documenting screens
▪ Can show application flow
▪ Negatives
▪ No interactivity/transitions
▪ Can’t be used for testing
▪ Can’t deploy on wearable
▪ Can be time consuming to create
146. Video Sketching
▪ Series of still photos in a movie format
▪ Demonstrates the experience of the product
▪ Discover where the concept needs fleshing out
▪ Communicate experience and interface
▪ You can use whatever tools you like, from Flash to iMovie
151. UXpin - www.uxpin.com
▪ Web based wireframing tool
▪ Mobile/Desktop applications
▪ Glass templates, run in browser
https://www.youtube.com/watch?v=0XtS5YP8HcM
152. Proto.io - http://www.proto.io/
▪ Web based mobile prototyping tool
▪ Features
▪ Prototype for multiple devices
▪ Gesture input, touch events, animations
▪ Share with collaborators
▪ Test on device
158. Justinmind - http://www.justinmind.com/
▪ Native wireframing tool
▪ Build mobile apps without programming
▪ drag and drop, interface templates
▪ web based simulation
▪ test on mobile devices
▪ collaborative project sharing
▪ Templates for Glass, custom templates
161. Wireframe Limitations
▪ Can’t deploy on Glass
▪ No access to sensor data
▪ Camera, orientation sensor
▪ No multimedia playback
▪ Audio, video
▪ Simple transitions
▪ No conditional logic
▪ No networking
163. Processing
▪ Programming tool for Artists/Designers
▪ http://processing.org
▪ Easy to code, Free, Open source, Java based
▪ 2D, 3D, audio/video support
▪ Processing For Android
▪ http://wiki.processing.org/w/Android
▪ Strong Android support
▪ Generates Android .apk file
164. Processing - Motivation
▪ Language of Interaction
▪ Sketching with code
▪ Support for rich interaction
▪ Large developer community
▪ Active help forums
▪ Dozens of plug-in libraries
▪ Strong Android support
▪ Easy to run on wearables
168. Basic Parts of a Processing Sketch
/* Notes comment */
//set up global variables
float moveX = 50;
//Initialize the sketch
void setup() {
}
//draw every frame
void draw() {
}
169. Importing Libraries
▪ Can add functionality by Importing
Libraries
▪ java archives - .jar files
▪ Include import code
import processing.opengl.*;
▪ Popular Libraries
▪ Minim - audio library
▪ OCD - 3D camera views
▪ Physics - physics engine
▪ bluetoothDesktop - bluetooth networking
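For example, after installing a library you import it and call its API from the sketch. A minimal sketch using the Minim audio library is shown below; the package name ddf.minim and the file groove.mp3 are assumptions, exact APIs vary by library version, and not every library works in Android mode.
import ddf.minim.*;

Minim minim;
AudioPlayer player;

void setup() {
  size(640, 360);
  minim = new Minim(this);
  // hypothetical audio file placed in the sketch's data folder
  player = minim.loadFile("groove.mp3");
  player.play();
}

void draw() {
  background(0);
}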
171. Processing and Glass
▪ One of the easiest ways to build rich
interactive wearable applications
▪ focus on interactivity, not coding
▪ Collects all sensor input
▪ camera, accelerometer, touch
▪ Can build native Android .apk files
▪ Side load onto Glass
172. Example: Hello World
//called initially at the start of the Processing sketch
void setup() {
size(640, 360);
background(0);
}
//called every frame to draw output
void draw() {
background(0);
//draw a white text string showing Hello World
fill(255);
text("Hello World", 50, 50);
}
174. Hello World Image
PImage img; // Create an image variable
void setup() {
size(640, 360);
//load the ok glass home screen image
img = loadImage("okGlass.jpg"); // load the image into the program
}
void draw() {
// Displays the image at its actual size at point (0,0)
image(img, 0, 0);
}
176. Touch Pad Input
▪ Tap recognized as DPAD input
void keyPressed() {
if (key == CODED) {
if (keyCode == DPAD) {
// Do something ..
}
}
}
▪ Java code to capture rich motion events
▪ import android.view.MotionEvent;
177. Motion Event
//Glass Touch Events - reads from touch pad
public boolean dispatchGenericMotionEvent(MotionEvent event) {
float x = event.getX(); // get x/y coords
float y = event.getY();
int action = event.getActionMasked(); // get code for action
switch (action) { // let us know which action code shows up
case MotionEvent.ACTION_DOWN:
touchEvent = "DOWN";
fingerTouch = 1;
break;
case MotionEvent.ACTION_MOVE:
touchEvent = "MOVE";
xpos = myScreenWidth - x*touchPadScaleX;
ypos = y*touchPadScaleY;
break;
}
return true;
}
179. Sensors
▪ Ketai Library for Processing
▪ https://code.google.com/p/ketai/
▪ Support all phone sensors
▪ GPS, Compass, Light, Camera, etc
▪ Include Ketai Library
▪ import ketai.sensors.*;
▪ KetaiSensor sensor;
180. Using Sensors
▪ Set up in the setup( ) function
▪ sensor = new KetaiSensor(this);
▪ sensor.start();
▪ sensor.list();
▪ Event based sensor reading
void onAccelerometerEvent(…)
{
accelerometer.set(x, y, z);
}
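Putting these pieces together, a minimal sketch that reads the accelerometer with Ketai might look like the following; this is a sketch under the assumption that the Ketai library is installed, with the callback name following the Ketai convention shown above.
import ketai.sensors.*;

KetaiSensor sensor;
PVector accelerometer = new PVector();

void setup() {
  size(640, 360);
  sensor = new KetaiSensor(this);
  sensor.start();
}

void draw() {
  background(0);
  // show the latest accelerometer reading
  text("Accel: " + accelerometer.x + ", " + accelerometer.y + ", " + accelerometer.z, 20, 20);
}

// event based sensor reading (called by Ketai when new data arrives)
void onAccelerometerEvent(float x, float y, float z) {
  accelerometer.set(x, y, z);
}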
182. Using the Camera
▪ Import camera library
▪ import ketai.camera.*;
▪ KetaiCamera cam;
▪ Set up in the setup( ) function
▪ cam = new KetaiCamera(this, 640, 480, 15);
▪ Draw camera image
void draw() {
//draw the camera image
image(cam, width/2, height/2);
}
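A minimal complete camera sketch, assuming Ketai is installed and the app has camera permission, could look like this; onCameraPreviewEvent( ) is the Ketai callback that delivers new frames, and cam.read( ) copies the frame into the sketch.
import ketai.camera.*;

KetaiCamera cam;

void setup() {
  size(640, 480);
  cam = new KetaiCamera(this, 640, 480, 15);
  cam.start();           // start capturing frames
}

void draw() {
  image(cam, 0, 0);      // draw the latest camera frame
}

// called by Ketai when a new preview frame is available
void onCameraPreviewEvent() {
  cam.read();
}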
184. Timeline Demo
▪ Create Card Class
▪ load image, card number, children/parent cards
▪ Timeline Demo
▪ Load cards in order
▪ Translate cards with finger motion
▪ Swipe cards in both directions
▪ Snap cards into position
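A hypothetical sketch of the Card class described above; field and method names are illustrative, not the course code.
class Card {
  PImage img;                  // card image
  int number;                  // card number
  Card parent;                 // parent card, if any
  ArrayList<Card> children = new ArrayList<Card>();
  float x;                     // horizontal position on the timeline

  Card(String imageFile, int number) {
    this.img = loadImage(imageFile);
    this.number = number;
  }

  void display(float offsetX) {
    // translate the card with finger motion
    image(img, x + offsetX, 0);
  }
}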
190. Raspberry Pi Glasses
▪ Modify video glasses, connect to a Raspberry Pi
▪ $200 - $300 in parts, simple assembly
▪ https://learn.adafruit.com/diy-wearable-pi-near-eye-kopin-video-glasses
191. Physical Input Devices
▪ Can we develop unobtrusive input devices?
▪ Reduce need for speech or touch pad input
▪ Socially more acceptable
▪ Examples
▪ Ring, pendant, bracelet, gloves, etc.
196. Summary
▪ Prototyping for wearables is similar to mobiles
▪ Tools for UI design, storyboarding, wireframing
▪ Android tools to create interactive prototypes
▪ App Inventor, Processing, etc
▪ Arduino can be used for hardware prototypes
▪ Once prototyped Native Apps can be built
▪ Android + SDK for each platform
197. Other Tools
▪ Wireframing
▪ pidoco
▪ FluidUI
▪ Rapid Development
▪ Phone Gap
▪ AppMachine
▪ Interactive
▪ App Inventor
▪ Unity3D
▪ WearScript
198. App Inventor - http://appinventor.mit.edu/
▪ Visual Programming for Android Apps
▪ Features
▪ Access to Android Sensors
▪ Multimedia output
▪ Drag and drop web based interface
▪ Designer view – app layout
▪ Blocks view – program logic/control
202. ● Unity for Glass Dev
Unity has built-in support for Android device sensors at a low level. Third-party
plugins like GyroDroid provide high-level access to every single sensor.
rotation vector
gyroscope
accelerometer
linear acceleration
gravity
light
proximity
orientation
pressure
magnetic field
processor temperature
ambient temperature
relative humidity
204. ● Unity for Glass Dev
Unity + GDK for Glass Touchpad
Use AndroidInput.touchCountSecondary to get the number of touches on the Glass touchpad.
Use AndroidInput.GetSecondaryTouch() to get a specific touch on the Glass touchpad.
Use AndroidInput.GetSecondaryTouch().phase to detect the touch gesture on the Glass touchpad.
205. ● Unity for Glass Dev
Example
if(AndroidInput.touchCountSecondary == 2)
…… // if there are two touches
if(AndroidInput.GetSecondaryTouch(0).phase ==
TouchPhase.Moved)
…… // if the first touch is moving
float pos1X = AndroidInput.GetSecondaryTouch(1).position.x;
// get the second touch position x value
206. ● Android API in Unity for Glass
Support Touchpad Input for Google Glass
API:
//Indicating whether the system provides secondary touch input.
AndroidInput.secondaryTouchEnabled
//Indicating the height of the secondary touchpad.
AndroidInput.secondaryTouchHeight
//Indicating the width of the secondary touchpad.
AndroidInput.secondaryTouchWidth
//Number of secondary touches..
AndroidInput.touchCountSecondary
//Returns object representing status of a specific touch on a secondary touchpad .
AndroidInput.GetSecondaryTouch
207. ● Android API in Unity for Glass
Example:
/* Detect touch count on the Glass touchpad */
Debug.Log("Touchpad touch count: " + AndroidInput.touchCountSecondary);
if(AndroidInput.touchCountSecondary >= 2) {
......
}
/* Detect touch gesture on the Glass touchpad */
if(AndroidInput.GetSecondaryTouch(0).phase == TouchPhase.Moved) {
......
}
http://docs.unity3d.com/Documentation/ScriptReference/TouchPhase.html
http://stackoverflow.com/questions/20441090/how-can-create-touch-screen-android-scroll-in-unity3d
208. ● Android API in Unity for Glass
Detect Google Glass in Unity C# Script
API:
SystemInfo.deviceModel
Functionality:
Provides the model of the device.
209. ● Android API in Unity for Glass
Example:
/* Show different GUIs for different devices */
Debug.Log("Android model: " + SystemInfo.deviceModel);
if(SystemInfo.deviceModel.Contains("Glass")) {
Debug.Log("Google Glass detected");
// Activate GUI for Glass
......
} else {
Debug.Log("Phone or Tablet detected");
// Activate GUI for Phone or Tablet
......
}
211. ● WearScript Features
▪ Community of Developers
▪ Easy development of Glass Applications
▪ GDK card format
▪ Support for all sensor input
▪ Support for advanced features
▪ Augmented Reality
▪ Eye tracking
▪ Arduino input
214. Overview
▪ For best performance need native coding
▪ Low level algorithms etc
▪ Most current wearables based on Android OS
▪ Need Java/Android skills
▪ Many devices have custom API/SDK
▪ Vusix M-100: Vusix SDK
▪ Glass: Mirror API, Glass Developer Kit (GDK)
217. Glassware and Timeline
▪ Static Cards
▪ Static content with text, HTML, images, and video.
- e.g. notification messages, news clip
▪ Live Cards
▪ Dynamic content updated frequently.
- e.g. compass, timer
▪ Immersions
▪ Takes over the whole control, out from timeline.
- e.g. interactive game
218. Glassware Development
▪ Mirror API
▪ Server programming, online/web application
▪ Static cards / timeline management
▪ GDK
▪ Android programming, Java (+ C/C++)
▪ Live cards & Immersions
▪ See: https://developers.google.com/glass/
219. Mirror API
▪ REST API
▪ Java servlet, PHP, Go, Python, Ruby, .NET
▪ Timeline based apps
▪ Static cards
- Text, HTML, media attachment (image & video)
- Standard and custom menu items
▪ Manage timeline
- Subscribe to timeline notifications
- Sharing with contacts
- Location based services
220. GDK
▪ Glass Development Kit
▪ Android 4.0.3 ICS + Glass specific APIs
▪ Use standard Android Development Tools
221. GDK
▪ GDK add-on features
▪ Timeline and cards
▪ Menu and UI
▪ Touch pad and gesture
▪ Media (sound, camera and voice input)
222. Glass Summary
▪ Use Mirror API if you need ...
▪ Use GDK if you need ...
▪ Or use both
223. ● An Introduction to Glassware
Development
- GDK -
Rob Lindeman
gogo@wpi.edu
Human Interaction in Virtual Environments (HIVE) Lab
Worcester Polytechnic Institute
Worcester, MA, USA
http://www.cs.wpi.edu/~gogo/hive/
* Images in the slides are from variety of sources,
including http://developer.android.com and http://developers.google.com/glass
224. ● Thanks to Gun Lee!
▪ Most of this material was developed by
Gun Lee at the HIT Lab NZ.
▪ He’s a rock star!
225. Rob Lindeman
▪ Director of HIVE Lab, Worcester
Polytechnic Institute
▪ PhD The George Washington Univ.
▪ Research on 3DUI, VR, Gaming,
HRI since 1993
▪ Been wearing Glass non-stop
(mostly, anyway) since Sept. 2013
▪ Sabbatical at HIT Lab NZ in 2011-12
▪ Program Co-Chair, ISMAR 2014
▪ Love geocaching, soccer, skiing
226. ● Glassware Development
▪ Mirror API
▪ Server programming, online/web application
▪ Static cards / timeline management
▪ GDK
▪ Android programming, Java (+ C/C++)
▪ Live cards & Immersions
https://developers.google.com/glass/
227. ● Mirror API
▪ REST API
▪ Java servlet, PHP, Go,
Python, Ruby, .NET
▪ Timeline based apps
▪ Static cards
- Text, HTML, media attachment (image & video)
- Standard and custom menu items
▪ Manage timeline
- Subscribe to timeline notifications
- Sharing with contacts
- Location based services
230. ● Development Environment Setup
▪ JDK (1.6 or above, using 1.8 for the tutorial)
▪ http://www.oracle.com/technetwork/java/javase/downloads/index.html
▪ ADT Bundle (Eclipse + Android SDK)
▪ http://developer.android.com/sdk/index.html
▪ With Android SDK Manager (select Window>Android
SDK Manager from Eclipse menu) install:
- Tools > Android Build-tools (latest version)
- Android 4.4.2 (API 19) SDK Platform, ARM System Image,
Google APIs, Glass Development Kit Preview
- Extras > Google USB Driver (only for Windows Platform)
231. ● Create an Android App Project
▪ In Eclipse
▪ File > New > (Other > Android>)
Android Application Project
▪ Fill in the Application name, Project name, and Java
package namespace to use
▪ Choose SDK API 19: Android 4.4.2 for all SDK settings
▪ Use default values for the rest
232. ● Virtual Device Definition for Glass
▪ Window > Android Virtual Device Manager >
Device Definitions > New Device
▪ 640x360px
▪ 3 in. (hdpi)
▪ Landscape
233. ● Live Cards vs. Immersions
▪ Live cards display frequently updated information to the
left of the Glass clock.
▪ Integrate rich content into the timeline
▪ Simple text/images to full-blown 3D graphics
▪ Immersions let you build a user experience outside of
the timeline.
▪ Build interactive experiences
▪ Extra control, fewer user input constraints
237. ● Develop with GDK
▪ Android 4.4.2 (API 19) SDK and GDK Preview
from the Android SDK Manager.
▪ Project settings:
▪ Minimum and Target SDK Versions: 19
▪ Compile with: GDK Preview
▪ Theme: None (allows the Glass theme to be applied.)
▪ GDK samples
▪ File > New Project > Android Sample Project
▪ On Glass, turn on USB debugging
▪ Settings > Device Info > Turn on debug
238. ● Hello World - Immersion
▪ App/Activity without theme
▪ Allows the Glass theme to be applied.
▪ Add voice trigger for launching
▪ Touch input and Menu
▪ Voice recognition for text input
239. ● Voice Trigger for Launching
▪ Add intent filter to your main Activity in
AndroidManifest.xml
▪ Add xml/voice_trigger.xml to res folder
▪ Can use additional follow up voice recognition prompts
if needed
<uses-permission
android:name="com.google.android.glass.permission.DEVELOPMENT" />
…
<intent-filter>
<action android:name="com.google.android.glass.action.VOICE_TRIGGER" />
</intent-filter>
<meta-data android:name="com.google.android.glass.VoiceTrigger"
android:resource="@xml/voice_trigger" />
<?xml version="1.0" encoding="utf-8"?>
<trigger keyword="hello world" />
https://developers.google.com/glass/develop/gdk/input/voice
240. ● Official Voice Triggers on MyGlass
▪ listen to
▪ take a note
▪ post an update
▪ show a compass
▪ start a run
▪ start a bike ride
▪ find a recipe
▪ record a recipe
▪ check me in
• start a stopwatch
• start a timer
• start a round of golf
• translate this
• learn a song
• tune an instrument
• play a game
• start a workout
https://developers.google.com/glass/develop/gdk/input/voice
241. ● Touch Input as Key Input
▪ Touch input translated as DPAD key input
▪ Tap => KEYCODE_DPAD_CENTER
▪ Swipe down => KEYCODE_BACK
▪ Camera button => KEYCODE_CAMERA
@Override
public boolean onKeyDown( int keycode, KeyEvent event ) {
if( keycode == KeyEvent.KEYCODE_DPAD_CENTER ) {
// user tapped touchpad, do something
return true;
}
…
return false;
}
https://developers.google.com/glass/develop/gdk/input/touch
242. ● Touch Input
▪ onGenericMotionEvent( MotionEvent event )
@Override
public boolean onGenericMotionEvent( MotionEvent event ) {
switch( event.getAction( ) ) {
case MotionEvent.ACTION_DOWN:
break;
case MotionEvent.ACTION_MOVE:
break;
case MotionEvent.ACTION_UP:
break;
}
return super.onGenericMotionEvent( event );
}
https://developers.google.com/glass/develop/gdk/input/touch
245. ● Menu
▪ Open options menu on tap
▪ openOptionsMenu( )
▪ Add 50x50 pixel icons in the menu resource XML
▪ android:icon="@drawable/icon"
- https://developers.google.com/glass/tools-downloads/menu_icons.zip
▪ Show/hide/update menu items if needed
▪ onPrepareOptionsMenu( )
https://developers.google.com/glass/develop/gdk/ui/immersion-menus
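A minimal sketch of an immersion menu in an Activity; R.menu.main_menu and R.id.menu_stop are hypothetical resource names, and the imports android.view.Menu, android.view.MenuItem and android.view.KeyEvent are assumed.
@Override
public boolean onCreateOptionsMenu( Menu menu ) {
  getMenuInflater( ).inflate( R.menu.main_menu, menu );  // hypothetical menu resource
  return true;
}

@Override
public boolean onOptionsItemSelected( MenuItem item ) {
  if( item.getItemId( ) == R.id.menu_stop ) {            // hypothetical menu item
    finish( );                                           // e.g. leave the immersion
    return true;
  }
  return super.onOptionsItemSelected( item );
}

// open the options menu when the user taps the touchpad
@Override
public boolean onKeyDown( int keycode, KeyEvent event ) {
  if( keycode == KeyEvent.KEYCODE_DPAD_CENTER ) {
    openOptionsMenu( );
    return true;
  }
  return super.onKeyDown( keycode, event );
}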
247. ● Voice Input
▪ Start activity for result with system action
▪ Customize prompt with intent extra
▪ Recognized strings returned in intent data of
onActivityResult( )
intent = new Intent( RecognizerIntent.ACTION_RECOGNIZE_SPEECH );
intent.putExtra( RecognizerIntent.EXTRA_PROMPT, "What is your name?" );
startActivityForResult( intent, 0 );
intentData.getStringArrayListExtra( RecognizerIntent.EXTRA_RESULTS );
https://developers.google.com/glass/develop/gdk/input/voice
http://developer.android.com/reference/android/speech/RecognizerIntent.html
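For example, the recognized text can be read back in onActivityResult( ); SPEECH_REQUEST is a hypothetical request code matching the one passed to startActivityForResult( ), and java.util.List plus RecognizerIntent imports are assumed.
@Override
protected void onActivityResult( int requestCode, int resultCode, Intent data ) {
  if( requestCode == SPEECH_REQUEST && resultCode == RESULT_OK ) {
    List<String> results =
      data.getStringArrayListExtra( RecognizerIntent.EXTRA_RESULTS );
    String spokenText = results.get( 0 );  // best recognition hypothesis
    // use spokenText ...
  }
  super.onActivityResult( requestCode, resultCode, data );
}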
249. ● Hello World - Immersion ++
▪ Play Sounds & Text-to-speech
▪ Take a picture with camera
▪ Card based info page
250. ● Playing Sounds & TTS
▪ Glass system sounds
▪ Text-to-speech
▪ Create/destroy TTS in onCreate/onDestroy( )
https://developers.google.com/glass/develop/gdk/reference/com/google/android/glass/media/Sounds
AudioManager am =
( AudioManager )getSystemService( Context.AUDIO_SERVICE );
am.playSoundEffect( Sounds.ERROR );
// DISALLOWED, DISMISSED, ERROR, SELECTED, SUCCESS, TAP
TextToSpeech tts = new TextToSpeech( context, ttsOnInitListener );
…
tts.speak( “Hello world!”, TextToSpeech.QUEUE_FLUSH, null );
tts.shutdown( );
http://developer.android.com/reference/android/speech/tts/TextToSpeech.html
251. ● Playing Custom Sounds
▪ Put sound files in res/raw
▪ Load sounds to SoundPool object to play
soundPool = new SoundPool( MAX_STREAM,
AudioManager.STREAM_MUSIC, 0 );
int soundOneID = soundPool.load( context, R.raw.sound1, 1 );
int soundTwoID = soundPool.load( context, R.raw.sound2, 1 );
…
soundPool.play( int soundID, float leftVolume, float rightVolume,
int priority, int loop, float rate )
http://developer.android.com/reference/android/media/SoundPool.html
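For example, a call matching that signature might play the first loaded sound once at full volume; the argument values here are illustrative.
soundPool.play( soundOneID, 1.0f, 1.0f, 1, 0, 1.0f );  // no looping, normal playback rate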
253. ● Camera Input
▪ Call the Glass built-in camera activity with startActivityForResult( ) and an action
Intent; the file path to the captured image/video is returned through Intent extra data.
▪ Low level access to camera with the Android Camera API.
▪ http://developer.android.com/reference/android/hardware/Camera.html
https://developers.google.com/glass/develop/gdk/media-camera/camera
254. ● Camera with Action Intent
private void takePicture( ) {
Intent intent = new Intent( MediaStore.ACTION_IMAGE_CAPTURE );
startActivityForResult( intent, TAKE_PICTURE );
}
@Override
protected void onActivityResult( int requestCode, int resultCode, Intent data ) {
if( requestCode == TAKE_PICTURE && resultCode == RESULT_OK ) {
String picturePath =
data.getStringExtra( CameraManager.EXTRA_PICTURE_FILE_PATH );
// smaller picture available with EXTRA_THUMBNAIL_FILE_PATH
processPictureWhenReady( picturePath ); // file might not be ready for a while
}
super.onActivityResult( requestCode, resultCode, data );
}
256. ● Scrolling Cards in Activity
▪ Set a CardScrollView as the content view
▪ Use a custom class extending the CardScrollAdapter
class to populate the CardScrollView
https://developers.google.com/glass/develop/gdk/ui/theme-widgets
https://developers.google.com/glass/develop/gdk/reference/com/google/android/glass/widget/package-summary
CardScrollView cardScrollView = new CardScrollView( this );
cardScrollView.setAdapter( new InfoCardScrollAdapter( ) );
cardScrollView.activate( );
setContentView( cardScrollView );
257. ● Scrolling Cards in Activity
▪ In your custom CardScrollAdapter class
▪ Create a list of cards
▪ Implement abstract methods in your custom
CardScrollAdapter class
- int getCount( ) => return the number of cards (items)
- Object getItem( int position ) => return the card at position
- View getView( int position, View convertView, ViewGroup
parentView ) => return the view of the card at position
- int getPosition( Object item ) => find and return the position
of the given item (card) in the list. (return -1 for error)
https://developers.google.com/glass/develop/gdk/ui/theme-widgets
https://developers.google.com/glass/develop/gdk/reference/com/google/android/glass/widget/package-summary
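A minimal sketch of such an adapter, assuming the GDK Card and CardScrollAdapter classes described above (com.google.android.glass.widget package), plus android.view.View, android.view.ViewGroup and java.util imports; InfoCardScrollAdapter, MyActivity and the card text are illustrative.
private class InfoCardScrollAdapter extends CardScrollAdapter {
  private final List<Card> cards = new ArrayList<Card>( );

  InfoCardScrollAdapter( ) {
    Card card = new Card( MyActivity.this );  // hypothetical enclosing Activity
    card.setText( "Hello card" );
    card.setFootnote( "Card 1 of 1" );
    cards.add( card );
  }

  @Override public int getCount( ) { return cards.size( ); }
  @Override public Object getItem( int position ) { return cards.get( position ); }
  @Override public View getView( int position, View convertView, ViewGroup parent ) {
    return cards.get( position ).getView( );  // render the card as a View
  }
  @Override public int getPosition( Object item ) { return cards.indexOf( item ); }
}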
279. Challenges for the Future (2001)
▪ Privacy
▪ Power use
▪ Networking
▪ Collaboration
▪ Heat dissipation
▪ Interface design
▪ Intellectual tools
▪ Augmented Reality systems
Starner, T. (2001). The challenges of wearable computing: Part 1. IEEE Micro,21(4), 44-52.
Starner, T. (2001). The challenges of wearable computing: Part 2. IEEE Micro,21(4), 54-67.
282. Capturing Behaviours
▪ 3 Gear Systems
▪ Kinect/Primesense Sensor
▪ Two hand tracking
▪ http://www.threegear.com
283. Gesture Interaction With Glass
▪ 3 Gear Systems
▪ Hand tracking
▪ Hand data sent to glass
▪ Wifi networking
▪ Hand joint position
▪ AR application rendering
▪ Vuforia tracking
284. Performance
▪ Full 3D hand model input
▪ 10 - 15 fps tracking, 1 cm fingertip resolution
285. User Study
▪ Gesture vs. Touch pad vs. Combined input
▪ Gesture 3x faster, no difference in accuracy
297. Resource Competition Framework
▪ Mobility tasks compete for cognitive resources
with other tasks
▪ the most important given higher priority
▪ RCF is a method for analyzing this, based on:
▪ task analysis
▪ modelling cognitive resources
▪ a resource approach to attention
Oulasvirta, A., Tamminen, S., Roto, V., & Kuorelahti, J. (2005, April). Interaction in 4-second bursts: the
fragmented nature of attentional resources in mobile HCI. In Proceedings of the SIGCHI conference on
Human factors in computing systems (pp. 919-928). ACM.
298. RCF Key Assumptions
Four Key Assumptions
1. Functional Modularity: cognitive system divided into
functionally separate systems with diff. representations
2. Parallel Module Operation: cognitive modules operate
in parallel, independent of each other
3. Limited Capacity: cognitive modules are limited in
capacity with respect to time or content
4. Serial Central Operation: central coordination of
modules (eg monitoring) is serial
299. Cognitive Interference
▪ Structural interference
▪ Two or more tasks compete for limited
resources of a peripheral system
- eg two cognitive processes needing vision
▪ Capacity interference
▪ Total available central processing
overwhelmed by multiple concurrent tasks
- eg trying to add and count at same time
301. Using RCF
1. Map cognitive faculty to task
2. Look for conflicts/overloads
3. Analyse for competition for attention
4. Look for opportunities for technology to
reduce conflicts/competition
304. Application of RCF
Busy street > Escalator > Café > Laboratory
But what if wayfinding, path planning, estimating time to target, and collision avoidance were made easier?
319. ● Conclusions
▪ Wearable computing represents a fourth generation of
computing devices
▪ Google Glass is the first consumer wearable
▪ Lightweight, usable, etc
▪ A range of wearables will appear in 2014
▪ Ecosystem of devices
▪ Significant research opportunities exist
▪ User interaction, displays, social impact
320. Contact Details
Mark Billinghurst
▪ email: mark.billinghurst@hitlabnz.org
▪ twitter: @marknb00
Rob Lindeman
▪ email: gogo@wpi.edu
Feedback + followup form
▪ goo.gl/6SdgzA