Siggraph 2014: The Glass Class - Designing Wearable Interfaces
Course on Designing Wearable interfaces taught by Mark Billinghurst at Siggraph 2014. Presented on August 10th, 2014 from 10:45am - 12:15pm. The course focuses mainly on design guidelines and tools for rapid prototyping for Google Glass.
1.
The Glass Class: Designing Wearable Interfaces
Mark Billinghurst
The HIT Lab NZ, University of Canterbury
The 41st International
Conference and Exhibition
on Computer Graphics and
Interactive Techniques
3.
Mark Billinghurst
▪ Director of The HIT Lab NZ, University of
Canterbury
▪ PhD Univ. Washington
▪ Research on AR, mobile HCI,
Collaborative Interfaces, Wearables
▪ Joined Glass team at Google [x] in 2013
5.
Course Goals
In this course you will learn
▪ Introduction to head mounted wearable computers
▪ Understanding of current wearable technology
▪ Key design principles/interface metaphors
▪ Rapid prototyping tools
▪ Areas for future research
6.
What You Won’t Learn
▪ Which companies/universities are in this space
▪ See the Siggraph exhibit floor
▪ Designing for non-HMD based interfaces
▪ Watches, fitness bands, etc
▪ How to develop wearable hardware
▪ optics, sensor assembly, etc
▪ Evaluation methods
▪ Experimental design, statistics, etc
7.
Schedule
• 10:45 am Introduction
• 10:55 am Technology Overview
• 11:05 am Design Guidelines
• 11:25 am Prototyping Tools
• 11:55 am Example Applications
• 12:05 pm Research Directions/Resources
8.
A Brief History of Computing
Trend
▪ Smaller, cheaper, faster, more intimate
▪ Moving from fixed to handheld and onto body
(Images: 1950s, 1980s, 1990s)
9.
Wearable Computing
▪ Computer on the body that is:
▪ Always on
▪ Always accessible
▪ Always connected
▪ Other attributes
▪ Augmenting user actions
▪ Aware of user and surroundings
11.
The Ideal Wearable
▪ Persists and Provides Constant Access: Designed for
everyday and continuous use over a lifetime.
▪ Senses and Models Context: Observes and models the user's environment, mental state, and its own state.
▪ Augments and Mediates: Information support for the user in
both the physical and virtual realities.
▪ Interacts Seamlessly: Adapts its input and output modalities
to those most appropriate at the time.
Starner, T. E. (1999). Wearable computing and contextual awareness
(Doctoral dissertation, Massachusetts Institute of Technology).
12.
History of Wearables
▪ 1960-90: Early Exploration
▪ Gamblers and custom-built devices
▪ 1990 - 2000: Academic, Military Research
▪ MIT, CMU, Georgia Tech, EPFL, etc
▪ 1997: ISWC conference starts
▪ 1995 – 2005+: First Commercial Uses
▪ Niche industry applications, Military
▪ 2010 - : Second Wave of Wearables
13.
Origins - The Gamblers
• Thorp and Shannon (1961)
– Wearable timing device for roulette prediction
• Keith Taft (1972)
– Wearable computer for blackjack card counting
(Images: belt computer, shoe input, glasses display)
19.
Mobile AR: Touring Machine (1997)
▪ Columbia University
▪ Feiner, MacIntyre, Höllerer, Webster
▪ Combined
▪ See through head mounted display
▪ GPS tracking, Orientation sensor
▪ Backpack PC (custom)
▪ Tablet input
Feiner, S., MacIntyre, B., Höllerer, T., & Webster, A. (1997). A touring machine: Prototyping 3D mobile
augmented reality systems for exploring the urban environment. Personal Technologies, 1(4), 208-217.
20.
Touring Machine View
▪ Virtual tags overlaid on the real world
▪ “Information in place”
21.
Early Commercial Systems
▪ Xybernaut (1996 - 2007)
▪ Belt worn, HMD, 200 MHz
▪ ViA (1996 – 2001)
▪ Belt worn, Audio Interface
▪ 700 MHz Crusoe
▪ Symbol (1998 – 2006)
▪ Wrist worn computer
▪ Finger scanner
25.
Summary
▪ Wearables are a new class of computing
▪ Intimate, persistent, aware, accessible
▪ Evolution over a 50-year history
▪ Backpack to head-worn
▪ Custom-developed to consumer-ready devices
▪ Enables new applications
▪ Collaboration, memory, AR, industry, etc
▪ Many head-worn wearables are coming
37.
User Experience
• Truly Wearable Computing
– Less than 46 ounces
• Hands-free Information Access
– Voice interaction, Ego-vision camera
• Intuitive User Interface
– Touch, Gesture, Speech, Head Motion
• Access to all Google Services
– Map, Search, Location, Messaging, Email, etc
38.
Types of Head Mounted Displays
Occluded
See-thru
Multiplexed
39.
Multiplexed Displays
▪ Above or below line of sight
▪ Strengths
▪ User has unobstructed view of real world
▪ Simple optics/cheap
▪ Weaknesses
▪ Direct information overlay difficult
▪ Display/camera offset from eyeline
▪ Wide FOV difficult
40.
Vuzix M-100
▪ Monocular multiplexed display ($1000)
▪ 852 x 480 LCD display, 15 deg. FOV
▪ 5 MP camera, HD video
▪ GPS, gyro, accelerometer
41.
Optical see-through HMD
(Diagram: virtual images from monitors are combined with the real world through optical combiners)
43.
Strengths of optical see-through
▪ Simpler (cheaper)
▪ Direct view of real world
▪ Full resolution, no time delay (for real world)
▪ Safety
▪ Lower distortion
▪ No eye displacement
▪ see directly through display
44.
Video see-through HMD
(Diagram: video cameras capture the real world; graphics are combined with the camera video and shown on monitors)
45.
Vuzix Wrap 1200DXAR
▪ Stereo video see-through display ($1500)
▪ Twin 852 x 480 LCD displays, 35 deg. FOV
▪ Stereo VGA cameras
▪ 3 DOF head tracking
46.
Strengths of Video See-Through
▪ True occlusion
▪ Block image of real world
▪ Digitized image of real world
▪ Flexibility in composition, match time delays
▪ More registration, calibration strategies
▪ Wide FOV is easier to support
▪ wide FOV camera
48.
Twiddler Input
▪ Chording or multi-tap input
▪ Possible to achieve 40 - 60 wpm after 30+ hours
▪ cf 20 wpm on T9, or 60+ wpm for QWERTY
Lyons, K., Starner, T., Plaisted, D., Fusia, J., Lyons, A., Drew, A., & Looney, E. W. (2004, April).
Twiddler typing: One-handed chording text entry for mobile phones. In Proceedings of the SIGCHI
conference on Human factors in computing systems (pp. 671-678). ACM.
49.
Virtual Keyboards
▪ In air text input
▪ Virtual QWERTY keyboard up to 20 wpm
▪ Word Gesture up to 28 wpm
▪ Handwriting around 20-30 wpm
Markussen, A., et al. Vulture: A Mid-Air Word-Gesture Keyboard. CHI 2014.
50.
Unobtrusive Input Devices
▪ GestureWrist
▪ Capacitive sensing, changes with hand shape
Rekimoto, J. (2001). Gesturewrist and gesturepad: Unobtrusive wearable interaction devices. In
Wearable Computers, 2001. Proceedings. Fifth International Symposium on (pp. 21-27). IEEE.
52.
Skinput
Sensing on-body input from bio-acoustic and muscle activity signals
Tan, D., Morris, D., & Saponas, T. S. (2010). Interfaces on the go.
XRDS: Crossroads, The ACM Magazine for Students, 16(4), 30-34.
53.
Issues to Consider
▪ Fatigue
▪ “Gorilla arm” from free-hand input
▪ Comfort
▪ People prefer to make small gestures near the waist
▪ Interaction on the go
▪ Can input be done while moving?
56.
Design For the Device
• Simple, relevant information
• Complement existing devices
57.
The Now machine
▪ Timeline: Forever → Last year → Last week → Now
▪ Focus on location, contextual and timely information, and communication
58.
Don’t design an app
Glass OS uses a time-based model, not an app model.
59.
The world is the experience
Get the interface and interactions out of the way.
60.
It's like a rear view mirror
Don't overload the user. Stick to the absolutely essential, avoid long interactions. Be explicit.
61.
Micro Interactions
The position of the display and the limited input ability make longer interactions less comfortable. Using it shouldn't take longer than taking out your phone.
62.
Micro-Interactions
On mobiles people split attention
between display and real world
63.
Time Looking at Screen
Oulasvirta, A. (2005). The fragmentation of attention in mobile
interaction, and what to do with it. interactions, 12(6), 16-18.
64.
Design for MicroInteractions
▪ Design interaction less than a few seconds
– Tiny bursts of interaction
– One task per interaction
– One input per interaction
▪ Benefits
– Use limited input
– Minimize interruptions
– Reduce attention fragmentation
65.
Make it Glanceable
• Seek to rigorously reduce information density.
• Design for recognition, not reading.
(Figure: a bad example vs. a good, glanceable example)
66.
Reduce the Number of Info Chunks
• You are designing for recognition, not reading.
• Reducing the total # of information chunks will
greatly increase the glanceability of your design.
(Figure: a design with 3 information chunks vs. one with 5-6 chunks)
67.
Design single interactions < 4 s
Eye movements for the 5-6 chunk design:
▪ Chunk 1: 1 movement, 230 ms
▪ Chunk 2: 1 movement, 230 ms
▪ Chunk 3: 1 movement, 230 ms
▪ Chunk 4: 3 movements, 690 ms
▪ Chunk 5: 2 movements, 460 ms
▪ Total: ~1,840 ms
Eye movements for the 3 chunk design:
▪ Chunk 1: 1-2 movements, 460 ms
▪ Chunk 2: 1 movement, 230 ms
▪ Chunk 3: 1 movement, 230 ms
▪ Total: ~920 ms
69.
Don’t Get in the Way
• Enhance, not replace, real world interaction
70.
Design for Interruptions
▪ Gradually increase engagement and attention load
▪ Respond to user engagement
Example: receiving an SMS on Glass
▪ Glass plays a “Bing”; the user looks up
▪ The user taps; Glass shows the message
▪ The user swipes or says “Reply”; Glass starts the reply
77.
Transparent displays are tricky
Colors are funny and inconsistent. You can only add light to a scene, not cover anything up. Motion can be disorienting. Clarity, contrast, brightness, visual field and attention are important.
79.
Establish hierarchy with color
White is your <h1> and grey is your <h2> or <h3>.
Footer text - establishing time, attribution, or
distance - is the only place with smaller font size.
86.
CityViewAR
• Using AR to visualize Christchurch city buildings
– 3D models of buildings, 2D images, text, panoramas
– ARView, Map view, List view
– Available on Android/iOS market
87.
CityViewAR on Glass
• AR overlay of virtual buildings in Christchurch
92.
Important Note
▪ Most current wearables run Android OS
▪ eg Glass, Vuzix, Atheer, Epson, etc
▪ So many tools for prototyping on Android
mobile devices will work for wearables
▪ If you want to learn to code, learn
▪ Java, Android, Javascript/PHP
93.
Typical Development Steps
▪ Sketching
▪ Storyboards
▪ UI Mockups
▪ Interaction Flows
▪ Video Prototypes
▪ Interactive Prototypes
▪ Final Native Application
(Increasing fidelity & interactivity)
102.
Glassware Flow Designer
• Features
– Design using common patterns and layouts
– Specify interactions and card flow
– Share with other designers
• Available from:
– https://developers.google.com/glass/tools-downloads/glassware-flow-designer
105.
Video Sketching
▪ Series of still photos in a movie format
▪ Demonstrates the experience of the product
▪ Discover where the concept needs fleshing out
▪ Communicate the experience and interface
▪ Use whatever tools you like, from Flash to iMovie
106.
Example: Glass Vine UI
See https://vine.co/v/bgIaLHIpFTB
107.
Limitations
▪ Positives
▪ Good for documenting screens
▪ Can show application flow
▪ Negatives
▪ No interactivity/transitions
▪ Can’t be used for testing
▪ Can’t deploy on wearable
▪ Can be time consuming to create
109.
UXpin - www.uxpin.com
▪ Web based wireframing tool
▪ Mobile/Desktop applications
▪ Glass templates, run in browser
110.
Proto.io - http://www.proto.io/
▪ Web based mobile prototyping tool
▪ Features
▪ Prototype for multiple devices
▪ Gesture input, touch events, animations
▪ Share with collaborators
▪ Test on device
115.
Justinmind
▪ Native wireframing tool
▪ Build mobile apps without programming
▪ drag and drop, interface templates
▪ web based simulation
▪ test on mobile devices
▪ collaborative project sharing
▪ Templates for Glass, custom templates
118.
Wireframe Limitations
▪ Can’t deploy on Glass
▪ No access to sensor data
▪ Camera, orientation sensor
▪ No multimedia playback
▪ Audio, video
▪ Simple transitions
▪ No conditional logic
119.
Processing
▪ Programming tool for Artists/Designers
▪ http://processing.org
▪ Easy to code, Free, Open source, Java based
▪ 2D, 3D, audio/video support
▪ Processing For Android
▪ http://wiki.processing.org/w/Android
▪ Strong Android support, builds .apk file
120.
Basic Processing Sketch
/* Notes comment */

// set up global variables
float moveX = 50;

// initialize the sketch
void setup() {
}

// draw every frame
void draw() {
}
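To make the structure concrete, here is a minimal filled-in sketch (values are illustrative, not from the course): the global moveX variable is updated every frame so a circle drifts across the screen.

float moveX = 50;                       // global state shared between frames

void setup() {
  size(640, 360);                       // set the display size once
}

void draw() {
  background(0);                        // clear the frame
  ellipse(moveX, height/2, 40, 40);     // circle at the current x position
  moveX = (moveX + 2) % width;          // move right and wrap around
}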
121.
Importing Libraries
▪ Can add functionality by Importing Libraries
▪ java archives - .jar files
▪ Include import code
import processing.opengl.*;
▪ Popular Libraries
▪ Minim - audio library, OCD - 3D camera views
▪ bluetoothDesktop - bluetooth networking
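As a small illustrative example (assuming the Minim library is installed, and using a hypothetical groove.mp3 placed in the sketch's data folder), importing and using a library looks like this:

import ddf.minim.*;                      // Minim audio library

Minim minim;
AudioPlayer player;

void setup() {
  minim = new Minim(this);
  player = minim.loadFile("groove.mp3"); // hypothetical audio file in /data
  player.play();                         // start playback
}

void draw() {
}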
122.
Processing and Glass
▪ One of the easiest ways to build rich
interactive wearable applications
▪ focus on interactivity, not coding
▪ Collects all sensor input
▪ camera, accelerometer, touch
▪ Can build native Android .apk files
▪ Side load onto Glass
123.
Hello World Image
PImage img;   // Create an image variable

void setup() {
  size(640, 360);
  // load the "ok glass" home screen image into the program
  img = loadImage("okGlass.jpg");
}

void draw() {
  // display the image at its actual size at point (0,0)
  image(img, 0, 0);
}
125.
Touch Pad Input
▪ Tap recognized as DPAD input
void keyPressed() {
  if (key == CODED) {
    if (keyCode == DPAD) {   // DPAD stands in for the Android D-pad key code
      // Do something ...
    }
  }
}
▪ Java code to capture rich motion events
▪ import android.view.MotionEvent;
126.
Motion Event
// Glass touch events - reads from the touch pad
// (touchEvent, xpos, ypos, myScreenWidth, touchPadScaleX/Y are sketch-level fields)
public boolean dispatchGenericMotionEvent(MotionEvent event) {
  float x = event.getX();                 // get x/y coordinates
  float y = event.getY();
  int action = event.getActionMasked();   // get the action code
  switch (action) {
    case MotionEvent.ACTION_MOVE:
      touchEvent = "MOVE";
      xpos = myScreenWidth - x * touchPadScaleX;
      ypos = y * touchPadScaleY;
      break;
    // ... other cases (ACTION_DOWN, ACTION_UP, etc.) handled similarly
  }
  return true;                            // event handled
}
128.
Sensors
▪ Ketai Library for Processing
▪ https://code.google.com/p/ketai/
▪ Support all phone sensors
▪ GPS, Compass, Light, Camera, etc
▪ Include Ketai Library
▪ import ketai.sensors.*;
▪ KetaiSensor sensor;
129.
Using Sensors
▪ Initialize in the setup() function
▪ sensor = new KetaiSensor(this);
▪ sensor.start();
▪ sensor.list();
▪ Event based sensor reading
void onAccelerometerEvent(…){
accelerometer.set(x, y, z);
}
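Put together, a minimal sketch (assuming the Ketai library is installed) that shows live accelerometer values on screen might look like this:

import ketai.sensors.*;

KetaiSensor sensor;
float ax, ay, az;                        // latest accelerometer reading

void setup() {
  size(displayWidth, displayHeight);
  textSize(36);
  sensor = new KetaiSensor(this);
  sensor.start();                        // begin listening to the sensors
}

void draw() {
  background(0);
  text("x: " + ax + "  y: " + ay + "  z: " + az, 20, 60);
}

// called by Ketai whenever a new accelerometer sample arrives
void onAccelerometerEvent(float x, float y, float z) {
  ax = x;  ay = y;  az = z;
}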
131.
Using the Camera
▪ Import camera library
▪ import ketai.camera.*;
▪ KetaiCamera cam;
▪ Initialize in the setup() function
cam = new KetaiCamera(this,640,480,15);
▪ Draw camera image
void draw() {
  image(cam, width/2, height/2);   // draw the camera image
}
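A complete version of this camera sketch (a minimal sketch, assuming the Ketai camera library works on the device) also needs to start the camera and read each preview frame:

import ketai.camera.*;

KetaiCamera cam;

void setup() {
  size(640, 480);
  cam = new KetaiCamera(this, 640, 480, 15);  // width, height, frame rate
  cam.start();                                // begin capturing frames
}

void draw() {
  image(cam, 0, 0);                           // draw the latest camera frame
}

// called by Ketai when a new preview frame is ready
void onCameraPreviewEvent() {
  cam.read();                                 // copy the frame into cam
}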
133.
Native Coding
▪ For best performance need native coding
▪ Low level algorithms etc
▪ Most current wearables based on Android OS
▪ Need Java/Android skills
▪ Many devices have custom API/SDK
▪ Vuzix M-100: Vuzix SDK
▪ Glass: Mirror API, Glass Developer Kit (GDK)
134.
Glassware Development
▪ Mirror API
▪ Server programming, online/web application
▪ Static cards / timeline management
▪ GDK
▪ Android programming, Java (+ C/C++)
▪ Live cards
▪ See: https://developers.google.com/glass/
135.
Mirror API
▪ REST API
▪ Java servlet, PHP, Go, Python, Ruby, .NET
▪ Timeline based apps
▪ Static cards
- Text, HTML, media attachment (image & video)
▪ Manage timeline
- Subscribe to timeline notifications, contacts
- Location based services
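To give a flavor of the Mirror API, here is a hedged Java sketch (plain HTTP, not the official client library) that inserts a static text card by POSTing JSON to the timeline endpoint. It assumes you already hold a valid OAuth 2.0 access token (accessToken) with the Glass timeline scope; the class name is hypothetical.

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class InsertTimelineCard {
  // accessToken is assumed to be obtained elsewhere via OAuth 2.0
  public static void insert(String accessToken) throws Exception {
    URL url = new URL("https://www.googleapis.com/mirror/v1/timeline");
    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    conn.setRequestMethod("POST");
    conn.setRequestProperty("Authorization", "Bearer " + accessToken);
    conn.setRequestProperty("Content-Type", "application/json");
    conn.setDoOutput(true);

    String card = "{\"text\": \"Hello from the Mirror API\"}";  // static card body
    try (OutputStream out = conn.getOutputStream()) {
      out.write(card.getBytes("UTF-8"));
    }
    System.out.println("Response code: " + conn.getResponseCode());
  }
}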
136.
GDK
▪ Glass Development Kit
▪ Android 4.0.3 ICS + Glass specific APIs
▪ Use standard Android Development Tools
137.
GDK
▪ GDK add-on features
▪ Timeline and cards
▪ Menu and UI
▪ Touch pad and gesture
▪ Media (sound, camera and voice input)
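As an illustrative GDK example (assuming the GDK add-on, XE16 or later, is on the build path; the activity name is hypothetical), a minimal activity that renders a single card with CardBuilder might look like this:

import android.app.Activity;
import android.os.Bundle;
import com.google.android.glass.widget.CardBuilder;

public class HelloGlassActivity extends Activity {
  @Override
  protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // CardBuilder lays out content in the standard Glass card style
    setContentView(new CardBuilder(this, CardBuilder.Layout.TEXT)
        .setText("Hello from the GDK")
        .setFootnote("The Glass Class")
        .getView());
  }
}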
138.
Glass Summary
▪ Use Mirror API if you need ...
▪ Use GDK if you need ...
▪ Or use both
142.
Raspberry Pi Glasses
▪ Modify video glasses, connect to a Raspberry Pi
▪ $200 - $300 in parts, simple assembly
▪ https://learn.adafruit.com/diy-wearable-pi-near-eye-kopin-video-glasses
143.
Physical Input Devices
▪ Can we develop unobtrusive input devices ?
▪ Reduce need for speech, touch pad input
▪ Socially more acceptable
▪ Examples
▪ Ring, pendant, bracelet, gloves, etc
144.
Prototyping Platform
(Images: Arduino kit, Bluetooth shield, Google Glass)
145.
Example: Glove Input
▪ Buttons on fingertips
▪ Map touches to commands
149.
WearScript
▪ JavaScript development for Glass
▪ http://www.wearscript.com/en/
▪ Script directory
▪ http://weariverse.com/
150.
WearScript Features
• Community of Developers
• Easy development of Glass Applications
– GDK card format
– Support for all sensor input
• Support for advanced features
– Augmented Reality
– Eye tracking
– Arduino input
151.
WearScript Playground
• Test code and run on Glass
– https://api.wearscript.com/
152.
Summary
▪ Prototyping for wearables is similar to mobiles
▪ Tools for UI design, storyboarding, wireframing
▪ Android tools to create interactive prototypes
▪ App Inventor, Processing, etc
▪ Arduino can be used for hardware prototypes
▪ Once prototyped, native apps can be built
154.
Challenges for the Future (2001)
▪ Privacy
▪ Power use
▪ Networking
▪ Collaboration
▪ Heat dissipation
▪ Interface design
▪ Intellectual tools
▪ Augmented Reality systems
Starner, T. (2001). The challenges of wearable computing: Part 1. IEEE Micro, 21(4), 44-52.
Starner, T. (2001). The challenges of wearable computing: Part 2. IEEE Micro, 21(4), 54-67.
157.
Gesture Interaction With Glass
▪ 3 Gear Systems
▪ Hand tracking
▪ Hand data sent to glass
▪ Wifi networking
▪ Hand joint position
▪ AR application rendering
▪ Vuforia tracking
158.
Capturing Behaviours
▪ 3 Gear Systems
▪ Kinect/Primesense Sensor
▪ Two hand tracking
▪ http://www.threegear.com
159.
Performance
▪ Full 3d hand model input
▪ 10 - 15 fps tracking, 1 cm fingertip resolution
170.
Modeling Cognitive Processes
• Model cognitive processes
– Based on cognitive psychology
• Use model to:
– Identify opportunity for wearable
– Predict user’s cognitive load
171.
Typical Cognitive Model
1. Functional Modularity: cognitive system divided
into functionally separate systems
2. Parallel Module Operation: cognitive modules
operate in parallel, independent of each other
3. Limited Capacity: cognitive modules are limited in
capacity with respect to time or content
4. Serial Central Operation: central coordination of
modules (eg monitoring) is serial
172.
Cognitive Interference
▪ Structural interference
▪ Two or more tasks compete for limited
resources of a peripheral system
- eg two cognitive processes needing vision
▪ Capacity interference
▪ Total available central processing
overwhelmed by multiple concurrent tasks
- eg trying to add and count at same time
173.
Example: Going to work ..
Which is the most cognitively demanding?
175.
Application of Cognitive Model
Busy street > Escalator > Café > Laboratory.
But what if you made wayfinding, path planning, estimating time to target, and collision avoidance easier?
182.
Online Wearables Exhibit
Online at http://wcc.gatech.edu/exhibition
183.
Glass Developer Resources
▪ Main Developer Website
▪ https://developers.google.com/glass/
▪ Glass Apps Developer Site
▪ http://glass-apps.org/glass-developer
▪ Google Design Guidelines Site
▪ https://developers.google.com/glass/design/index
184.
Other Resources
▪ AR for Glass Website
▪ http://www.arforglass.org/
▪ Vandrico Database of wearable devices
▪ http://vandrico.com/database
185.
Glass UI Design Guidelines
• More guidelines
– https://developers.google.com/glass/design/index
186.
Books
▪ Programming Google Glass
▪ Eric Redmond
▪ Rapid Android
Development: Build Rich,
Sensor-Based Applications
with Processing
▪ Daniel Sauter
187.
• Beginning Google
Glass Development
by Jeff Tang
188.
• Microinteractions: Designing
with Details
– Dan Saffer
– http://microinteractions.com/
189.
Conclusions
• Wearable computing represents a fourth
generation of computing devices
• Google Glass is the first consumer wearable
– Lightweight, usable, etc
• A range of wearables will appear in 2014
– Ecosystem of devices
• Significant research opportunities exist
– User interaction, displays, social impact
190.
Contact Details
Mark Billinghurst
▪ email: mark.billinghurst@hitlabnz.org
▪ twitter: @marknb00
Feedback + followup form
▪ goo.gl/6SdgzA