Lecture 3: Conducting a Gesture Elicitation Study:
How to Get the Best Gestures From People?
Jean Vanderdonckt, UCLouvain
Vrije Universiteit Brussel, 20 April 2021, on-line
How to get the best gestures from people?
Case study: sketching a graphical UI
(Interact'07)
https://www.youtube.com/watch?v=SBNB1O-8pGw&t=5s
How to determine the best gestures for widgets?
1: none
2: low
3: medium
4: high
Use the Borda count method (a rank-based voting method)
• Each participant ranks the representations; each representation receives a score equal to the number of representations ranked below it, in decreasing order: n-1 for the first choice, n-2 for the second, n-3, … (see the sketch after the table below)
n=5            | Rep. A | Rep. B | Rep. C | Rep. D | Rep. E
Participant #1 | 0      | n-3=2  | n-4=1  | n-1=4  | n-2=3
Participant #2 | n-1=4  | 0      | n-3=2  | n-2=3  | n-4=1
Score          | 4      | 2      | 3      | 7      | 4
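A minimal Python sketch of this scoring, assuming rankings are given as lists of representation indices ordered from most to least preferred:

# Borda count as described above: each participant ranks the n
# representations; a representation earns n-1 points for a first
# place, n-2 for a second, ..., 0 for the last; points are summed.
def borda_scores(rankings, n):
    scores = [0] * n
    for ranking in rankings:  # ranking = representation indices, best first
        for place, rep in enumerate(ranking):
            scores[rep] += n - 1 - place
    return scores

# Rankings reproducing the table above (representations indexed 0..4):
# participant #1 prefers D, E, B, C, A; participant #2 prefers A, D, C, E, B.
print(borda_scores([[3, 4, 1, 2, 0], [0, 3, 2, 4, 1]], 5))  # [4, 2, 3, 7, 4]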
Further classification: complexity of the representation
ST=sketching time
DEL=delete operations
Source: Suzanne Kieffer, Adrien Coyette, Jean Vanderdonckt, User interface design by sketching: a complexity analysis of widget representations.
EICS 2010: 57-66
How to get the best gestures from people?
Considered so far:
• Human factors: preference, sketching time, delete operations
• System factors: recognition rate
Other methods for eliciting preference:
• By elicitation
• By Wizard of Oz
Gesture Elicitation Study (GES)
• To elicit
  • To call forth or draw out something, such as information or a response (Merriam-Webster)
  • To evoke or draw out a reaction, answer, or fact from someone (Dictionary.com)
  • To draw forth something that is latent or potential into existence (Dictionary.com)
• Elicitation
  • Acquisition of information from a person or a group of persons in a manner that does not disclose the intent of the interview, discussion, or conversation
• Elicitation technique
  • A technique used to discreetly extract information, not readily available, for a specific purpose
Source: Wobbrock et al., 2009
Gesture Elicitation Study (GES)
• Definition
  • A study asking end users to elicit their own gestures for a set of predefined functions, presented through referents, and to reach a consensus
• Aims and goals
  • To let end users elicit the gestures they want for a function
  • To compute agreement between end users (= participants, subjects)
  • To define a vocabulary of potential gestures
  • To identify a consensus set, which is a sub-vocabulary of agreed gestures. Output = consensus set
• For one context of use at a time: C = (U, P, E)
  • User: any person; Functions: IoT functions
  • Platform/device/sensor: an armband
  • Environment: smart home, usability lab, controlled room
Gesture Elicitation Study (GES)
• Advantages
  • Free from any form of stress or burden of working
  • Information comes deliberately once the favorable context is created
  • No adverse effect
  • Easy and cheap to run
• Disadvantages
  • Legacy bias
    • Previous experience with a similar system or another device
  • No experience
  • Lack of creativity
GES Step 1: Prepare setup C=(U,P,E)
• Create a representative sampling of the user population
GES Step 1: Prepare setup C=(U,P,E)
• Find a representative sampling of the user population
• Example of stratified sampling
Age group | %    | Sampling (n=32) | Female (51%) | Male (49%)
18-24     | 8.9  | 3               | 2            | 1
25-34     | 15.4 | 5               | 3            | 2
35-44     | 16.5 | 5               | 3            | 2
45-54     | 18.3 | 6               | 3            | 3
55-64     | 18.7 | 6               | 3            | 3
65+       | 22.2 | 7               | 4            | 3
Total     | 100  | 32              | 18           | 14
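A minimal Python sketch of this allocation, assuming proportional rounding; the population shares and the 51/49 gender split come from the table above:

# Allocate n participants across age strata proportionally to the
# population shares, then split each stratum by the gender shares.
population = {"18-24": 8.9, "25-34": 15.4, "35-44": 16.5,
              "45-54": 18.3, "55-64": 18.7, "65+": 22.2}  # % of population
n, female_share = 32, 0.51
for group, pct in population.items():
    quota = round(n * pct / 100)           # participants in this stratum
    females = round(quota * female_share)  # gender split within the stratum
    print(group, quota, females, quota - females)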
GES Step 1: Prepare setup C=(U,P,E)
• Find a representative sampling of the user population
  • Find n=30 participants
  • Ideally 50% female, 50% male
  • Different ages (not all students!)
  • Different backgrounds (not all in your discipline!)
  • Different levels of experience (e.g., with touch or not)
  • Each participant takes part in the experiment individually
• Define an experiment schedule
  • Location (should be constant): place to choose
  • Time: 30 min for each participant + resting time
  • Schedule: organize a Doodle
• Test the whole protocol with a guinea pig
  • Different from the participants!
GES Step 1: Prepare setup C=(U,P,E)
• Identify referents for the following tasks
  • A referent is the expected result of an action (e.g., key pressing, touch, gesture) for a function
  • Distribute the definition of the 14 referents across groups
  • Example: IoT functions:
    (1) Turn the TV on/off             (2) Start player
    (3) Turn up the volume             (4) Turn down the volume
    (5) Go to the next item in a list  (6) Go to the previous item
    (7) Turn AC on/off                 (8) Turn lights on/off
    (9) Brighten lights                (10) Dim lights
    (11) Turn heat on/off              (12) Turn alarm on/off
    (13) Answer phone call             (14) End phone call
  • Randomize the order of the referents before the study for each participant (create lists with www.random.org); a local alternative is sketched below
[Figure: arrows mark referent pairs that are symmetric to each other, e.g., turn the volume up/down]
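A minimal Python sketch of a per-participant randomization, as a local alternative to www.random.org (the participant count and referent IDs follow the setup above):

# One independently shuffled referent order per participant.
import random
referents = list(range(1, 15))  # the 14 referent IDs
orders = {p: random.sample(referents, k=len(referents)) for p in range(1, 31)}
print(orders[1])                # the randomized order for participant 1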
GES Step 1: Prepare setup C=(U,P,E)
• What is a referent?
Referent:
- State before action
- Action description
- State after action
GES Step 1: Prepare setup C=(U,P,E)
• Referent: textual, graphical, animation, video (visual priming)
[Figure: referent "20. Permute files remotely": before/after states of documents on My Disk (My Network) and Partner Disk (Our Network)]
GES Step 1: Prepare setup C=(U,P,E)
• Referent: textual, graphical, animation, video (visual priming)
https://www.youtube.com/watch?v=iVzE4EzfGoc&t=12s
GES Step 1: Prepare setup C=(U,P,E)
• Prepare the hardware and software
  • Keep them constant throughout the experiment
• Determine a software for recording gestures in log files
  • Example: Python software for Leap Motion
  • Example: Myo Gesture Control
GES Step 1: Prepare setup C=(U,P,E)
• Identify the physical location: office, room, usability lab, …
  • Keep it constant throughout the experiment
  • Sometimes determined by the device
  • Example: usability lab
GES Step 2: Pre-test (before experiment)
• Each participant
  • Comes to the experiment location according to the schedule
  • Signs a consent form
  • Fills in a demographic questionnaire
    • Notoriety of, and experience with, the target device, if any
    • System experience in general (e.g., for a tablet, frequency of use)
  • Performs a motor-skill test: touching the thumb to each of the other fingers of the same hand twice in a row without flaw
  • Is presented with an introduction to the device (e.g., a video)
• Experimenter
  • Assigns an ID to each participant on the consent form
  • Checks that all forms are completely filled in, before and after
GES Step 3: Test (conduct experiment)
• Each participant, for each gesture in the random set:
  • Is presented with the list of referents (in randomized order)
  • Confirms that she understood the tasks
  • Is asked to think of a suitable gesture
  • Says "I am ready"
  • Gives a rating (goodness-of-fit) between 1 and 10
    • To describe how well the proposed gesture fits the referent
    • 1 = very poor fit → 10 = excellent fit
  • Optionally gives a value between 1 and 7 for
    • Complexity: 1 = most complex → 7 = simplest gesture
    • Memorability: 1 = hardest to remember → 7 = easiest to remember
    • Fun: 1 = least funny → 7 = funniest
GES Step 3: Test (conduct experiment)
• The experimenter, on the gesture sheet
  • Records the thinking time = the time between the start of the task (i.e., when the referent was presented to the participant) and the moment when the participant knows what gesture to propose
  • Fills in the gesture sheet for each gesture collected:
    • Writes the referent ID
    • Records the thinking time
    • Writes all scores: goodness-of-fit, complexity, memorability, fun
  • Videotapes the whole session for further analysis
    • Keeps video files for the final report
  • Takes some pictures of the room for some interesting gestures
  • Records the gestures into files with the actual device
    • Example: X, Y, Z, timestamp
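A minimal Python sketch of such a log file; the file name and sample values are hypothetical, only the X, Y, Z, timestamp format comes from the slide:

# Append one tracked point per line in the X, Y, Z, timestamp format.
import csv, time
with open("P01_referent03.csv", "w", newline="") as log:
    writer = csv.writer(log)
    writer.writerow(["x", "y", "z", "timestamp"])
    writer.writerow([0.12, -0.40, 0.88, time.time()])  # one sample point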
GES Step 4: Post-test (after experiment)
• The participant
  • Fills in the IBM PSSUQ questionnaire
  • Is asked a few open questions
    • What did you like the most in the experiment?
    • What did you hate the most in the experiment?
• The experimenter
  • Checks that the questionnaire is properly filled in
  • Asks questions to the participant, if any
  • Encourages the participant to answer the open questions
• All
  • Encode all data into a spreadsheet
    • Template file available
GES Step 5: Gesture classification
• Give each individual collected gesture a consistent, structured name
  • Adopt the structure (action verb) + (limb) + (parameter); a sketch follows the references below
  • Examples:
    • Swipe right with two fingers; swipe left with the dominant hand
• References
  • Hand gestures (e.g., Leap Motion): http://gestureml.org/doku.php/gestures/motion/gesture_index
  • Arm gestures (see the body motion gestures section of the same index):
    • arm_raised_forward (left/right), arm_raised_side (left/right)
    • arm_point (left/right)
    • arm_wave (left/right), arm_push (left/right), arm_throw (left/right)
    • arm_punch (left/right), arm_folded (left/right), arm_on_hip (left/right)
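A minimal Python sketch of the (action verb) + (limb) + (parameter) naming structure; the helper and its output format are illustrative assumptions:

# Build a consistent gesture name from its three structural parts.
def gesture_name(action, limb, parameter):
    return f"{action} {parameter} with {limb}"

print(gesture_name("swipe", "two fingers", "right"))       # swipe right with two fingers
print(gesture_name("swipe", "the dominant hand", "left"))  # swipe left with the dominant hand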
GES Step 5: Gesture classification
• Assign each individual gesture to a category ID
  • Example: "swipe right with two or three fingers" and "swipe right with the dominant hand" are all assigned to the category "1: Swipe right", however the gesture is performed
• Create your own classification
  • You can draw inspiration from existing classifications
  • You may depart from existing classifications by
    • Adding, deleting, or modifying your own categories
    • Refining or generalizing existing categories
  • Detail enough: minimum 10 categories
  • Do not overdetail (e.g., by creating a category for each gesture): maximum 20 categories
• In the Excel file, tab "Elicited Gesture", fill in the light grey area
  • Input other data, like thinking time, goodness-of-fit, etc.
GES Step 5: Gesture classification
• Descriptive labeling: gesture cards
• Examples
  • Touch the ring once, twice, or multiple times in a row. Tap a rhythmic pattern on the ring's surface
  • Rotate the ring on the finger. Rotate the finger wearing the ring. Rotate the hand wearing the ring
  • Slide the ring along the finger. Pull out the ring. Place the ring back on the finger. Change the ring to a different finger
GES Step 5: Gesture classification
• Gesture type: describes the underlying meaning of a gesture. Possible values are:
  • P = Pointing gestures (= deictic gestures) indicate people, objects, directions
  • Semaphoric gestures are hand postures and movements conveying specific meanings:
    • T = Static semaphorics are identified by a specific hand posture. Examples: thumbs-up means "okay"; a flat palm facing away from the actor means "stop"
    • D = Dynamic semaphorics convey information through their temporal aspects. Example: a circular hand motion means "rotate"
    • S = Semaphoric strokes are single, stroke-like movements such as hand flicks. Example: a left flick of the hand means "dismiss this object"
  • A = Pantomimic gestures demonstrate a specific task to be performed or imitated, which mostly involves motion and particular hand postures. Example: filling an imaginary glass with water by tilting an imaginary bucket. They often consist of multiple low-level gestures: grabbing an object, moving it, and releasing it again
GES Step 5: Gesture classification
• Gesture type: describes the underlying meaning of a gesture. Possible values are (cont'd):
  • Iconic gestures communicate information about objects or entities, such as specific sizes, shapes, and motion paths:
    • I = Static iconics are performed by spontaneous static hand postures. Example: drawing an "O" with index finger and thumb means a "circle"
    • Y = Dynamic iconics are often used to describe paths or shapes, such as moving the hand in circles, meaning "the circle"
  • M = Manipulation gestures guide a movement in a short feedback loop. Thus, they feature a tight relationship between the movements of the actor and the movements of the object to be manipulated. The actor waits for the entity to "follow" before continuing
Source: https://www.microsoft.com/en-us/research/publication/understanding-mid-air-hand-gestures-a-study-of-human-preferences-in-usage-of-gesture-types-for-hci/
GES Step 5: Gesture classification
• Gesture form: specifies which form of gesture is elicited. Possible values are:
  • S = stroke, when the gesture only consists of taps and flicks
  • T = static, when the gesture is performed in only one location
  • M = static with motion, when the gesture is performed with a static pose while the rest is moving
  • D = dynamic, when the gesture captures change or motion
GES Step 5: Gesture classification
• Range of motion: relates to the distance between the position of the human body producing the gesture and the location of the gesture. Possible values are:
  • C = Close intimate, I = Intimate, P = Personal, S = Social, U = Public, R = Remote
GES Step 5: Gesture classification
• Laterality: characterizes how the two hands are employed to produce gestures, with two categories, as done in many studies. Possible values are:
  • D = dominant unimanual, N = non-dominant unimanual, S = symmetric bimanual, A = asymmetric bimanual
Source: https://www.tandfonline.com/doi/abs/10.1080/00222895.1987.10735426
[Figure: example D, N, S, and A gestures performed by a right-handed person]
GES Step 5: Gesture classification
• Classify each gesture category according to the preceding criteria and enter the corresponding code
• Use other classification criteria depending on the
  • User
  • Platform/device
  • Environment
GES Step 5: Compute agreement among gestures
• Download AGATe from http://depts.washington.edu/acelab/proj/dollar/agate.html
• Prepare a CSV file with one column per participant ID, one line per referent name, and the gesture category ID in each cell
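A hypothetical excerpt of such a CSV file; the participant IDs and category IDs are made up for illustration:

referent,P01,P02,P03
Turn TV on/off,1,1,4
Start player,2,1,2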
GES Step 5: Compute agreement among gestures
• Compute the agreement rate (AR) for all referents
• Agreement rate = the number of pairs of participants in agreement with each other divided by the total number of pairs of participants that could be in agreement:
  AR(r) = Σ_{Pi ⊆ P} |Pi|(|Pi| - 1) / (|P|(|P| - 1))
  where P is the set of proposals for referent r and the Pi are its groups of identical proposals; the disagreement rate and co-agreement rate are defined analogously
• Compute co-agreement for pairs, groups (e.g., male vs. female), and categories of referents (e.g., basic vs. advanced)
Source: https://dl.acm.org/citation.cfm?id=2669511
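A minimal Python sketch of this agreement rate for one referent, following the pairwise definition above; the input encoding (one category ID per participant) is an assumption:

# AR(r) = number of agreeing participant pairs / total number of pairs.
from collections import Counter
from math import comb

def agreement_rate(proposals):  # proposals: one gesture category per participant
    agreeing = sum(comb(k, 2) for k in Counter(proposals).values())
    return agreeing / comb(len(proposals), 2)

# E.g. (assumed split), 3 participants proposing gesture A and 2 proposing B:
print(agreement_rate(["A", "A", "A", "B", "B"]))  # (3 + 1) / 10 = 0.4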
GES Step 5: Compute agreement among gestures
• Example: agreement rate for one referent with 5 participants proposing 2 gestures, A and B
• Connected pairs represent how two participants performed the same gesture
Source: https://dl.acm.org/citation.cfm?id=2669511
Example
• Ring gestures: 24 participants × 2 rings (Ring Zero) × 14 referents = 672 gesture proposals
Source: B. Gheran, J. Vanderdonckt, R.-D. Vatavu, Gestures for Smart Rings: Empirical Results, Insights, and Design Implications. Proc. of ACM DIS'18, 623-635.
Example
• Ring gestures: samples
https://www.youtube.com/watch?v=FHT-5aFNhsA
Example
• Ring gestures: agreement scores
Example
• Ring gestures: consensus set
GES Step 6: Analyze agreement among gestures
• Plot all referents in decreasing order of their AR, with error bars denoting the confidence interval (95%) and the gesture category
[Chart: agreement rates per referent, in decreasing order from Turn Light On (0.193), Turn TV On (0.173), Begin Player (0.157), Turn Light Off (0.157), and Turn TV Off (0.14) down to Turn AC Off (0.043); average AR = 0.107; bands mark low vs. medium agreement; the most elicited gesture category is annotated per referent, e.g., 15: Splay, 27: Button, 3: Point, 14: Fist, 10: Swipe, 8: Rotate, 30: Phone, 28: Dimmer, 6: Flat]
Source: https://dl.acm.org/citation.cfm?id=2702223
GES Step 7: Analyze gestures
• Provide a detailed analysis of the elicited gestures
  • By criteria
  • By agreement rate
  • By observation (e.g., field notes, video notes, gesture sheet)
  • By interview
  • By IBM PSSUQ questionnaire
• Identify potential correlations between data (e.g., t-test, ANOVA, etc.)
• Decide the consensus set = the set of agreed gestures, based on the results (one possible decision rule is sketched below)
  • Example: https://ieeexplore.ieee.org/document/8890919
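A minimal Python sketch of one possible decision rule; the 0.1 threshold and the "most frequent category wins" rule are assumptions for illustration, not the method prescribed in the sources:

# Keep, per referent, its most frequent gesture category when the
# referent's agreement rate reaches a chosen threshold.
from collections import Counter

def consensus_set(proposals_per_referent, ar_per_referent, threshold=0.1):
    return {referent: Counter(proposals).most_common(1)[0][0]
            for referent, proposals in proposals_per_referent.items()
            if ar_per_referent[referent] >= threshold}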
Example of a Gesture Elicitation Study
Head and Shoulders Gestures: Exploring User-Defined Gestures with Upper Body
Source: Jean Vanderdonckt, Nathan Magrofuoco, Suzanne Kieffer, Jorge Pérez, Ysabelle Rase, Paolo Roselli, Santiago Villarreal: Head and Shoulders Gestures: Exploring User-Defined Gestures with Upper Body. HCI (19) 2019: 192-213
What are head and shoulders gestures?
• A head gesture is any movement of the head leaving the rest of the body unaffected (stationary)
  • A head gesture can occur in any plane (sagittal, transverse, frontal)
• A shoulder gesture is any movement of the shoulder joint that leaves the rest of the arm unaffected (stationary)
  • A shoulder gesture occurs in any plane of motion (sagittal, transverse, frontal) or direction (forward, backward, or circular)
[Figure: head rotation axes: X = pitch, Y = yaw, Z = roll]
Why are head and shoulders gestures interesting?
• Hands-free interaction is made possible
• Fixed-gaze head movement is appropriate when
  • No device is needed
  • Both hands should be free
  • There is no need to move the gaze
  • A moderate set of actions must be triggered
  • Commands have a medium duration
• Accurate recognizers for head and shoulders gestures start to appear
[Figure: head movements M13, M31, M65, M56, M42, M24; Qinjie et al., 2019]
Why are head and shoulders gestures interesting?
• They are used in physical exercises
Why are head and shoulders gestures interesting?
• They exhibit some potential for a novel vocabulary

Head          | Label                           | Alias                 | Movement (frontal, transversal, sagittal)
X translation | Move the head left, right       | Face left, face right | Lateral translation (v,c,c)
Y translation | Move the head up, down          | Face up, face down    | Neck elevation, depression (c,v,c)
Z translation | Move the head forward, backward | Thrust, retreat       | Protraction, retraction (c,c,v)
Why are head and shoulders gestures interesting?
• They exhibit some potential for a novel vocabulary

Head                | Label                            | Alias                  | Movement (frontal, transversal, sagittal)
Frontal tilting     | Tilt the head to the left, right | Bend left, right       | Lateral flexion (v,v,c)
Transversal tilting | Tilt the head up, down           | Bend up, down          | Extension, flexion (v,c,v)
Sagittal tilting    | Tilt the head forward, backward  | Bend forward, backward | Extension, flexion (c,v,v)
Why are head and shoulders gestures interesting?
• They exhibit some potential for a novel vocabulary

Head       | Label                           | Alias               | Movement (frontal, transversal, sagittal)
X rotation | Turn the head up, down          | Uphead, downhead    | Horizontal rotation (c,v,v)
Y rotation | Turn the head left, right       | Lefthead, righthead | Vertical rotation (v,c,v)
Z rotation | Turn the head forward, backward | Forehead, backhead  | Facial rotation (v,v,c)
Why are head and shoulders gestures interesting?
• They exhibit some potential for a novel vocabulary

Shoulders     | Label                                     | Alias                | Movement (frontal, transversal, sagittal)
X translation | Move shoulder horizontally to left, right | Decontract, contract | Extension, flexion (v,c,c)
Y translation | Raise shoulder, lower shoulder            | Raise, lower         | Shoulder elevation, depression (c,v,c)
Z translation | Move shoulder forward, backward           | Protract, retract    | Shoulder protraction, retraction (c,c,v)

• There are also common head and shoulders gestures
• Larger design space than www.gestureml.org
Source: http://gestureml.org/doku.php/gestures/motion/gesture_index
"These head-motion based gestures can be great for adding subtle context cues to game controls and metrics or even used to directly modify the way digital content is presented on a display."
Experiment: Gesture Elicitation Study (GES)
• Participants
  • 10 females + 12 males = 22 participants
  • Aged from 18 to 62 years (M=29, SD=13)
  • Various occupations: secretary, teacher, employee, ...
• Device usage frequencies and creativity score:
[Chart: creativity score vs. age, with linear fit y = 0.1467x + 52.482, R² = 0.0775]
[Chart: average frequency of usage per device: Computer 6.09, Smartphone 6.05, Tablet 2.68, Game console 1.73, MS Kinect 1.05, studied device 0.00]
Experiment: Gesture Elicitation Study (GES)
• Stimuli: 14 referents for IoT tasks:
  Turn the TV On/Off, Start Player, Turn the Volume Up, Turn the Volume Down, Go to the Next Channel, Go to the Previous Channel, Turn Air Conditioning On/Off, Turn Lights On/Off, Brighten Lights, Dim Lights, Turn Heating System On/Off, Turn Alarm On/Off, Answer a Phone Call, and End Phone Call
• Example: 3. INCREASE: Brighten lights (before/after pictures)
Experiment: Results
• 22 participants × 14 referents = 308 elicited gestures, resulting in 10 categories
[Chart: distribution of elicited gestures per category: Head single gesture, 102; Concurrent compound gesture, 70; Sequential compound gesture, 44; Both shoulders single gesture, 29; Dominant shoulder single gesture, 19; Non-dominant shoulder single gesture, 14; Head repeated gesture, 10; Both shoulders repeated gestures, 9; Dominant shoulder repeated gesture, 4; Non-dominant shoulder repeated gesture, 3; Other, 26]
Experiment: Results
• Aggregated measures per referent
Experiment: Results
• Evolution of aggregated measures per referent
[Chart: average thinking time (s) and goodness-of-fit per referent, with linear trend lines for both]
Experiment: Results
• Breakdown per criterion
  • Body part: Head 50.87%, Shoulders 30.62%, Head and Shoulders 18.51%
  • Elicited head movements: Bend 31%, Left/right 12%, Nod 4%, Rotate 4%, Upface/downface 3%, Thrust 1%, Backhead 0%
  • Elicited shoulder movements: Raise 11%, Shrug 11%, Clog 9%, Protract 6%, Lower 4%, Retract 3%
  • Amount of strokes: one stroke 68.09%, two strokes 22.37%, three or more strokes 9.54%
[Chart: two agreement measures per referent, in decreasing order from Go to Next Channel and Go to Previous Channel (0.390 / 0.368) down to Turn Heating System On/Off (0.138 / 0.104); averages 0.263 / 0.232; referents split into high and medium agreement]
Experiment: 14 Consensus gestures
Agreement score [Vatavu & Wobbrock, 2015]
Agreement rate [Vatavu & Wobbrock, 2016]
Conclusion
• Contributions
  • Design space for head and shoulders gestures
  • Corpus of 308 elicited gestures with measures
  • Classification into 10 categories
  • Consensus set of 14 head and shoulders gestures
• Design guidelines
  • Use bending gestures as first-class citizens
  • Use upface/downface for infrequent tasks
  • Use thrust only for play/pause
  • Forehead and backhead gestures should not be used, apart from exceptional assignments
Gesture Elicitation Study: Shortcomings and variants
• Shortcomings
  • Legacy bias (Morris et al., 2014)
Source: M. Morris et al., Reducing Legacy Bias in Gesture Elicitation Studies, Interactions, May-June 2014. https://interactions.acm.org/archive/view/may-june-2014/reducing-legacy-bias-in-gesture-elicitation-studies
Priming: priming users to think about the capabilities of a new form factor or sensing technology is one approach that may reduce the impact of legacy bias.
Partners: inviting users to participate in elicitation studies in groups, rather than individually, can be another approach to overcoming legacy bias.
• Shortcomings
  • Manual (variable) classification of gestures
Source: Radu-Daniel Vatavu, The Dissimilarity-Consensus Approach to Agreement Analysis in Gesture Elicitation Studies. CHI 2019: 224
"It is possible to design a highly guessable symbol set by acquiring guesses from participants."
"Participants are first recruited to propose symbols for specified referents within a given domain. The more participants, the more likely the resulting symbol set will be guessable to external users. The goal is to obtain a rich set of symbols from which to create the resultant symbol set."
• Shortcomings
  • Manual (variable) classification of gestures
  N=30 children, referent: "Scratch like a cat!"
  • Grouping criterion #1: same hand → 39% agreement
  • Grouping criterion #2: same hand & body pose → 19.6%
  • Grouping criterion #3: same hand & body pose & pattern of movement → 12.2%
Source: Radu-Daniel Vatavu, The Dissimilarity-Consensus Approach to Agreement Analysis in Gesture Elicitation Studies. CHI 2019: 224
• Shortcomings
  • Automatic classification of gestures, based on:
  1. A dissimilarity function for gestures Δ(gi, gj)
     • for example, the Euclidean distance,
     • the Dynamic Time Warping cost function,
     • or any distance function that takes two gestures as input and returns a real, positive value expressing how dissimilar they are
  2. A threshold (τ, tau) for the values computed by Δ
Source: Radu-Daniel Vatavu, The Dissimilarity-Consensus Approach to Agreement Analysis in Gesture Elicitation Studies. CHI 2019: 224
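A minimal Python sketch of this idea; the gesture encoding as equal-length lists of (x, y, z) points and the Euclidean choice of Δ are illustrative assumptions:

# Two proposals are in consensus when their dissimilarity is at most tau.
import math

def delta(g1, g2):  # g1, g2: equal-length lists of (x, y, z) points
    return math.sqrt(sum((a - b) ** 2
                         for p1, p2 in zip(g1, g2)
                         for a, b in zip(p1, p2)))

def in_consensus(g1, g2, tau):
    return delta(g1, g2) <= tau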
• Shortcomings and variants
  • Open vs. closed elicitation
• Shortcomings and variants
  • Gesture elicitation distributed in time and space: GestMan
Source: GestMan: a cloud-based tool for stroke-gesture datasets
• Shortcomings and variants
  • Referent-free (open) elicitation
• Shortcomings and variants
  • Some limbs are privileged
Source: Villarreal et al., A Systematic Review of Gesture Elicitation Studies: What Can We Learn from 216 Studies?
Body parts:
1. The Individual Frequency of Body Parts (IFBP)
2. The Combination Frequency of Body Parts (CFBP)
• Shortcomings and variants
  • Eliciting other symbols than gestures, other modalities
Source: Villarreal et al., A Systematic Review of Gesture Elicitation Studies: What Can We Learn from 216 Studies?
• Shortcomings and variants
  • Eliciting more than one symbol
Source: Villarreal et al., A Systematic Review of Gesture Elicitation Studies: What Can We Learn from 216 Studies?
Multiple symbols could be combined to trigger a function by relying on hierarchical structure and congruence.
• Conclusion
  • An elicitation study is a practical tool for eliciting (symbol) proposals from participants for commands, icons, shortcuts, gestures, vocal commands, etc.
  • Efficient, natural
  • Subject to manual classification
  • Agreement rates are not always certain (other scores exist)
  • Legacy bias
  • Continuity from elicitation to recognition
  • Subject to context variability (U, P, E)
• Conclusion
  • Many GES already exist, but there is no consolidation of this knowledge!
[Chart: number of gesture elicitation studies per year of publication, 2009-2019, with linear trend y = 3.8909x - 4.1636, R² = 0.9052]
How to get the best gestures from people?
Considered so far:
• Human factors: preference, sketching time, delete operations, agreement, (dis)similarity
• System factors: recognition rate
More to come and to consider:
• Human factors: hedonic value, memorability, naturalness, discoverability, consistency, congruence, …
• System factors: recognition rate, execution time, computational complexity, more measures …

More Related Content

Similar to Conducting a Gesture Elicitation Study: How to Get the Best Gestures From People?

Lean Ethnography
Lean EthnographyLean Ethnography
Lean EthnographyKate Lawrence
Ā 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARMark Billinghurst
Ā 
The 5-step Approach to Controlled Experiment Design for Human Computer Intera...
The 5-step Approach to Controlled Experiment Design for Human Computer Intera...The 5-step Approach to Controlled Experiment Design for Human Computer Intera...
The 5-step Approach to Controlled Experiment Design for Human Computer Intera...National University of Singapore
Ā 
Wellness at hand: Exploring interactive technology to support smokers
Wellness at hand: Exploring interactive technology to support smokersWellness at hand: Exploring interactive technology to support smokers
Wellness at hand: Exploring interactive technology to support smokersUniversity of Melbourne, Australia
Ā 
Technology intergrationplan kellieouzts_application
Technology intergrationplan kellieouzts_applicationTechnology intergrationplan kellieouzts_application
Technology intergrationplan kellieouzts_applicationBarrow County Schools
Ā 
U Penn Wharton design challenge '17
U Penn Wharton design challenge '17U Penn Wharton design challenge '17
U Penn Wharton design challenge '17HJ Kwon
Ā 
IOC 2004 - Learning Styles and Student Performance in an E-Learning Environment
IOC 2004 - Learning Styles and Student Performance in an E-Learning EnvironmentIOC 2004 - Learning Styles and Student Performance in an E-Learning Environment
IOC 2004 - Learning Styles and Student Performance in an E-Learning EnvironmentMichael Barbour
Ā 
Analytic emperical Mehods
Analytic emperical MehodsAnalytic emperical Mehods
Analytic emperical MehodsM Surendar
Ā 
Learning Analytics for the Evaluation of Competencies and Behaviors in Seriou...
Learning Analytics for the Evaluation of Competencies and Behaviors in Seriou...Learning Analytics for the Evaluation of Competencies and Behaviors in Seriou...
Learning Analytics for the Evaluation of Competencies and Behaviors in Seriou...MIT
Ā 
Aplicando AnalĆ­tica de Aprendizaje para la EvaluaciĆ³n de Competencias y Compo...
Aplicando AnalĆ­tica de Aprendizaje para la EvaluaciĆ³n de Competencias y Compo...Aplicando AnalĆ­tica de Aprendizaje para la EvaluaciĆ³n de Competencias y Compo...
Aplicando AnalĆ­tica de Aprendizaje para la EvaluaciĆ³n de Competencias y Compo...Facultad de InformĆ”tica UCM
Ā 
Usability testing / Nearly everything you need to know to get started
Usability testing / Nearly everything you need to know to get startedUsability testing / Nearly everything you need to know to get started
Usability testing / Nearly everything you need to know to get startedRebecca Destello
Ā 
Barry Ryan_Alternative Laboratories: Near Peer Constructed Technology Enhanc...
Barry Ryan_Alternative Laboratories: Near Peer Constructed Technology Enhanc...Barry Ryan_Alternative Laboratories: Near Peer Constructed Technology Enhanc...
Barry Ryan_Alternative Laboratories: Near Peer Constructed Technology Enhanc...Barry Ryan
Ā 
Session 3 Research Methods - Data Analysis - Case Study Example
Session 3 Research Methods - Data Analysis - Case Study ExampleSession 3 Research Methods - Data Analysis - Case Study Example
Session 3 Research Methods - Data Analysis - Case Study Examplemilolostinspace
Ā 
Learning Analytics Dashboards
Learning Analytics DashboardsLearning Analytics Dashboards
Learning Analytics DashboardsSten Govaerts
Ā 
Design, Create, Evaluate Process (1).pptx
Design, Create, Evaluate Process (1).pptxDesign, Create, Evaluate Process (1).pptx
Design, Create, Evaluate Process (1).pptxLe Hung
Ā 
PH-HISTO-PPT.FINAL DEF.pptx
PH-HISTO-PPT.FINAL DEF.pptxPH-HISTO-PPT.FINAL DEF.pptx
PH-HISTO-PPT.FINAL DEF.pptxJcModelo
Ā 
Ud the motion en_jtaboada
Ud the motion en_jtaboadaUd the motion en_jtaboada
Ud the motion en_jtaboadaJordi Taboada
Ā 
Design and Application of Experiments and User Studies
Design and Application of Experiments and User StudiesDesign and Application of Experiments and User Studies
Design and Application of Experiments and User StudiesVictor Adriel Oliveira
Ā 

Similar to Conducting a Gesture Elicitation Study: How to Get the Best Gestures From People? (20)

Tests
TestsTests
Tests
Ā 
Lean Ethnography
Lean EthnographyLean Ethnography
Lean Ethnography
Ā 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise AR
Ā 
The 5-step Approach to Controlled Experiment Design for Human Computer Intera...
The 5-step Approach to Controlled Experiment Design for Human Computer Intera...The 5-step Approach to Controlled Experiment Design for Human Computer Intera...
The 5-step Approach to Controlled Experiment Design for Human Computer Intera...
Ā 
Wellness at hand: Exploring interactive technology to support smokers
Wellness at hand: Exploring interactive technology to support smokersWellness at hand: Exploring interactive technology to support smokers
Wellness at hand: Exploring interactive technology to support smokers
Ā 
Technology intergrationplan kellieouzts_application
Technology intergrationplan kellieouzts_applicationTechnology intergrationplan kellieouzts_application
Technology intergrationplan kellieouzts_application
Ā 
U Penn Wharton design challenge '17
U Penn Wharton design challenge '17U Penn Wharton design challenge '17
U Penn Wharton design challenge '17
Ā 
Lecture-2.pdf
Lecture-2.pdfLecture-2.pdf
Lecture-2.pdf
Ā 
IOC 2004 - Learning Styles and Student Performance in an E-Learning Environment
IOC 2004 - Learning Styles and Student Performance in an E-Learning EnvironmentIOC 2004 - Learning Styles and Student Performance in an E-Learning Environment
IOC 2004 - Learning Styles and Student Performance in an E-Learning Environment
Ā 
Analytic emperical Mehods
Analytic emperical MehodsAnalytic emperical Mehods
Analytic emperical Mehods
Ā 
Learning Analytics for the Evaluation of Competencies and Behaviors in Seriou...
Learning Analytics for the Evaluation of Competencies and Behaviors in Seriou...Learning Analytics for the Evaluation of Competencies and Behaviors in Seriou...
Learning Analytics for the Evaluation of Competencies and Behaviors in Seriou...
Ā 
Aplicando AnalĆ­tica de Aprendizaje para la EvaluaciĆ³n de Competencias y Compo...
Aplicando AnalĆ­tica de Aprendizaje para la EvaluaciĆ³n de Competencias y Compo...Aplicando AnalĆ­tica de Aprendizaje para la EvaluaciĆ³n de Competencias y Compo...
Aplicando AnalĆ­tica de Aprendizaje para la EvaluaciĆ³n de Competencias y Compo...
Ā 
Usability testing / Nearly everything you need to know to get started
Usability testing / Nearly everything you need to know to get startedUsability testing / Nearly everything you need to know to get started
Usability testing / Nearly everything you need to know to get started
Ā 
Barry Ryan_Alternative Laboratories: Near Peer Constructed Technology Enhanc...
Barry Ryan_Alternative Laboratories: Near Peer Constructed Technology Enhanc...Barry Ryan_Alternative Laboratories: Near Peer Constructed Technology Enhanc...
Barry Ryan_Alternative Laboratories: Near Peer Constructed Technology Enhanc...
Ā 
Session 3 Research Methods - Data Analysis - Case Study Example
Session 3 Research Methods - Data Analysis - Case Study ExampleSession 3 Research Methods - Data Analysis - Case Study Example
Session 3 Research Methods - Data Analysis - Case Study Example
Ā 
Learning Analytics Dashboards
Learning Analytics DashboardsLearning Analytics Dashboards
Learning Analytics Dashboards
Ā 
Design, Create, Evaluate Process (1).pptx
Design, Create, Evaluate Process (1).pptxDesign, Create, Evaluate Process (1).pptx
Design, Create, Evaluate Process (1).pptx
Ā 
PH-HISTO-PPT.FINAL DEF.pptx
PH-HISTO-PPT.FINAL DEF.pptxPH-HISTO-PPT.FINAL DEF.pptx
PH-HISTO-PPT.FINAL DEF.pptx
Ā 
Ud the motion en_jtaboada
Ud the motion en_jtaboadaUd the motion en_jtaboada
Ud the motion en_jtaboada
Ā 
Design and Application of Experiments and User Studies
Design and Application of Experiments and User StudiesDesign and Application of Experiments and User Studies
Design and Application of Experiments and User Studies
Ā 

More from Jean Vanderdonckt

To the end of our possibilities with Adaptive User Interfaces
To the end of our possibilities with Adaptive User InterfacesTo the end of our possibilities with Adaptive User Interfaces
To the end of our possibilities with Adaptive User InterfacesJean Vanderdonckt
Ā 
Engineering the Transition of Interactive Collaborative Software from Cloud C...
Engineering the Transition of Interactive Collaborative Software from Cloud C...Engineering the Transition of Interactive Collaborative Software from Cloud C...
Engineering the Transition of Interactive Collaborative Software from Cloud C...Jean Vanderdonckt
Ā 
UsyBus: A Communication Framework among Reusable Agents integrating Eye-Track...
UsyBus: A Communication Framework among Reusable Agents integrating Eye-Track...UsyBus: A Communication Framework among Reusable Agents integrating Eye-Track...
UsyBus: A Communication Framework among Reusable Agents integrating Eye-Track...Jean Vanderdonckt
Ā 
ĀµV: An Articulation, Rotation, Scaling, and Translation Invariant (ARST) Mult...
ĀµV: An Articulation, Rotation, Scaling, and Translation Invariant (ARST) Mult...ĀµV: An Articulation, Rotation, Scaling, and Translation Invariant (ARST) Mult...
ĀµV: An Articulation, Rotation, Scaling, and Translation Invariant (ARST) Mult...Jean Vanderdonckt
Ā 
RepliGES and GEStory: Visual Tools for Systematizing and Consolidating Knowle...
RepliGES and GEStory: Visual Tools for Systematizing and Consolidating Knowle...RepliGES and GEStory: Visual Tools for Systematizing and Consolidating Knowle...
RepliGES and GEStory: Visual Tools for Systematizing and Consolidating Knowle...Jean Vanderdonckt
Ā 
Gesture-based information systems: from DesignOps to DevOps
Gesture-based information systems: from DesignOps to DevOpsGesture-based information systems: from DesignOps to DevOps
Gesture-based information systems: from DesignOps to DevOpsJean Vanderdonckt
Ā 
Engineering Slidable User Interfaces with Slime
Engineering Slidable User Interfaces with SlimeEngineering Slidable User Interfaces with Slime
Engineering Slidable User Interfaces with SlimeJean Vanderdonckt
Ā 
Evaluating Gestural Interaction: Models, Methods, and Measures
Evaluating Gestural Interaction: Models, Methods, and MeasuresEvaluating Gestural Interaction: Models, Methods, and Measures
Evaluating Gestural Interaction: Models, Methods, and MeasuresJean Vanderdonckt
Ā 
User-centred Development of a Clinical Decision-support System for Breast Can...
User-centred Development of a Clinical Decision-support System for Breast Can...User-centred Development of a Clinical Decision-support System for Breast Can...
User-centred Development of a Clinical Decision-support System for Breast Can...Jean Vanderdonckt
Ā 
Simplifying the Development of Cross-Platform Web User Interfaces by Collabo...
Simplifying the Development of  Cross-Platform Web User Interfaces by Collabo...Simplifying the Development of  Cross-Platform Web User Interfaces by Collabo...
Simplifying the Development of Cross-Platform Web User Interfaces by Collabo...Jean Vanderdonckt
Ā 
Attach Me, Detach Me, Assemble Me like you Work
Attach Me, Detach Me, Assemble Me like you WorkAttach Me, Detach Me, Assemble Me like you Work
Attach Me, Detach Me, Assemble Me like you WorkJean Vanderdonckt
Ā 
The Impact of Comfortable Viewing Positions on Smart TV Gestures
The Impact of Comfortable Viewing Positions on Smart TV GesturesThe Impact of Comfortable Viewing Positions on Smart TV Gestures
The Impact of Comfortable Viewing Positions on Smart TV GesturesJean Vanderdonckt
Ā 
Head and Shoulders Gestures: Exploring User-De fined Gestures with Upper Body
Head and Shoulders Gestures: Exploring User-Defined Gestures with Upper BodyHead and Shoulders Gestures: Exploring User-Defined Gestures with Upper Body
Head and Shoulders Gestures: Exploring User-De fined Gestures with Upper BodyJean Vanderdonckt
Ā 
G-Menu: A Keyword-by-Gesture based Dynamic Menu Interface for Smartphones
G-Menu: A Keyword-by-Gesture based Dynamic Menu Interface for SmartphonesG-Menu: A Keyword-by-Gesture based Dynamic Menu Interface for Smartphones
G-Menu: A Keyword-by-Gesture based Dynamic Menu Interface for SmartphonesJean Vanderdonckt
Ā 
Vector-based, Structure Preserving Stroke Gesture Recognition
Vector-based, Structure Preserving Stroke Gesture RecognitionVector-based, Structure Preserving Stroke Gesture Recognition
Vector-based, Structure Preserving Stroke Gesture RecognitionJean Vanderdonckt
Ā 
An ontology for reasoning on body-based gestures
 An ontology for reasoning on body-based gestures An ontology for reasoning on body-based gestures
An ontology for reasoning on body-based gesturesJean Vanderdonckt
Ā 
AB4Web: An On-Line A/B Tester for Comparing User Interface Design Alternatives
AB4Web: An On-Line A/B Tester for Comparing User Interface Design AlternativesAB4Web: An On-Line A/B Tester for Comparing User Interface Design Alternatives
AB4Web: An On-Line A/B Tester for Comparing User Interface Design AlternativesJean Vanderdonckt
Ā 
Gelicit: A Cloud Platform for Distributed Gesture Elicitation Studies
 Gelicit: A Cloud Platform for Distributed Gesture Elicitation Studies Gelicit: A Cloud Platform for Distributed Gesture Elicitation Studies
Gelicit: A Cloud Platform for Distributed Gesture Elicitation StudiesJean Vanderdonckt
Ā 
MoCaDiX: Designing Cross-Device User Interfaces of an Information System base...
MoCaDiX: Designing Cross-Device User Interfaces of an Information System base...MoCaDiX: Designing Cross-Device User Interfaces of an Information System base...
MoCaDiX: Designing Cross-Device User Interfaces of an Information System base...Jean Vanderdonckt
Ā 
Specification of a UX process reference model towards the strategic planning ...
Specification of a UX process reference model towards the strategic planning ...Specification of a UX process reference model towards the strategic planning ...
Specification of a UX process reference model towards the strategic planning ...Jean Vanderdonckt
Ā 

More from Jean Vanderdonckt (20)

To the end of our possibilities with Adaptive User Interfaces
To the end of our possibilities with Adaptive User InterfacesTo the end of our possibilities with Adaptive User Interfaces
To the end of our possibilities with Adaptive User Interfaces
Ā 
Engineering the Transition of Interactive Collaborative Software from Cloud C...
Engineering the Transition of Interactive Collaborative Software from Cloud C...Engineering the Transition of Interactive Collaborative Software from Cloud C...
Engineering the Transition of Interactive Collaborative Software from Cloud C...
Ā 
UsyBus: A Communication Framework among Reusable Agents integrating Eye-Track...
UsyBus: A Communication Framework among Reusable Agents integrating Eye-Track...UsyBus: A Communication Framework among Reusable Agents integrating Eye-Track...
UsyBus: A Communication Framework among Reusable Agents integrating Eye-Track...
Ā 
ĀµV: An Articulation, Rotation, Scaling, and Translation Invariant (ARST) Mult...
ĀµV: An Articulation, Rotation, Scaling, and Translation Invariant (ARST) Mult...ĀµV: An Articulation, Rotation, Scaling, and Translation Invariant (ARST) Mult...
ĀµV: An Articulation, Rotation, Scaling, and Translation Invariant (ARST) Mult...
Ā 
RepliGES and GEStory: Visual Tools for Systematizing and Consolidating Knowle...
RepliGES and GEStory: Visual Tools for Systematizing and Consolidating Knowle...RepliGES and GEStory: Visual Tools for Systematizing and Consolidating Knowle...
RepliGES and GEStory: Visual Tools for Systematizing and Consolidating Knowle...
Ā 
Gesture-based information systems: from DesignOps to DevOps
Gesture-based information systems: from DesignOps to DevOpsGesture-based information systems: from DesignOps to DevOps
Gesture-based information systems: from DesignOps to DevOps
Ā 
Engineering Slidable User Interfaces with Slime
Engineering Slidable User Interfaces with SlimeEngineering Slidable User Interfaces with Slime
Engineering Slidable User Interfaces with Slime
Ā 
Evaluating Gestural Interaction: Models, Methods, and Measures
Evaluating Gestural Interaction: Models, Methods, and MeasuresEvaluating Gestural Interaction: Models, Methods, and Measures
Evaluating Gestural Interaction: Models, Methods, and Measures
Ā 
User-centred Development of a Clinical Decision-support System for Breast Can...
User-centred Development of a Clinical Decision-support System for Breast Can...User-centred Development of a Clinical Decision-support System for Breast Can...
User-centred Development of a Clinical Decision-support System for Breast Can...
Ā 
Simplifying the Development of Cross-Platform Web User Interfaces by Collabo...
Simplifying the Development of  Cross-Platform Web User Interfaces by Collabo...Simplifying the Development of  Cross-Platform Web User Interfaces by Collabo...
Simplifying the Development of Cross-Platform Web User Interfaces by Collabo...
Ā 
Attach Me, Detach Me, Assemble Me like you Work
Attach Me, Detach Me, Assemble Me like you WorkAttach Me, Detach Me, Assemble Me like you Work
Attach Me, Detach Me, Assemble Me like you Work
Ā 
The Impact of Comfortable Viewing Positions on Smart TV Gestures
The Impact of Comfortable Viewing Positions on Smart TV GesturesThe Impact of Comfortable Viewing Positions on Smart TV Gestures
The Impact of Comfortable Viewing Positions on Smart TV Gestures
Ā 
Head and Shoulders Gestures: Exploring User-De fined Gestures with Upper Body
Head and Shoulders Gestures: Exploring User-Defined Gestures with Upper BodyHead and Shoulders Gestures: Exploring User-Defined Gestures with Upper Body
Head and Shoulders Gestures: Exploring User-De fined Gestures with Upper Body
Ā 
G-Menu: A Keyword-by-Gesture based Dynamic Menu Interface for Smartphones
G-Menu: A Keyword-by-Gesture based Dynamic Menu Interface for SmartphonesG-Menu: A Keyword-by-Gesture based Dynamic Menu Interface for Smartphones
G-Menu: A Keyword-by-Gesture based Dynamic Menu Interface for Smartphones
Ā 
Vector-based, Structure Preserving Stroke Gesture Recognition
Vector-based, Structure Preserving Stroke Gesture RecognitionVector-based, Structure Preserving Stroke Gesture Recognition
Vector-based, Structure Preserving Stroke Gesture Recognition
Ā 
An ontology for reasoning on body-based gestures
 An ontology for reasoning on body-based gestures An ontology for reasoning on body-based gestures
An ontology for reasoning on body-based gestures
Ā 
AB4Web: An On-Line A/B Tester for Comparing User Interface Design Alternatives
AB4Web: An On-Line A/B Tester for Comparing User Interface Design AlternativesAB4Web: An On-Line A/B Tester for Comparing User Interface Design Alternatives
AB4Web: An On-Line A/B Tester for Comparing User Interface Design Alternatives
Ā 
Gelicit: A Cloud Platform for Distributed Gesture Elicitation Studies
 Gelicit: A Cloud Platform for Distributed Gesture Elicitation Studies Gelicit: A Cloud Platform for Distributed Gesture Elicitation Studies
Gelicit: A Cloud Platform for Distributed Gesture Elicitation Studies
Ā 
MoCaDiX: Designing Cross-Device User Interfaces of an Information System base...
MoCaDiX: Designing Cross-Device User Interfaces of an Information System base...MoCaDiX: Designing Cross-Device User Interfaces of an Information System base...
MoCaDiX: Designing Cross-Device User Interfaces of an Information System base...
Ā 
Specification of a UX process reference model towards the strategic planning ...
Specification of a UX process reference model towards the strategic planning ...Specification of a UX process reference model towards the strategic planning ...
Specification of a UX process reference model towards the strategic planning ...
Ā 

Recently uploaded

Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...Orbitshub
Ā 
Artificial Intelligence Chap.5 : Uncertainty
Artificial Intelligence Chap.5 : UncertaintyArtificial Intelligence Chap.5 : Uncertainty
Artificial Intelligence Chap.5 : UncertaintyKhushali Kathiriya
Ā 
DEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
DEV meet-up UiPath Document Understanding May 7 2024 AmsterdamDEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
DEV meet-up UiPath Document Understanding May 7 2024 AmsterdamUiPathCommunity
Ā 
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...DianaGray10
Ā 
Exploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with MilvusExploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with MilvusZilliz
Ā 
Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024The Digital Insurer
Ā 
Apidays New York 2024 - APIs in 2030: The Risk of Technological Sleepwalk by ...
Apidays New York 2024 - APIs in 2030: The Risk of Technological Sleepwalk by ...Apidays New York 2024 - APIs in 2030: The Risk of Technological Sleepwalk by ...
Apidays New York 2024 - APIs in 2030: The Risk of Technological Sleepwalk by ...apidays
Ā 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processorsdebabhi2
Ā 
presentation ICT roal in 21st century education
presentation ICT roal in 21st century educationpresentation ICT roal in 21st century education
presentation ICT roal in 21st century educationjfdjdjcjdnsjd
Ā 
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers:  A Deep Dive into Serverless Spatial Data and FMECloud Frontiers:  A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FMESafe Software
Ā 
Apidays New York 2024 - The Good, the Bad and the Governed by David O'Neill, ...
Apidays New York 2024 - The Good, the Bad and the Governed by David O'Neill, ...Apidays New York 2024 - The Good, the Bad and the Governed by David O'Neill, ...
Apidays New York 2024 - The Good, the Bad and the Governed by David O'Neill, ...apidays
Ā 
FWD Group - Insurer Innovation Award 2024
FWD Group - Insurer Innovation Award 2024FWD Group - Insurer Innovation Award 2024
FWD Group - Insurer Innovation Award 2024The Digital Insurer
Ā 
Architecting Cloud Native Applications
Architecting Cloud Native ApplicationsArchitecting Cloud Native Applications
Architecting Cloud Native ApplicationsWSO2
Ā 
MS Copilot expands with MS Graph connectors
MS Copilot expands with MS Graph connectorsMS Copilot expands with MS Graph connectors
MS Copilot expands with MS Graph connectorsNanddeep Nachan
Ā 
Cyberprint. Dark Pink Apt Group [EN].pdf
Cyberprint. Dark Pink Apt Group [EN].pdfCyberprint. Dark Pink Apt Group [EN].pdf
Cyberprint. Dark Pink Apt Group [EN].pdfOverkill Security
Ā 
Rising Above_ Dubai Floods and the Fortitude of Dubai International Airport.pdf
Rising Above_ Dubai Floods and the Fortitude of Dubai International Airport.pdfRising Above_ Dubai Floods and the Fortitude of Dubai International Airport.pdf
Rising Above_ Dubai Floods and the Fortitude of Dubai International Airport.pdfOrbitshub
Ā 
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobe
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, AdobeApidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobe
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobeapidays
Ā 
Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...
Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...
Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...Jeffrey Haguewood
Ā 
Spring Boot vs Quarkus the ultimate battle - DevoxxUK
Spring Boot vs Quarkus the ultimate battle - DevoxxUKSpring Boot vs Quarkus the ultimate battle - DevoxxUK
Spring Boot vs Quarkus the ultimate battle - DevoxxUKJago de Vreede

Conducting a Gesture Elicitation Study: How to Get the Best Gestures From People?

  • 17. GES Step 1: Prepare setup C=(U,P,E) • Find a representative sampling of the user population • Example of stratified sampling (sample n = 32; population: 51% female, 49% male):

    Age group | % of population | Sample | Female | Male
    18-24     | 8.9             | 3      | 2      | 1
    25-34     | 15.4            | 5      | 3      | 2
    35-44     | 16.5            | 5      | 3      | 2
    45-54     | 18.3            | 6      | 3      | 3
    55-64     | 18.7            | 6      | 3      | 3
    65+       | 22.2            | 7      | 4      | 3
    Total     | 100             | 32     | 18     | 14
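A minimal sketch of how such a stratified allocation can be computed, assuming the population percentages from the table above; the function name and the 51/49 female/male split applied within each stratum are illustrative assumptions, not part of the protocol.

    # Stratified-allocation sketch (names are illustrative).
    # Allocates a sample of n participants across age strata in
    # proportion to the population shares from the table above; a
    # 51% female / 49% male split (assumption) is applied per stratum.
    def allocate_strata(n, strata, female_share=0.51):
        plan = {}
        for age_group, percent in strata.items():
            group_n = round(n * percent / 100)
            females = round(group_n * female_share)
            plan[age_group] = (group_n, females, group_n - females)
        return plan

    strata = {"18-24": 8.9, "25-34": 15.4, "35-44": 16.5,
              "45-54": 18.3, "55-64": 18.7, "65+": 22.2}
    for group, (total, f, m) in allocate_strata(32, strata).items():
        print(f"{group}: {total} participants ({f} F, {m} M)")

Because of rounding, some per-stratum counts may differ by one from the hand-made table, so the totals should be checked against the target n.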
  • 18. GES Step 1: Prepare setup C=(U,P,E) • Find a representative sampling of the user population • Find n=30 participants • Ideally 50% female, 50% male • Different ages (not all students!) • Different backgrounds (not all in your discipline!) • Different levels of experience (e.g., with touch or not) • Each participant takes part in the experiment individually • Define an experiment schedule • Location (should be constant): choose a place • Time: 30 min per participant + resting time • Schedule: organize a Doodle • Test the whole protocol with a pilot participant (a "guinea pig") • Different from the actual participants!
  • 19. GES Step 1: Prepare setup C=(U,P,E) • Identify referents for the tasks under study • A referent is the expected result of an action (e.g., key pressing, touch, gesture) for a function • Distribute the definitions of the 14 referents across groups • Example: IoT functions: (1) Turn the TV on/off (2) Start player (3) Turn up the volume (4) Turn down the volume (5) Go to the next item in a list (6) Go to the previous item (7) Turn AC on/off (8) Turn lights on/off (9) Brighten lights (10) Dim lights (11) Turn heat on/off (12) Turn alarm on/off (13) Answer phone call (14) End phone call • Note that several referents are symmetric to each other (e.g., on/off, up/down) • Randomize the order of the referents before the study for each participant (create lists with www.random.org); a scripted alternative is sketched below
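As an alternative to www.random.org, the per-participant randomization can be scripted. A minimal sketch, assuming the 14 IoT referents listed above; seeding by participant ID, so each participant's order is reproducible, is an added assumption rather than part of the protocol.

    import random

    REFERENTS = [
        "Turn the TV on/off", "Start player", "Turn up the volume",
        "Turn down the volume", "Go to the next item in a list",
        "Go to the previous item", "Turn AC on/off", "Turn lights on/off",
        "Brighten lights", "Dim lights", "Turn heat on/off",
        "Turn alarm on/off", "Answer phone call", "End phone call",
    ]

    def referent_order(participant_id):
        # Seed with the participant ID so each participant gets a
        # fixed, reproducible random order of the 14 referents.
        rng = random.Random(participant_id)
        order = REFERENTS[:]
        rng.shuffle(order)
        return order

    print(referent_order(participant_id=1))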
  • 20. GES Step 1: Prepare setup C=(U,P,E) • What is a referent? A referent consists of: • State before action • Action description • State after action
  • 21. GES Step 1: Prepare setup C=(U,P,E) • Referent: textual, graphical, animation, video (visual priming) • [Illustration: referent 20, "Permute files remotely", shown as Before/After network diagrams in which the yellow and red documents swap places between My Disk and the Partner Disk while the blue document stays put]
  • 22. GES Step 1: Prepare setup C=(U,P,E) • Referent: textual, graphical, animation, video (visual priming) https://www.youtube.com/watch?v=iVzE4EzfGoc&t=12s
  • 23. GES Step 1: Prepare setup C=(U,P,E) • Prepare the hardware and software • Keep them constant throughout the experiment • Choose software for recording gestures in log files • Example: Python software for Leap Motion • Example: Myo Gesture Control
  • 24. GES Step 1: Prepare setup C=(U,P,E) • Identify the physical location: office, room, usability lab, … • Keep it constant throughout the experiment • Sometimes determined by the device • Example: usability lab
  • 25. GES Step 2: Pre-test (before experiment) • Each participant • Comes to the experiment location according to the schedule • Signs a consent form • Fills in a demographic questionnaire • Notoriety of, and experience with, the target device (if any) • System experience in general (e.g., for a tablet, frequency of use) • Performs a motor-skill test: touching the thumb to each of the other fingers of the same hand twice in a row without flaw • Is presented with an introduction to the device (e.g., a video) • Experimenter • Assigns an ID to each participant on the consent form • Checks that all forms are completely filled in, before and after
  • 26. GES Step 3: Test (conduct experiment) • Each participant • Is presented with the list of referents (in randomized order) • Confirms that she understood the tasks • Is asked to think of a suitable gesture • Says "I am ready" • Gives a rating (goodness-of-fit) between 1 and 10 • To describe how well the proposed gesture fits the referent • 1 = very poor fit → 10 = excellent fit • Optionally gives a value between 1 and 7, for each gesture in the random set, for • Complexity: 1 = most complex → 7 = simplest gesture • Memorability: 1 = hardest to remember → 7 = easiest to remember • Fun: 1 = least funny → 7 = funniest
  • 27. GES Step 3: Test (conduct experiment) • The experimenter, on the gesture sheet • Records the thinking time = the time between the start of the task (i.e., when the referent was presented to the participant) and the moment when the participant knows what gesture to propose • Fills in the gesture sheet: for each gesture collected • Writes the referent ID • Records the thinking time • Writes all scores: goodness-of-fit, complexity, memorability, fun • Videotapes the whole session for further analysis • Keeps the video files for the final report • Takes some pictures of the room for interesting gestures • Records the gestures into files with the actual device • Example: X, Y, Z, timestamp (see the logging sketch below)
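A minimal sketch of the X, Y, Z, timestamp logging mentioned above; the CSV layout, file name, and function are hypothetical, not the format of any particular Leap Motion or Myo tool.

    import csv, time

    def log_gesture(path, participant_id, referent_id, samples):
        # samples: iterable of (x, y, z) positions streamed from the
        # sensor; each row is stamped with the current wall-clock time.
        with open(path, "a", newline="") as f:
            writer = csv.writer(f)
            for x, y, z in samples:
                writer.writerow([participant_id, referent_id,
                                 x, y, z, time.time()])

    # Example: three fake samples for participant 1, referent 3.
    log_gesture("gestures.csv", 1, 3, [(0.1, 0.2, 0.3),
                                       (0.2, 0.2, 0.3),
                                       (0.3, 0.1, 0.3)])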
  • 28. GES Step 4: Post-test (after experiment) • The participant • Fills in the IBM PSSUQ questionnaire • Is asked a few open questions • What did you like the most in the experiment? • What did you hate the most in the experiment? • The experimenter • Checks that the questionnaire is properly filled in • Asks follow-up questions to the participant, if any • Encourages the participant to answer the open questions • All • Encode all data into the spreadsheet • Template file available
  • 29. GES Step 5: Gesture classification • Give each individual collected gesture a consistent, structured name • Adopt the structure (action verb)+(limb)+(parameter); a naming helper is sketched below • Examples: • Swipe right with two fingers, swipe left with the dominant hand • References • Hand gestures (e.g., Leap Motion): http://gestureml.org/doku.php/gestures/motion/gesture_index • Arm gestures (see the body motion gestures section there) • arm_raised_forward (left/right), arm_raised_side (left/right) • arm_point (left/right) • arm_wave (left/right), arm_push (left/right), arm_throw (left/right) • arm_punch (left/right), arm_folded (left/right), arm_on_hip (left/right)
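The (action verb)+(limb)+(parameter) structure can be enforced with a small helper; a sketch in which the function name and the rendering convention are assumptions.

    def gesture_name(verb, limb, parameter=""):
        # Compose a consistent label following the slide's structure
        # (action verb) + (limb) + (parameter), rendered in the style
        # of the slide's examples, e.g. "swipe right with two fingers".
        action = f"{verb} {parameter}".strip()
        return f"{action} with {limb}".lower()

    print(gesture_name("swipe", "two fingers", "right"))
    # -> "swipe right with two fingers"
    print(gesture_name("swipe", "dominant hand", "left"))
    # -> "swipe left with dominant hand"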
  • 30. GES Step 5: Gesture classification • Assign each individual gesture to a category ID • Example: • Swipe right with two or three fingers and swipe right with the dominant hand are all assigned to the category "1: Swipe right", however the gesture is performed • Create your own classification • You can draw inspiration from existing classifications • You may depart from existing classifications by • Adding, deleting, modifying your own categories • Refining, generalizing existing categories • Detail enough: minimum 10 categories • Do not overdetail (e.g., by creating a category for each gesture): maximum 20 categories • In the Excel file, tab "Elicited Gesture", fill in the light grey area • Input the other data, like thinking time, goodness-of-fit, etc.
  • 31. GES Step 5: Gesture classification • Descriptive labeling: gesture cards • Examples • Touch the ring once, twice, or multiple times in a row. Tap a rhythmic pattern on the ring's surface • Rotate the ring on the finger. Rotate the finger wearing the ring. Rotate the hand wearing the ring • Slide the ring along the finger. Pull out the ring. Place the ring back on the finger. Change the ring to a different finger
  • 32. GES Step 5: Gesture classification • Gesture type: describes the underlying meaning of a gesture. Possible values are: • P = Pointing gestures (= deictic gestures) indicate people, objects, directions • Semaphoric gestures are hand postures and movements conveying specific meanings • T = Static semaphorics are identified by a specific hand posture. Examples: thumbs-up means "okay", a flat palm facing away from the actor means "stop" • D = Dynamic semaphorics convey information through their temporal aspects. Example: a circular hand motion means "rotate" • S = Semaphoric strokes are single, stroke-like movements such as hand flicks. Example: a left flick of the hand means "dismiss this object" • A = Pantomimic gestures demonstrate a specific task to be performed or imitated, which mostly involves motion and particular hand postures • Examples: filling an imaginary glass with water, tilting an imaginary bucket. They often consist of multiple low-level gestures: grabbing an object, moving it, and releasing it again
  • 33. GES Step 5: Gesture classification • Gesture type: describes the underlying meaning of a gesture. Possible values are (cont'd): • Iconic gestures communicate information about objects or entities, such as specific sizes, shapes, and motion paths: • I = Static iconics are performed by spontaneous static hand postures. Example: drawing an "O" with index finger and thumb means a "circle" • Y = Dynamic iconics are often used to describe paths or shapes, such as moving the hand in circles, meaning "the circle" • M = Manipulation gestures guide movement in a short feedback loop. Thus, they feature a tight relationship between the movements of the actor and the movements of the object to be manipulated. The actor waits for the entity to "follow" before continuing Source: https://www.microsoft.com/en-us/research/publication/understanding-mid-air-hand-gestures-a-study-of-human-preferences-in-usage-of-gesture-types-for-hci/
  • 34. GES Step 5: Gesture classification • Gesture form: specifies which form of gesture is elicited. Possible values are: • S = stroke, when the gesture only consists of taps and flicks • T = static, when the gesture is performed in only one location • M = static with motion, when the gesture is performed with a static pose while the rest is moving • D = dynamic, when the gesture involves change or motion
  • 35. GES Step 5: Gesture classification • Range of motion: captures the distance between the position of the human body producing the gesture and the location of the gesture. Possible values are: • C = close intimate, I = intimate, P = personal, S = social, U = public, R = remote
  • 36. GES Step 5: Gesture classification • Laterality: characterizes how the two hands are employed to produce gestures, with four categories, as done in many studies. Possible values are: • D = dominant unimanual, N = non-dominant unimanual, S = symmetric bimanual, A = asymmetric bimanual (illustrated for a right-handed person) Source: https://www.tandfonline.com/doi/abs/10.1080/00222895.1987.10735426
  • 37. GES Step 5: Gesture classification • Classify each gesture category according to the preceding criteria and enter the corresponding code; a record type enforcing the codes is sketched below • Use other classification criteria depending on • User • Platform/device • Environment
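One way to keep the classification codes of slides 32 to 36 consistent during encoding is a small validated record; a sketch, assuming one record per gesture category (all names are hypothetical).

    from dataclasses import dataclass

    # Codes from the preceding slides (gesture type, form, range of
    # motion, laterality); kept in sets so typos are caught early.
    TYPES = {"P", "T", "D", "S", "A", "I", "Y", "M"}
    FORMS = {"S", "T", "M", "D"}
    RANGES = {"C", "I", "P", "S", "U", "R"}
    LATERALITY = {"D", "N", "S", "A"}

    @dataclass
    class GestureCategory:
        category_id: int
        name: str
        gtype: str
        form: str
        range_of_motion: str
        laterality: str

        def __post_init__(self):
            # Reject any code that is not part of the classification.
            assert self.gtype in TYPES and self.form in FORMS
            assert self.range_of_motion in RANGES
            assert self.laterality in LATERALITY

    cat = GestureCategory(1, "swipe right", gtype="S", form="S",
                          range_of_motion="P", laterality="D")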
  • 38. GES Step 5: Compute agreement among gestures • Download AGATe from http://depts.washington.edu/acelab/proj/dollar/agate.html • Prepare a CSV file with the participant ID as column, the referent name as line, and the gesture category ID in each cell; a sketch for assembling this file follows below
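A sketch for assembling such an input file from the collected proposals; the exact layout AGATe expects should be verified against its documentation, and the helper below only mirrors the participant/referent/category matrix described on the slide.

    import csv

    def write_agreement_csv(path, referents, proposals):
        # proposals[(participant_id, referent)] = gesture category ID.
        # One line per referent, one column per participant, and the
        # category ID in each cell, as sketched on the slide.
        participants = sorted({p for p, _ in proposals})
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["referent"] + participants)
            for r in referents:
                writer.writerow([r] + [proposals[(p, r)]
                                       for p in participants])

    proposals = {(1, "Turn lights on"): 15, (2, "Turn lights on"): 15,
                 (1, "Dim lights"): 28, (2, "Dim lights"): 10}
    write_agreement_csv("agate.csv", ["Turn lights on", "Dim lights"],
                        proposals)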
  • 39. GES Step 5: Compute agreement among gestures • Compute the agreement rate (AR) for all referents • Agreement rate = the number of pairs of participants in agreement with each other, divided by the total number of pairs of participants that could be in agreement • Compute co-agreement for pairs and groups (e.g., male vs. female) and for categories of referents (e.g., basic vs. advanced) • [Illustration: agreement rate, disagreement rate, co-agreement rate] Source: https://dl.acm.org/citation.cfm?id=2669511
  • 40. GES Step 5: Compute agreement among gestures • Example: agreement rate for one referent with 5 participants proposing 2 gestures, A and B • Connected pairs represent how two participants performed the same gesture Source: https://dl.acm.org/citation.cfm?id=2669511
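The agreement rate defined above is straightforward to compute. A minimal sketch that reproduces the slide's example under the assumption that three of the five participants proposed gesture A and two proposed gesture B (the slide does not give the exact split).

    from collections import Counter
    from math import comb

    def agreement_rate(proposals):
        # AR(r) = sum over gesture categories of C(|P_i|, 2), divided
        # by C(|P|, 2): pairs of participants in agreement over all
        # pairs of participants that could be in agreement.
        n = len(proposals)
        if n < 2:
            return 0.0
        counts = Counter(proposals)
        return sum(comb(c, 2) for c in counts.values()) / comb(n, 2)

    # Assumed split for the slide's example: 3 x gesture A, 2 x gesture B.
    print(agreement_rate(["A", "A", "A", "B", "B"]))  # (3 + 1) / 10 = 0.4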
  • 41. Example • Ring Gestures • 24 participants × 14 referents × 2 rings (Ring Zero) = 672 gesture proposals Source: B. Gheran, J. Vanderdonckt, R.-D. Vatavu, Gestures for Smart Rings: Empirical Results, Insights, and Design Implications. Proc. of ACM DIS'18, 623-635.
  • 42. Example • Ring Gestures: samples https://www.youtube.com/watch?v=FHT-5aFNhsA
  • 43. Example • Ring Gestures: agreement scores
  • 44. Example • Ring Gestures: consensus set
  • 45. GES Step 6: Analyze agreement among gestures • Plot all referents in decreasing order of their AR, with error bars denoting the confidence interval (95%) and the gesture category • [Bar chart, reconstructed as a table in the slide's order: referent, agreement rate, most elicited gesture category; the chart also marks the low- vs. medium-agreement regions]

    Referent | AR | Most elicited gesture
    Turn Light On | 0.193 | 15: Splay
    Turn TV On | 0.173 | 27: Button
    Begin Player | 0.157 | 27: Button
    Turn Light Off | 0.157 | 3: Point
    Turn TV Off | 0.14 | 14: Fist
    Go to previous item | 0.13 | 10: Swipe
    Go to next item | 0.117 | 10: Swipe
    Turn Heat On | 0.11 | 8: Rotate
    Answer a Call | 0.11 | 30: Phone
    Increase Volume | 0.107 | 30: Phone
    Turn Heat Off | 0.107 | 8: Rotate
    Increase Light | 0.097 | 28: Dimmer
    Decrease Volume | 0.08 | 28: Dimmer
    Hang Up Call | 0.07 | 15: Splay
    Decrease Light | 0.067 | 6: Flat
    Turn AC On | 0.06 | 10: Swipe
    Turn Alarm On | 0.053 | 27: Button
    Turn Alarm Off | 0.053 | 27: Button
    Turn AC Off | 0.043 | 10: Swipe
    Average | 0.107 |

Source: https://dl.acm.org/citation.cfm?id=2702223
  • 46. GES Step 7: Analyze gestures • Provide a detailed analysis of the gestures elicited • By criteria • By agreement rate • By observation (e.g., field notes, video notes, gesture sheet) • By interview • By IBM PSSUQ questionnaire • Identify potential relations in the data with statistical tests (e.g., t test, ANOVA); a sketch follows below • Decide the consensus set = the set of agreed gestures, based on the results • Example: https://ieeexplore.ieee.org/document/8890919
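A hedged sketch of the kind of test the slide mentions, assuming two hypothetical groups of goodness-of-fit ratings and using SciPy, which is one common choice rather than anything mandated by the slide.

    from scipy import stats

    # Hypothetical goodness-of-fit ratings (1-10) from two groups,
    # e.g. participants with vs. without prior touch experience.
    experienced = [8, 7, 9, 6, 8, 7]
    novices = [5, 6, 7, 5, 6, 4]

    # Independent-samples t test, as suggested on the slide.
    t, p = stats.ttest_ind(experienced, novices)
    print(f"t = {t:.2f}, p = {p:.3f}")

For more than two groups (e.g., three age strata), stats.f_oneway would give the one-way ANOVA the slide also mentions.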
  • 47. Example of Gesture Elicitation Study Head and Shoulders Gestures: Exploring User-Defined Gestures with Upper Body Source: Jean Vanderdonckt, Nathan Magrofuoco, Suzanne Kieffer, Jorge PĆ©rez, Ysabelle Rase, Paolo Roselli, Santiago Villarreal: Head and Shoulders Gestures: Exploring User-Defined Gestures with Upper Body. HCI (19) 2019: 192-213
  • 48. What are head and shoulders gestures? • A head gesture is any movement of the head leaving the rest of the body unaffected (stationary) • A head gesture can occur in any plane (sagittal, transverse, frontal) • A shoulder gesture is any movement of the shoulder joint that leaves the rest of the arm unaffected (stationary) • A shoulder gesture occurs in any plane of motion (sagittal, transverse, frontal) or direction (forward, backward, or circular)
  • 50. Why are head and shoulders gestures interesting? • They make hands-free interaction possible • Fixed-gaze head movement is appropriate when • No device is needed • Both hands should stay free • There is no need to move the gaze • Actions are triggered from a moderate set • Command duration is medium • Accurate recognizers for head and shoulders gestures start to appear • [Illustration: head movement classes M13, M31, M65, M56, M42, M24] [Qinjie et al., 2019]
  • 51. Why are head and shoulders gestures interesting? • They are used in physical exercises
  • 52. Why are head and shoulders gestures interesting? • They exhibit some potential for a novel vocabulary • Head translations:

    Label | Description | Alias | Movement (frontal, transversal, sagittal)
    X translation | Move the head left, right | Face left, face right | Lateral translation (v,c,c)
    Y translation | Move the head up, down | Face up, face down | Neck elevation, depression (c,v,c)
    Z translation | Move the head forward, backward | Thrust, retreat | Protraction, retraction (c,c,v)
  • 53. Why are head and shoulders gestures interesting? • They exhibit some potential for a novel vocabulary • Head tiltings:

    Label | Description | Alias | Movement (frontal, transversal, sagittal)
    Frontal tilting | Tilt the head to the left, right | Bend left, right | Lateral flexion (v,v,c)
    Transversal tilting | Tilt the head up, down | Bend up, down | Extension, flexion (v,c,v)
    Sagittal tilting | Tilt the head forward, backward | Bend forward, backward | Extension, flexion (c,v,v)
  • 54. Why are head and shoulders gestures interesting? • They exhibit some potential for a novel vocabulary • Head rotations:

    Label | Description | Alias | Movement (frontal, transversal, sagittal)
    X rotation | Turn the head up, down | Uphead, downhead | Horizontal rotation (c,v,v)
    Y rotation | Turn the head left, right | Lefthead, righthead | Vertical rotation (v,c,v)
    Z rotation | Turn the head forward, backward | Forehead, backhead | Facial rotation (v,v,c)
  • 55. Why are head and shoulders gestures interesting? • They exhibit some potential for a novel vocabulary • Shoulder translations:

    Label | Description | Alias | Movement (frontal, transversal, sagittal)
    X translation | Move shoulder horizontally to left, right | Decontract, contract | Extension, flexion (v,c,c)
    Y translation | Raise, lower shoulder | Raise, lower | Shoulder elevation, depression (c,v,c)
    Z translation | Move shoulder forward, backward | Protract, retract | Shoulder protraction, retraction (c,c,v)
  • 56. • There are also common head and shoulders gestures
  • 57. • Larger design space than www.gestureml.org Source: http://gestureml.org/doku.php/gestures/motion/gesture_index "These head-motion based gestures can be great for adding subtle context cues to game controls and metrics or even used to directly modify the way digital content is presented on a display."
  • 58. Experiment: Gesture Elicitation Study (GES) • Participants • 10 females + 12 males = 22 participants • Aged from 18 to 62 years (M=29, SD=13) • Various occupations: secretary, teacher, employee, ... • Average device usage frequencies: computer 6.09, smartphone 6.05, tablet 2.68, game console 1.73, MS Kinect 1.05, studied device 0.00 • Creativity score vs. age: y = 0.1467x + 52.482, R² = 0.0775 (a weak relation)
  • 59. Experiment: Gesture Elicitation Study (GES) • Stimuli: 14 referents for IoT tasks: Turn the TV On/Off, Start Player, Turn the Volume Up, Turn the Volume Down, Go to the Next Channel, Go to the Previous Channel, Turn Air Conditioning On/Off, Turn Lights On/Off, Brighten Lights, Dim Lights, Turn Heating System On/Off, Turn Alarm On/Off, Answer a Phone Call, and End Phone Call • [Illustration: referent 3, INCREASE: Brighten lights, with Before/After states]
  • 60. Experiment: Results • 22 participants × 14 referents = 308 elicited gestures, resulting in 10 categories: • Head single gesture: 102 • Concurrent compound gesture: 70 • Sequential compound gesture: 44 • Both shoulders single gesture: 29 • Dominant shoulder single gesture: 19 • Non-dominant shoulder single gesture: 14 • Head repeated gesture: 10 • Both shoulders repeated gestures: 9 • Dominant shoulder repeated gesture: 4 • Non-dominant shoulder repeated gesture: 3 • Other: 26
  • 62. Experiment: Results • Evolution of aggregated measures per referent • [Line chart: average thinking time (in seconds) and goodness-of-fit per referent, each with a linear trend line]
  • 63. Experiment: Results • Breakdown per criterion • Body part: head 50.87%, shoulders 30.62%, head and shoulders 18.51% • Amount of strokes: one stroke 68.09%, two strokes 22.37%, three or more strokes 9.54% • Elicited gestures: bend 31%, left/right 12%, raise 11%, shrug 11%, clog 9%, protract 6%, nod 4%, rotate 4%, lower 4%, upface/downface 3%, retract 3%, thrust 1%, backhead 0%
  • 64. Experiment: 14 consensus gestures • Agreement score [Vatavu & Wobbrock, 2015] and agreement rate [Vatavu & Wobbrock, 2016] per referent (chart reconstructed as a table; the two value series are paired with the referents in the slide's order, and the chart also marks high vs. medium agreement):

    Referent | Agreement score | Agreement rate
    Go to Next Channel | 0.390 | 0.368
    Go to Previous Channel | 0.390 | 0.368
    Answer Phone Call | 0.316 | 0.286
    End Phone Call | 0.283 | 0.251
    Play/pause | 0.267 | 0.238
    Turn TV On/Off | 0.260 | 0.234
    Average | 0.263 | 0.232
    Turn Lights On/Off | 0.250 | 0.221
    Turn Alarm On/Off | 0.250 | 0.221
    Decrease Volume | 0.250 | 0.216
    Brighten Lights | 0.248 | 0.215
    Increase Volume | 0.229 | 0.195
    Dim Lights | 0.215 | 0.182
    Turn Air Conditioning On/Off | 0.188 | 0.156
    Turn Heating System On/Off | 0.138 | 0.104
  • 65. Conclusion • Contributions • Design space for head and shoulders gestures • Corpus of 308 elicited gestures with measures • Classification into 10 categories • Consensus set of 14 head and shoulders gestures • Design guidelines • Use bending gestures as first-class citizens • Use upface/downface for infrequent tasks • Use thrust only for play/pause • Forehead and backhead gestures should not be used, except for exceptional assignments
  • 67. • Shortcomings • Legacy bias (Morris et al., 2014) Source: M. Morris et al., Reducing Legacy Bias in Gesture Elicitation Studies, Interactions, 2014, https://interactions.acm.org/archive/view/may-june-2014/reducing-legacy-bias-in-gesture-elicitation-studies • Priming: priming users to think about the capabilities of a new form factor or sensing technology is one approach that may reduce the impact of legacy bias • Partners: inviting users to participate in elicitation studies in groups, rather than individually, is another approach to overcoming legacy bias
  • 68. • Shortcomings • Manual (variable) classification of gestures Source: Radu-Daniel Vatavu, The Dissimilarity-Consensus Approach to Agreement Analysis in Gesture Elicitation Studies. CHI 2019: 224 • "It is possible to design a highly guessable symbol set by acquiring guesses from participants." • "Participants are first recruited to propose symbols for specified referents within a given domain. The more participants, the more likely the resulting symbol set will be guessable to external users." • "The goal is to obtain a rich set of symbols from which to create the resultant symbol set."
  • 69. • Shortcomings • Manual (variable) classification of gestures • N=30 children, referent: "Scratch like a cat!" • Grouping criterion #1: same hand → 39% agreement • Grouping criterion #2: same hand & body pose → 19.6% • Grouping criterion #3: same hand & body pose & pattern of movement → 12.2% Source: Radu-Daniel Vatavu, The Dissimilarity-Consensus Approach to Agreement Analysis in Gesture Elicitation Studies. CHI 2019: 224
  • 70. • Shortcomings • Automatic classification of gestures 1. A dissimilarity function for gestures Δ(g_i, g_j) • for example, the Euclidean distance • the Dynamic Time Warping cost function • or any distance function that takes two gestures as input and returns a real, positive value of how dissimilar they are 2. A threshold (τ, tau) for the values computed by Δ Source: Radu-Daniel Vatavu, The Dissimilarity-Consensus Approach to Agreement Analysis in Gesture Elicitation Studies. CHI 2019: 224
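A minimal sketch of the dissimilarity-consensus idea: a textbook Dynamic Time Warping cost serves as Δ, and a threshold τ decides whether two recorded gestures agree. The threshold value and the data are illustrative; this is not Vatavu's implementation.

    import math

    def dtw(a, b):
        # Dynamic Time Warping cost between two gestures given as
        # lists of (x, y, z) points; serves as the dissimilarity Δ.
        INF = float("inf")
        n, m = len(a), len(b)
        d = [[INF] * (m + 1) for _ in range(n + 1)]
        d[0][0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = math.dist(a[i - 1], b[j - 1])
                d[i][j] = cost + min(d[i - 1][j], d[i][j - 1],
                                     d[i - 1][j - 1])
        return d[n][m]

    def in_consensus(g1, g2, tau=1.0):
        # Two gestures agree when their dissimilarity stays below τ.
        return dtw(g1, g2) <= tau

    g1 = [(0, 0, 0), (1, 0, 0), (2, 0, 0)]
    g2 = [(0, 0.1, 0), (1, 0.1, 0), (2, 0.1, 0)]
    print(in_consensus(g1, g2))  # True for tau = 1.0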
  • 71. • Shortcomings • Automatic classification of gestures Source: Radu-Daniel Vatavu, The Dissimilarity-Consensus Approach to Agreement Analysis in Gesture Elicitation Studies. CHI 2019: 224
  • 72. • Shortcomings and variants • Open vs. closed elicitation (open: participants may propose any gesture they like; closed: participants pick from a predefined set)
  • 73. • Shortcomings and variants • Gesture elicitation distributed in time and space: GestMan Source: GestMan: a cloud-based tool for stroke-gesture datasets
  • 74. • Shortcomings and variants • Referent-free (open) elicitation
  • 75. • Shortcomings and variants • Some limbs are privileged Source: Villarreal et al., A Systematic Review of Gesture Elicitation Studies: What Can We Learn from 216 Studies? • Body parts: 1. The Individual Frequency of Body Parts (IFBP) 2. The Combination Frequency of Body Parts (CFBP)
  • 76. • Shortcomings and variants • Eliciting other symbols than gestures, other modalities Source: Villarreal et al., A Systematic Review of Gesture Elicitation Studies: What Can We Learn from 216 Studies?
  • 77. • Shortcomings and variants • Eliciting more than one symbol • Multiple symbols could be combined to trigger a function by relying on hierarchical structure and congruence Source: Villarreal et al., A Systematic Review of Gesture Elicitation Studies: What Can We Learn from 216 Studies?
  • 78. • Conclusion • An elicitation study is a practical tool for eliciting (symbol) proposals from participants for commands, icons, shortcuts, gestures, vocal commands, etc. • Efficient, natural • Subject to manual classification • Agreement rates are not always reliable (alternative scores exist) • Subject to legacy bias • Continuity from elicitation to recognition • Subject to context variability (U,P,E)
  • 79. • Conclusion • Many GES already exist, but no consolidation of this knowledge! • [Chart: frequency of gesture elicitation studies per year of publication, 2009-2019; the number of studies grows linearly, y = 3.8909x - 4.1636, R² = 0.9052]
  • 80. How to get the best gestures from people? Considered so far: • Human factors: preference, sketching time, delete operations, agreement, (dis)similarity • System factors: recognition rate More to come and to consider: • Human factors: hedonic value, memorability, naturalness, discoverability, consistency, congruence, … • System factors: recognition rate, execution time, computational complexity, more measures…