Lecture 3: Conducting a Gesture Elicitation Study: How to Get the Best Gestures From People?
Francqui Chair in Computer Science 2020 VUB, Jean Vanderdonckt, 27 April 2021
1. Lecture 3: Conducting a Gesture Elicitation Study:
How to Get the Best Gestures From People?
Jean Vanderdonckt, UCLouvain
Vrije Universiteit Brussel, 20 April 2021, on-line
2. How to get the best gestures from people?
Case study: sketching a graphical UI
(Interact'07)
https://www.youtube.com/watch?v=SBNB1O-8pGw&t=5s
6. How to determine the best gestures for widgets?
1: none
2: low
3: medium
7. How to determine the best gestures for widgets?
1: none
2: low
3: medium
4: high
8. Use the Borda Count method (a positional voting method)
• Each participant ranks the n representations; each representation receives a score equal to the number of representations ranked below it, in decreasing order: n-1, n-2, n-3, …
n=5
Representation:  A      B      C      D      E
Participant #1:  0      n-3=2  n-4=1  n-1=4  n-2=3
Participant #2:  n-1=4  0      n-3=2  n-2=3  n-4=1
Score:           4      2      3      7      4
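The Borda tally above can be sketched in a few lines of Python (a sketch; the function name and the A–E representation labels are illustrative, not from the slides):

```python
def borda_scores(rankings):
    """Total Borda score per representation.

    rankings: one preference order per participant, most preferred first.
    A representation at position p in an order of n items earns n-1-p points,
    i.e., the number of representations ranked below it.
    """
    totals = {}
    for order in rankings:
        n = len(order)
        for pos, item in enumerate(order):
            totals[item] = totals.get(item, 0) + (n - 1 - pos)
    return totals

# Two participants ranking five representations A..E (hypothetical labels):
scores = borda_scores([
    ["D", "E", "B", "C", "A"],  # participant #1: D earns 4, E 3, B 2, C 1, A 0
    ["A", "D", "C", "E", "B"],  # participant #2: A earns 4, D 3, C 2, E 1, B 0
])
# D wins with 7 points, matching the score row of the table above
```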
11. Further classification: complexity of the representation
ST=sketching time
DEL=delete operations
Source: Suzanne Kieffer, Adrien Coyette, Jean Vanderdonckt, User interface design by sketching: a complexity analysis of widget representations.
EICS 2010: 57-66
12. How to get the best gestures from people?
Considered so far:
• Human factors: preference, sketching time, delete operations
• System factors: recognition rate
Another method for preference:
• By elicitation
• By Wizard of Oz
13. Gesture Elicitation Study (GES)
• To elicit
• To call forth or draw out something, such as information or a response (Merriam-Webster)
• To evoke or draw out a reaction, answer, or fact from someone (Dictionary.com)
• To draw forth something that is latent or potential into existence (Dictionary.com)
• Elicitation
• Acquisition of information from a person or a group of persons in a manner that does not disclose the intent of the interview, discussion, or conversation
• Elicitation technique
• A technique used to discreetly extract information that is not readily available, for a specific purpose
Source: Wobbrock et al., 2009
14. Gesture Elicitation Study (GES)
• Definition
• A study in which end users elicit their own gestures for a set of predefined functions, presented through referents, in order to reach a consensus
• Aims and goals
• To let end users elicit the gestures they want for a function
• To compute agreement between end users (= participants, subjects)
• To define a vocabulary of potential gestures
• To identify a consensus set, which is a sub-vocabulary of agreed gestures. Output = consensus set
• For one context of use at a time: C = (U, P, E)
• User: any person; Functions: IoT functions
• Platform/device/sensor: an armband
• Environment: smart home, usability lab, controlled room
15. Gesture Elicitation Study (GES)
• Advantages
• Free from any form of stress or burden of work
• Information comes deliberately once the favorable context is created
• No adverse effect
• Easy and cheap to run
• Disadvantages
• Legacy bias
• Previous experience with a similar system or another device
• No experience
• Lack of creativity
16. GES Step 1: Prepare setup C=(U,P,E)
• Create a representative sampling of the user population
17. GES Step 1: Prepare setup C=(U,P,E)
• Find a representative sampling of the user population
• Example of stratified sampling (n=32; 51% female, 49% male)
Age group | %    | Sampling | Female | Male
18-24     | 8.9  | 3        | 2      | 1
25-34     | 15.4 | 5        | 3      | 2
35-44     | 16.5 | 5        | 3      | 2
45-54     | 18.3 | 6        | 3      | 3
55-64     | 18.7 | 6        | 3      | 3
65+       | 22.2 | 7        | 4      | 3
Total     | 100  | 32       | 18     | 14
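The stratum sizes in the table can be reproduced by proportional allocation (a sketch; plain rounding happens to sum to 32 for these percentages, but in general leftover remainders need explicit handling):

```python
def stratified_counts(total, percentages):
    """Allocate `total` participants across strata proportionally.

    percentages: stratum name -> share of the population in percent.
    Plain rounding is used here; when the rounded shares do not sum to
    `total`, the remainder must be distributed explicitly (not shown).
    """
    return {stratum: round(total * pct / 100)
            for stratum, pct in percentages.items()}

ages = {"18-24": 8.9, "25-34": 15.4, "35-44": 16.5,
        "45-54": 18.3, "55-64": 18.7, "65+": 22.2}
counts = stratified_counts(32, ages)
# reproduces the Sampling column: 3, 5, 5, 6, 6, 7
```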
18. GES Step 1: Prepare setup C=(U,P,E)
• Find a representative sampling of the user population
• Find n=30 participants
• Ideally 50% female, 50% male
• Different ages (not all students!)
• Different backgrounds (not all in your discipline!)
• Different levels of experience (e.g., with touch or not)
• Each participant takes part in the experiment individually
• Define an experiment schedule
• Location (should be constant): place to choose
• Time: 30 min for each participant + resting time
• Schedule: organize a Doodle
• Test the whole protocol with a guinea pig
• Different from the participants!
19. GES Step 1: Prepare setup C=(U,P,E)
• Identify referents for the tasks of interest
• A referent is the expected result of an action (e.g., key press, touch, gesture) for a function
• Distribute the definition of the 14 referents across groups
• Example: IoT functions:
(1) Turn the TV on/off (2) Start player
(3) Turn up the volume (4) Turn down the volume
(5) Go to the next item in a list (6) Go to the previous item
(7) Turn AC on/off (8) Turn lights on/off
(9) Brighten lights (10) Dim lights
(11) Turn heat on/off (12) Turn alarm on/off
(13) Answer phone call (14) End phone call
• Randomize the order of the referents before the study for each participant (create lists with www.random.org)
• Note that many referents are symmetric to each other (e.g., on/off pairs)
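As an alternative to www.random.org, the per-participant randomization can be done with Python's standard library (a sketch; seeding with the participant ID makes each list reproducible for the final report):

```python
import random

REFERENTS = [
    "Turn the TV on/off", "Start player", "Turn up the volume",
    "Turn down the volume", "Go to the next item in a list",
    "Go to the previous item", "Turn AC on/off", "Turn lights on/off",
    "Brighten lights", "Dim lights", "Turn heat on/off",
    "Turn alarm on/off", "Answer phone call", "End phone call",
]

def referent_order(participant_id):
    """Deterministic random order of the 14 referents for one participant."""
    rng = random.Random(participant_id)  # one reproducible stream per participant
    order = list(REFERENTS)
    rng.shuffle(order)
    return order
```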
20. GES Step 1: Prepare setup C=(U,P,E)
• What is a referent?
Referent:
- State before action
- Action description
- State after action
21. GES Step 1: Prepare setup C=(U,P,E)
• Referent: textual, graphical, animation, video (visual priming)
• Example referent "20. Permute files remotely": Before/After illustration with Yellow, Red, and Blue documents on My Disk (My Network) and Partner Disk (Our Network)
23. GES Step 1: Prepare setup C=(U,P,E)
• Prepare the hardware and software
• Keep them constant throughout the experiment
• Choose software for recording gestures in log files
• Example: Python software for Leap Motion
• Example: Myo Gesture Control
24. GES Step 1: Prepare setup C=(U,P,E)
• Identify the physical location: office, room, usability lab, …
• Keep it constant throughout the experiment
• Sometimes determined by the device
• Example: usability lab
25. GES Step 2: Pre-test (before experiment)
• Each participant
• Comes to the experiment location according to the schedule
• Signs a consent form
• Fills in a demographic questionnaire
• Notoriety of and experience with the target device, if any
• System experience in general (e.g., for a tablet, frequency)
• Performs a motor-skill test: touch the thumb to each of the other fingers of the same hand two times consecutively without flaw
• Is presented with an introduction to the device (e.g., a video)
• Experimenter
• Assigns an ID to each participant on the consent form
• Checks that all forms are completely filled in, before and after
26. GES Step 3: Test (conduct experiment)
• Each participant, for each gesture in the random set
• Is presented with the list of referents (randomly ordered)
• Confirms she understood the tasks
• Is asked to think of a suitable gesture
• Says "I am ready"
• Gives a rating (goodness-of-fit) between 1 and 10
• To describe how well the proposed gesture fits the referent
• 1 = very poor fit → 10 = excellent fit
• Optionally gives a value between 1 and 7 for
• Complexity: 1 = most complex → 7 = simplest gesture
• Memorability: 1 = hardest to remember → 7 = easiest to remember
• Fun: 1 = least funny → 7 = funniest
27. GES Step 3: Test (conduct experiment)
• The experimenter, on the gesture sheet
• Records the thinking time = time between the start of the task (i.e., when the referent was presented to the participant) and the moment when the participant knows what gesture to propose
• Fills in the gesture sheet: for each gesture collected
• Writes the referent ID
• Records the thinking time
• Writes all scores: goodness-of-fit, complexity, memorability, fun
• Videotapes the whole session for further analysis
• Keeps video files for the final report
• Takes some pictures of the room for some interesting gestures
• Records the gestures into files with the appropriate device
• Example: X, Y, Z, timestamp
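A minimal logger for such per-sample records might look as follows (a sketch; the X, Y, Z, timestamp columns follow the example above, while the function name, file layout, and extra ID columns are assumptions):

```python
import csv

def save_gesture_log(path, participant_id, referent_id, samples):
    """Write one elicited gesture to a CSV log file.

    samples: iterable of (x, y, z, timestamp) tuples as delivered by the
    tracking device (e.g., a Leap Motion frame stream).
    """
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["participant", "referent", "x", "y", "z", "timestamp"])
        for x, y, z, t in samples:
            writer.writerow([participant_id, referent_id, x, y, z, t])

# Hypothetical two-sample gesture for participant P01, referent 3:
save_gesture_log("p01_r03.csv", "P01", 3,
                 [(0.0, 1.2, 0.4, 0), (0.1, 1.3, 0.4, 16)])
```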
28. GES Step 4: Post-test (after experiment)
• The participant
• Fills in the IBM PSSUQ questionnaire
• Is asked a few open questions
• What did you like the most in the experiment?
• What did you hate the most in the experiment?
• The experimenter
• Checks that the questionnaire is properly filled in
• Asks questions to the participant, if any
• Encourages the participant to answer the open questions
• Both
• Encode all data into the spreadsheet
• Template file available
29. GES Step 5: Gesture classification
• Give each individual collected gesture a consistent, structured name
• Adopt a structure (action verb) + (limb) + (parameter)
• Examples:
• Swipe right with two fingers, swipe left with the dominant hand
• References
• Hand gestures (e.g., Leap Motion): http://gestureml.org/doku.php/gestures/motion/gesture_index
• Arm gestures (see the body motion gestures section above):
• arm_raised_forward (left/right), arm_raised_side (left/right)
• arm_point (left/right)
• arm_wave (left/right), arm_push (left/right), arm_throw (left/right)
• arm_punch (left/right), arm_folded (left/right), arm_on_hip (left/right)
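The (action verb)+(limb)+(parameter) convention can be enforced with a tiny helper so that all gesture names stay uniform across experimenters (a sketch; the helper and its field names are assumptions, not part of the GES method):

```python
def gesture_name(action, limb, parameter=None):
    """Compose a structured gesture name: (action verb)+(limb)+(parameter)."""
    name = f"{action} with {limb}"
    if parameter:
        name += f" {parameter}"
    return name

# e.g., the two example names from the slide:
gesture_name("swipe right", "two fingers")       # "swipe right with two fingers"
gesture_name("swipe left", "the dominant hand")  # "swipe left with the dominant hand"
```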
30. GES Step 5: Gesture classification
• Assign each individual gesture to a category ID
• Example:
• Swipe right with two or three fingers and swipe right with the dominant hand are all assigned to the category "1: Swipe right", however the gesture is performed
• Create your own classification
• You can draw inspiration from existing classifications
• You may depart from existing classifications by
• Adding, deleting, or modifying your own categories
• Refining or generalizing existing categories
• Detail enough: minimum 10 categories
• Do not overdetail (e.g., by creating a category for each gesture): maximum 20 categories
• In the Excel file, tab "Elicited Gesture", fill in the light grey area
• Input other data, like thinking time, goodness-of-fit, etc.
31. GES Step 5: Gesture classification
• Descriptive labeling: gesture cards
• Examples
• Touch the ring once, twice, or multiple times in a row. Tap a rhythmic pattern on the ring's surface
• Rotate the ring on the finger. Rotate the finger wearing the ring. Rotate the hand wearing the ring
• Slide the ring along the finger. Pull out the ring. Place the ring back on the finger. Change the ring to a different finger.
32. GES Step 5: Gesture classification
• Gesture type: describes the underlying meaning of a gesture. Possible values are:
• P = Pointing gestures (= deictic gestures) indicate people, objects, directions
• Semaphoric gestures are hand postures and movements conveying specific meanings
• T = Static semaphorics are identified by a specific hand posture. Examples: thumbs-up means "okay", a flat palm facing away from the actor means "stop"
• D = Dynamic semaphorics convey information through their temporal aspects. Example: a circular hand motion means "rotate"
• S = Semaphoric strokes are single, stroke-like movements such as hand flicks. Example: a left flick of the hand means "dismiss this object"
• A = Pantomimic gestures demonstrate a specific task to be performed or imitated, which mostly involves motion and particular hand postures
• Examples: filling an imaginary glass with water, tilting an imaginary bucket. They often consist of multiple low-level gestures: grabbing an object, moving it, and releasing it again
33. GES Step 5: Gesture classification
• Gesture type: describes the underlying meaning of a gesture. Possible values are (cont'd):
• Iconic gestures communicate information about objects or entities, such as specific sizes, shapes, and motion paths:
• I = Static iconics are performed by spontaneous static hand postures. Example: drawing an "O" with index finger and thumb means a "circle"
• Y = Dynamic iconics are often used to describe paths or shapes, such as moving the hand in circles, meaning "the circle"
• M = Manipulation gestures guide movement in a short feedback loop. Thus, they feature a tight relationship between the movements of the actor and the movements of the object to be manipulated. The actor waits for the entity to "follow" before continuing
Source: https://www.microsoft.com/en-us/research/publication/understanding-mid-air-hand-gestures-a-study-of-human-preferences-in-usage-of-gesture-types-for-hci/
34. GES Step 5: Gesture classification
• Gesture form: specifies which form of gesture is elicited. Possible values are:
• S = stroke, when the gesture only consists of taps and flicks
• T = static, when the gesture is performed in only one location
• M = static with motion, when the gesture is performed with a static pose while the rest is moving
• D = dynamic, when the gesture captures change or motion
35. GES Step 5: Gesture classification
• Range of motion: relates the distance between the position of the human body producing the gesture and the location of the gesture. Possible values are:
• C = Close intimate, I = Intimate, P = Personal, S = Social, U = Public, R = Remote
36. GES Step 5: Gesture classification
• Laterality: characterizes how the two hands are employed to produce gestures, with two categories, as done in many studies. Possible values are:
• D = dominant unimanual, N = non-dominant unimanual, S = symmetric bimanual, A = asymmetric bimanual
Source: https://www.tandfonline.com/doi/abs/10.1080/00222895.1987.10735426
(Figure: examples of D, N, S, and A for a right-handed person)
37. GES Step 5: Gesture classification
• Classify each gesture category according to the preceding criteria and enter the corresponding code
• Use other classification criteria depending on
• User
• Platform/device
• Environment
38. GES Step 5: Compute agreement among gestures
• Download AGaTE from http://depts.washington.edu/acelab/proj/dollar/agate.html
• Prepare a CSV file with participant ID (column), referent name (line), gesture category ID (cell)
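Producing that participant-by-referent CSV layout from collected data is straightforward (a sketch; the dictionary layout and function name are assumptions, and the exact header AGaTE expects should be checked against its documentation):

```python
import csv

def write_agreement_csv(path, participants, categories):
    """Write one row per referent, one column per participant.

    categories: dict referent name -> {participant ID -> gesture category ID}.
    """
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["referent"] + participants)
        for referent, by_participant in categories.items():
            writer.writerow([referent] + [by_participant[p] for p in participants])

# Hypothetical data: two participants, one referent:
write_agreement_csv("agate_input.csv", ["P01", "P02"],
                    {"Turn lights on/off": {"P01": 10, "P02": 27}})
```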
39. GES Step 5: Compute agreement among gestures
• Compute the Agreement Rate (AR) for all referents
• Agreement Rate = the number of pairs of participants in agreement with each other divided by the total number of pairs of participants that could be in agreement
• Compute co-agreement for pairs, groups (e.g., male vs. female), and categories of referents (e.g., basic vs. advanced)
(Figure: formulas for the agreement rate, disagreement rate, and co-agreement rate)
Source: https://dl.acm.org/citation.cfm?id=2669511
40. GES Step 5: Compute agreement among gestures
• Example: agreement rate for one referent with 5 participants proposing 2 gestures, A and B
• Connected pairs represent how two participants performed the same gesture
Source: https://dl.acm.org/citation.cfm?id=2669511
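The pair-counting definition of AR translates directly into code. Assuming, for illustration, that three of the five participants propose gesture A and two propose B, the count is (3 choose 2 + 2 choose 2) / (5 choose 2) = 4/10:

```python
from collections import Counter
from math import comb

def agreement_rate(proposals):
    """Agreement Rate for one referent (pair-counting formulation).

    proposals: the gesture category proposed by each participant.
    AR = pairs of participants in agreement / all possible pairs.
    """
    pairs_total = comb(len(proposals), 2)
    pairs_agree = sum(comb(size, 2) for size in Counter(proposals).values())
    return pairs_agree / pairs_total

agreement_rate(["A", "A", "A", "B", "B"])  # -> 0.4
```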
41. Example: Ring Gestures
24 participants × 2 rings (Ring Zero) × 14 referents = 672 gesture proposals
Source: B. Gheran, J. Vanderdonckt, R.-D. Vatavu, Gestures for Smart Rings: Empirical Results, Insights, and Design Implications. Proc. of ACM DIS'18, 623-635.
42. Example: Ring Gestures: samples
https://www.youtube.com/watch?v=FHT-5aFNhsA
45. GES Step 6: Analyze agreement among gestures
• Plot all referents in decreasing order of their AR, with error bars denoting the confidence interval (95%) and the gesture category
(Bar chart: AR per referent, from Turn Light On at 0.193 down to Turn AC Off at 0.043, with an average AR of 0.107; referents split into medium- and low-agreement bands; the most elicited gesture category is shown per referent, e.g., 15: Splay, 27: Button, 3: Point, 14: Fist, 10: Swipe, 8: Rotate, 30: Phone, 28: Dimmer, 6: Flat)
Source: https://dl.acm.org/citation.cfm?id=2702223
46. GES Step 7: Analyze Gestures
• Provide a detailed analysis of the elicited gestures
• By criteria
• By agreement rate
• By observation (e.g., field notes, video notes, gesture sheet)
• By interview
• By IBM PSSUQ questionnaire
• Identify potential relationships in the data (e.g., t test, ANOVA, etc.)
• Decide the consensus set (= set of agreed gestures) based on the results
• Example: https://ieeexplore.ieee.org/document/8890919
48. What are head and shoulders gestures?
• A head gesture is any movement of the head leaving the rest of the body unaffected (stationary)
• A head gesture can occur in any plane (sagittal, transverse, frontal)
• A shoulder gesture is any movement of the shoulder joint that leaves the rest of the arm unaffected (stationary)
• A shoulder gesture occurs in any plane of motion (sagittal, transverse, frontal) or direction (forward, backward, or circular)
50. Why are head and shoulders gestures interesting?
• Hands-free interaction is made possible
• Fixed-gaze head movement is appropriate when
• No device is needed
• Both hands should be free
• There is no need to move the gaze
• A moderate set of actions must be triggered
• Command duration is medium
• Accurate recognizers for head and shoulders gestures start to appear
(Figure: examples M13, M31, M65, M56, M42, M24 [Qinjie et al., 2019])
51. Why are head and shoulders gestures interesting?
• They are used in physical exercises
52. Why are head and shoulders gestures interesting?
• They exhibit some potential for a novel vocabulary
Head          | Label                           | Alias                | Movement (frontal, transversal, sagittal)
X translation | Move the head left, right       | Face left, face right | Lateral translation (v,c,c)
Y translation | Move the head up, down          | Face up, face down    | Neck elevation, depression (c,v,c)
Z translation | Move the head forward, backward | Thrust, retreat       | Protraction, retraction (c,c,v)
53. Why are head and shoulders gestures interesting?
• They exhibit some potential for a novel vocabulary
Head               | Label                           | Alias                   | Movement (frontal, transversal, sagittal)
Frontal tilting    | Tilt the head to the left, right | Bend left, right       | Lateral flexion (v,v,c)
Transversal tilting | Tilt the head up, down         | Bend up, down           | Extension, flexion (v,c,v)
Sagittal tilting   | Tilt the head forward, backward | Bend forward, backward  | Extension, flexion (c,v,v)
54. Why are head and shoulders gestures interesting?
• They exhibit some potential for a novel vocabulary
Head       | Label                           | Alias               | Movement (frontal, transversal, sagittal)
X rotation | Turn the head up, down          | Uphead, downhead    | Horizontal rotation (c,v,v)
Y rotation | Turn the head left, right       | Lefthead, righthead | Vertical rotation (v,c,v)
Z rotation | Turn the head forward, backward | Forehead, backhead  | Facial rotation (v,v,c)
55. Why are head and shoulders gestures interesting?
• They exhibit some potential for a novel vocabulary
Shoulders     | Label                                    | Alias                | Movement (frontal, transversal, sagittal)
X translation | Move shoulder horizontally to left, right | Decontract, contract | Extension, flexion (v,c,c)
Y translation | Raise shoulder, lower shoulder           | Raise, lower         | Shoulder elevation, depression (c,v,c)
Z translation | Move shoulder forward/backward           | Protract, retract    | Shoulder protraction, retraction (c,c,v)
56. • There are also common head and shoulders gestures
57. • Larger design space than www.gestureml.org
Source: http://gestureml.org/doku.php/gestures/motion/gesture_index
"These head-motion based gestures can be great for adding subtle context cues to game controls and metrics or even used to directly modify the way digital content is presented on a display."
58. Experiment: Gesture Elicitation Study (GES)
• Participants
• 10 females + 12 males = 22 participants
• Aged from 18 to 62 years (M=29, SD=13)
• Various occupations: secretary, teacher, employee, …
• Device usage frequencies
• Creativity score
(Scatter plot: creativity score vs. age, trend y = 0.1467x + 52.482, R² = 0.0775. Bar chart: average frequency of usage per device: computer 6.09, smartphone 6.05, tablet 2.68, game console 1.73, MS Kinect 1.05, studied device 0.00)
59. Experiment: Gesture Elicitation Study (GES)
• Stimuli: 14 referents for IoT tasks: Turn the TV On/Off, Start Player, Turn the Volume Up, Turn the Volume Down, Go to the Next Channel, Go to the Previous Channel, Turn Air Conditioning On/Off, Turn Lights On/Off, Brighten Lights, Dim Lights, Turn Heating System On/Off, Turn Alarm On/Off, Answer a Phone Call, and End Phone Call
• Example referent "3. INCREASE: Brighten lights" (Before/After illustration)
60. Experiment: Results
• 22 participants × 14 referents = 308 elicited gestures, resulting in 10 categories:
• Head single gesture: 102
• Concurrent compound gesture: 70
• Sequential compound gesture: 44
• Both shoulders single gesture: 29
• Dominant shoulder single gesture: 19
• Non-dominant shoulder single gesture: 14
• Head repeated gesture: 10
• Both shoulders repeated gestures: 9
• Dominant shoulder repeated gesture: 4
• Non-dominant shoulder repeated gesture: 3
• Other: 26
62. Experiment: Results
• Evolution of aggregated measures per referent
(Chart: average thinking time [sec] and goodness-of-fit per referent, with linear trend lines for both measures)
63. Experiment: Results
• Breakdown per criterion
(Charts: Body part: Head 50.87%, Shoulders 30.62%, Head and Shoulders 18.51%. Elicited gestures: Bend 31%, Left/right 12%, Raise 11%, Shrug 11%, Clog 9%, Protract 6%, Nod 4%, Rotate 4%, Lower 4%, Upface/downface 3%, Retract 3%, Thrust 1%, Backhead 0%. Amount of strokes: one stroke 68.09%, two strokes 22.37%, three or more strokes 9.54%)
64. Experiment: 14 consensus gestures
(Bar chart per referent, in decreasing order, of the agreement score [Vatavu & Wobbrock, 2015] and the agreement rate [Vatavu & Wobbrock, 2016]: from Go to Next Channel and Go to Previous Channel (0.390/0.368) down to Turn Heating System On/Off (0.138/0.104), with averages of 0.263/0.232; referents are grouped into high and medium agreement)
65. Conclusion
• Contributions
• Design space for head and shoulders gestures
• Corpus of 308 elicited gestures with measures
• Classification into 10 categories
• Consensus set of 14 head and shoulders gestures
• Design guidelines
• Use bending gestures as a first-class citizen
• Use upface/downface for infrequent tasks
• Use thrust only for play/pause
• Forehead and backhead gestures should not be used, apart from exceptional assignments
67. • Shortcomings
• Legacy bias (Morris et al., 2010)
Source: M. Morris et al., Reducing Legacy Bias in Gesture Elicitation Studies, https://interactions.acm.org/archive/view/may-june-2014/reducing-legacy-bias-in-gesture-elicitation-studies
Priming: priming users to think about the capabilities of a new form factor or sensing technology is another approach that may reduce the impact of legacy bias.
Partners: inviting users to participate in elicitation studies in groups, rather than individually, can be another approach to overcoming legacy bias.
68. • Shortcomings
• Manual (variable) classification of gestures
Source: Radu-Daniel Vatavu, The Dissimilarity-Consensus Approach to Agreement Analysis in Gesture Elicitation Studies. CHI 2019: 224
"It is possible to design a highly guessable symbol set by acquiring guesses from participants."
"Participants are first recruited to propose symbols for specified referents within a given domain. The more participants, the more likely the resulting symbol set will be guessable to external users."
"The goal is to obtain a rich set of symbols from which to create the resultant symbol set."
69. • Shortcomings
• Manual (variable) classification of gestures
N=30 children, referent: "Scratch like a cat!"
➢ Grouping criterion #1: same hand → 39% agreement
➢ Grouping criterion #2: same hand & body pose → 19.6%
➢ Grouping criterion #3: same hand & body pose & pattern of movement → 12.2%
Source: Radu-Daniel Vatavu, The Dissimilarity-Consensus Approach to Agreement Analysis in Gesture Elicitation Studies. CHI 2019: 224
70. • Shortcomings
• Automatic classification of gestures
1. A dissimilarity function for gestures Δ(g1, g2)
➢ for example, the Euclidean distance
➢ the Dynamic Time Warping cost function
➢ or any distance function that takes two gestures as input and returns a real, positive value of how dissimilar they are
2. A threshold (τ, tau) for values computed by Δ
Source: Radu-Daniel Vatavu, The Dissimilarity-Consensus Approach to Agreement Analysis in Gesture Elicitation Studies. CHI 2019: 224
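A sketch of both ingredients, assuming gestures have been resampled to the same number of 2-D points (the function names and the τ value are illustrative, not taken from the paper):

```python
from math import dist  # Euclidean distance between two points (Python >= 3.8)

def dissimilarity(g1, g2):
    """Mean point-wise Euclidean distance between two resampled gestures.

    g1, g2: equal-length sequences of (x, y) points.
    """
    return sum(dist(p, q) for p, q in zip(g1, g2)) / len(g1)

def same_consensus_group(g1, g2, tau=1.5):
    """Two gestures agree when their dissimilarity falls below threshold tau."""
    return dissimilarity(g1, g2) <= tau

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
shifted = [(0, 1), (1, 1), (1, 2), (0, 2)]  # same square, translated by 1 in y
dissimilarity(square, shifted)  # -> 1.0, below tau, so the two gestures agree
```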
71. • Shortcomings
• Automatic classification of gestures
Source: Radu-Daniel Vatavu, The Dissimilarity-Consensus Approach to Agreement Analysis in Gesture Elicitation Studies. CHI 2019: 224
73. • Shortcomings and variants
• Gesture elicitation distributed in time and space: GestMan
Source: GestMan: a cloud-based tool for stroke-gesture datasets
75. • Shortcomings and variants
• Some limbs are privileged
Source: Villarreal et al., A Systematic Review of Gesture Elicitation Studies: What Can We Learn from 216 Studies?
Body parts:
1. The Individual Frequency of Body Parts (IFBP)
2. The Combination Frequency of Body Parts (CFBP)
76. • Shortcomings and variants
• Eliciting other symbols than gestures, other modalities
Source: Villarreal et al., A Systematic Review of Gesture Elicitation Studies: What Can We Learn from 216 Studies?
77. • Shortcomings and variants
• Eliciting more than one symbol
Source: Villarreal et al., A Systematic Review of Gesture Elicitation Studies: What Can We Learn from 216 Studies?
Multiple symbols could be combined to trigger a function by relying on hierarchical structure and congruence.
78. • Conclusion
• An elicitation study is a practical tool for eliciting (symbol) proposals from participants for commands, icons, shortcuts, gestures, vocal commands, etc.
• Efficient, natural
• Subject to manual classification
• Agreement rates are not always certain (other scores exist)
• Legacy bias
• Continuity from elicitation to recognition
• Subject to context variability (U,P,E)
79. • Conclusion
• Many GES already exist, but no consolidation of this knowledge!
(Chart: frequency of gesture elicitation studies per year of publication, 2009-2019, growing linearly: y = 3.8909x - 4.1636, R² = 0.9052)
80. How to get the best gestures from people?
Considered so far:
• Human factors: preference, sketching time, delete operations, agreement, (dis)similarity
• System factors: recognition rate
More to come and to consider:
• Human factors: hedonic value, memorability, naturalness, discoverability, consistency, congruence, …
• System factors: recognition rate, execution time, computational complexity, more measures…