Publication: Faisal Taher, Jason Alexander, John Hardy, and Eduardo Velloso. 2014. An Empirical Characterization of Touch-Gesture Input-Force on Mobile Devices. In Proceedings of the Ninth ACM International Conference on Interactive Tabletops and Surfaces (ITS '14). ACM, New York, NY, USA, 195-204. DOI=http://dx.doi.org/10.1145/2669485.2669515
Abstract:
Designers of force-sensitive user interfaces lack a ground-truth characterization of input force while performing common touch gestures (zooming, panning, tapping, and rotating). This paper provides such a characterization firstly by deriving baseline force profiles in a tightly-controlled user study; then by examining how these profiles vary in different conditions such as form factor (mobile phone and tablet), interaction position (walking and sitting) and urgency (timed tasks and untimed tasks). We conducted two user studies with 14 and 24 participants respectively and report: (1) force profile graphs that depict the force variations of common touch gestures, (2) the effect of the different conditions on force exerted and gesture completion time, (3) the most common forces that users apply, and the time taken to complete the gestures. This characterization is intended to aid the design of interactive devices that integrate force-input with common touch gestures in different conditions.
1. An Empirical Characterization of Touch-Gesture Input-Force on Mobile Devices
Faisal Taher
Jason Alexander
John Hardy
Eduardo Velloso
GHOST project, funded by the EU
ACM ITS 2014
Interactive Tabletops and Surfaces
2. Research Aim
• Characterization of gesture input-force
– Touch-screen devices.
– Common touch gestures.
– Various contexts.
• Aid the design of force-sensitive input devices
– Realize force ranges and tolerances.
– Effects of context.
– What does gesture force look like?
3. Research Aim
• User evaluations
– 2 controlled studies: 14 and 24 participants.
– 1st study – baseline characterization.
– 2nd study – effect of various factors.
• A large number of factors to examine!
7–8. Studies Set-up
• Gesture application
– Pinch-in, pinch-out, panning, tapping, rotating, and typing.
• FingerTPS force sensing equipment.
[Photo: glove with force sensor]
• Control application to process sensor readings.
9. Studies Set-up
• Procedure:
• Conditions:
– Study 1: Seated, Tablet, Untimed.
– Study 2:
• Form factor: Mobile phone vs. Tablet.
• Position: Sitting vs. Walking.
• Urgency: Timed tasks vs. Untimed tasks.
• 2×2×2 factorial design.
• We recorded force and time readings of individual gestures.
10–12. Results – Profiles
• Force profiles
– Force and time normalized.
– High-level comparison.
• STUDY 1 – baseline profiles
– Typing and Panning: similar shapes.
– Tapping: more variation.
– Two-finger gestures were more interesting.
13. Results – Profiles
[Figure: Study 1 normalized force profiles (force vs. time, both axes 0–1) for the index finger and thumb during the Rotate, Zoom-in, and Zoom-out gestures.]
15–17. Results – Profiles
• Study 2 – effect of form factor, position, urgency
– Typing, tapping and panning:
• Generally the same shape.
• Slight variations for typing and panning.
• Tapping remained the same.
– Rotation, Zoom-in and Zoom-out:
• Varied thumb force profiles.
• Index finger was the same (except Zoom-out).
• Thumb profiles harder to predict.
• More force when fingers are further apart.
18. Results – Profiles
[Figure: Study 2 normalized force profiles (force vs. time, both axes 0–1) for the index finger and thumb during the Zoom-in, Zoom-out, and Rotate gestures, shown alongside the Study 1 profiles.]
19. Results – Factors
• Form factor: Mobile vs. Tablet
– Participants pressed harder on a tablet.
– Gestures affected: Typing, panning, zooming, rotating.
– Individual gestures took longer on the mobile phone.
– Gestures affected: Typing, panning, zooming.
• Position: Sitting vs. Walking
– Participants pressed harder whilst walking.
– Gestures affected: Tapping, panning, zooming, rotating.
• Urgency: Timed vs. Untimed
– Participants pressed harder whilst typing during the timed condition.
20. Results – Force Averages
[Bar chart: average force (grams) by gesture / finger. Gestures on the x-axis: Typing, Tapping, Panning, Rotating (index), Rotating (thumb), Zoom-in (index), Zoom-in (thumb), Zoom-out (index), Zoom-out (thumb); y-axis: average force, 0–80 grams.]
21. Results – Time Averages
[Bar chart: average time (milliseconds) by gesture / finger. Gestures on the x-axis: Typing, Tapping, Panning, Rotating (index), Rotating (thumb), Zoom-in (index), Zoom-in (thumb), Zoom-out (index), Zoom-out (thumb); y-axis: average time, 0–350 milliseconds.]
23. Further Explorations
• Factors:
– Device lag.
– Target size.
– Situated devices.
– Complex gestures: multi-finger panning, bi-manual interactions.
• Develop empirical models to predict gestures based on input-force.
24. Summary
• More force applied on tablet.
• More force applied whilst walking.
• Force profile identification.
• More force applied when index and thumb are further apart.
• Thumb force profiles are harder to predict.
• Index finger is more dominant / consistent during index-thumb gestures.
31. Key points
• High-level profiles of common touch gestures.
• Aid gesture recognition.
• Aid device calibration.
• Effects of context.
Editor's Notes
The key objective is to develop an understanding of the input force that we apply whilst carrying out common touch gestures on touch-screen devices. We're also interested in the effect of context on input force, so for example, do we press harder whilst playing a game, compared to selecting an icon?
The phenomenon we’re interested in is the force – which changes. The area is constant
Before we proceed, I'd just like to clarify why we use force rather than pressure. Pressure is a measurement that takes into account the surface area where force is being applied, but we're looking at pressing force from users' fingers independent of surface area, so we use the term force.
The goal here is to inform the design of force-sensitive devices and answer questions like: How much force do users apply when they are typing? Does it matter if it’s a phone or tablet? What if they are walking? This enables designers to identify different gestures, and develop devices with specific force tolerances and for specific contexts.
Different forces – different functions
We’re interested in how hard users press when they carry out touch gestures, like pinching or tapping, on interactive devices. Also, does context have an effect? So does walking make you pinch harder on a phone? Do you press harder when under time pressure? This research is designed to inform the design of force-sensitive devices, so that we can develop devices that can distinguish between a normal swipe, and a forceful swipe. This is based on the premise that, for example, when reading an e-book, a normal swipe moves a page, whereas a hard swipe turns the page.
To achieve this we carried out two user evaluations with 14 and 24 participants in controlled lab-based conditions. The overall aim of the first study was to establish a baseline characterization of input force, and the second study aimed to examine the effect of different factors. I will discuss this in more detail next. Of course, there are a number of possible factors, so we had to narrow it down and define a scope.
For instance, there are several types of configuration, situational factors, interaction styles or gestures, content related factors, and whether users are just browsing, playing a game, or texting someone.
As we are interested in mobile devices and typical mobile device interactions, we selectively chose factors that applied to our research. Firstly, for user position, we looked at walking versus sitting down. For urgency, we looked at how urgent and non-urgent conditions affect interaction. We picked the most common touch gestures that people use on touch-screen devices, and finally, we experimented with a mobile phone and a tablet.
We developed a web application that enables users to carry out the six gestures, which are pinch-in, pinch-out, panning, tapping, rotating, and typing. We separate typing from tapping as typing is normally restricted to a dense button set, essentially fast tapping in a very specific context.
These gestures were captured by force sensors, which study participants were asked to wear. These consisted of finger gloves with a sensor attached to the end. These gloves were attached to a Bluetooth emitter and the sensor readings were picked up and processed by a C# application.
The general procedure of the user studies consisted of collecting demographic data, calibrating the force sensors to cater for differences in finger sizes, a training phase where participants learnt how to carry out the tasks. Following this they were then asked to perform the tasks on the web application.
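The calibration step could be sketched as follows. This is a minimal illustration assuming a linear sensor response and a single known reference force; the talk does not describe the FingerTPS calibration in code form, and `calibration_factor` and `to_grams` are hypothetical helpers.

```python
def calibration_factor(raw_reading, known_force_grams):
    """Derive a grams-per-unit scale from one press of a known force.

    Assumes the sensor's raw output is linear in applied force,
    which is a simplification for illustration only.
    """
    return known_force_grams / raw_reading

def to_grams(raw_reading, factor):
    """Convert a raw sensor reading to grams using the derived factor."""
    return raw_reading * factor

# Hypothetical example: a 100 g reference press reads 250 raw units,
# so each raw unit corresponds to 0.4 g for this participant's finger.
factor = calibration_factor(250, 100)
print(to_grams(500, factor))  # a later reading of 500 units -> 200.0 g
```

Calibrating per finger in this way would account for the differences in finger size mentioned above, since each finger gets its own scale factor.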
The first study, which aimed to achieve baseline characterization of input force, involved participants sitting down, using a tablet and carrying out the tasks with no time pressure. In the second study, we introduced different conditions like form factor, position and urgency, thereby using a 2 by 2 by 2 factor design. Urgency was invoked by putting a timer on the gesture tasks, purposely set to a short time.
Whilst participants were carrying out the gestures, we recorded input force in grams and elapsed time in milliseconds. Each individual gesture was captured.
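The condition space and the per-gesture recordings described above could be sketched like this. The factor labels mirror the talk; the record layout is an assumption, not the study's actual data format.

```python
from itertools import product

# The three two-level factors from study 2.
FORM_FACTORS = ["phone", "tablet"]
POSITIONS = ["sitting", "walking"]
URGENCY = ["timed", "untimed"]

# Cartesian product gives the eight cells of the 2x2x2 factorial design.
conditions = list(product(FORM_FACTORS, POSITIONS, URGENCY))
print(len(conditions))  # 8

# Hypothetical record for one captured gesture: the condition it was
# performed under, the force samples in grams, and elapsed time in ms.
record = {
    "condition": conditions[0],
    "force_g": [12.0, 35.5, 60.2, 41.0, 9.8],
    "time_ms": 320,
}
```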
Following the first study, we developed force profiles for each gesture, which consisted of combining all the force and time values and normalizing them so that we could compare them. We were specifically interested in their shape – for example: what does a panning gesture look like – is it a simple rise to a peak and then a drop? Or does it have more variation?
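The normalization step might look like the following sketch, assuming each gesture is captured as (time in ms, force in grams) samples. Both axes are min-max scaled to [0, 1] so profiles with different durations and peak forces can be overlaid and compared by shape.

```python
def normalize_profile(samples):
    """Min-max normalize a gesture's (time_ms, force_g) samples to [0, 1]."""
    times = [t for t, _ in samples]
    forces = [f for _, f in samples]
    t0, t1 = min(times), max(times)
    f0, f1 = min(forces), max(forces)
    return [((t - t0) / (t1 - t0), (f - f0) / (f1 - f0))
            for t, f in samples]

# Hypothetical panning gesture: a simple rise to a peak, then a drop.
profile = normalize_profile([(0, 5), (100, 40), (200, 80), (300, 30), (400, 5)])
```

After normalization the peak of this example sits at (0.5, 1.0), which makes the "simple rise and drop" shape directly comparable across gestures of different lengths.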
We found that typing and panning have very similar force profiles, whereas tapping has more force variation around the peak region.
As you can see here, the typing and panning profiles are very similar, consisting of a simple rise and drop. Tapping on the other hand, has a slightly prolonged peak showing minor variation of force. We believe that this is caused by a slight delay in releasing the finger as users wait for feedback that a tap has occurred.
For two-finger gestures such as rotation and pinch to zoom, we found that the force profiles of the index finger was more consistent than the thumb. We attribute this to the dominance of the index finger.
As you can see here there is less variation for the index finger compared to the thumb, which produced different force profiles for the rotate and zoom-out gestures.
The pinch to zoom force profiles also correlate in that more force is applied when user fingers are further apart, meaning that users apply more force to maintain contact with the display. It’s clear that as the fingers move closer to each other, force decreases.
This also explains why the rotation gesture doesn't completely lean towards one direction: the thumb and index finger are consistently apart.
The second study looked at the effects of form factor, mobile and tablet, position, sitting and walking, and urgency, timed and untimed tasks.
In terms of force profiles, the overall curves for typing and panning were slightly different to study 1, whilst tapping remained the same.
As you can see here, typing had a slightly steeper rise to its peak in study 2. The panning profile in study 2 was also more prolonged.
In terms of the rotation, zoom-in and zoom-out gestures, we observed similarities to study 1 in that the thumb force profiles were more varied compared to the index finger profiles.
The index finger profiles for the rotate and zoom-in tasks remained the same as study 1, but the index finger profile for zoom-out required modification. All of the thumb force profiles had to be modified as they did not closely match those from study 1.
This study indicated that thumb gestures are harder to predict in comparison to the index finger and there is more variation.
This study also reinforced the notion that users press harder and with more varying force when their index finger and thumb are apart, like during the rotation tasks.
Our results also showed that participants pressed harder on a tablet whilst they were typing, panning, zooming and rotating. We believe this is due to the larger size and weight of a tablet, requiring users to use more force for balance and control.
The typing, panning and zooming gestures also took longer on a mobile phone. The smaller screen size can make users more cautious whilst typing for example, on a smaller keyboard.
Participants also pressed harder whilst walking, for tapping, panning, zooming, and rotating gestures. We believe that users press harder to compensate for the device vibrations that occur whilst walking.
Also, and surprisingly, we found that under timed conditions, which intended to create a sense of urgency, participants only pressed harder for the typing task.
As a general overview, here are the average forces, in grams, applied for each gesture. There is more detail in the paper, showing the actual distribution, but we thought it would be interesting to just look at the high level mean values. We can see certain trends like participants only applied a small amount of force whilst typing.
Participants generally pressed hardest whilst carrying out the tapping task.
It was also clear that during two-finger interactions, the index finger applied more force than the thumb.
In terms of time taken to carry out an individual gesture, typing involves very quick presses, as we all know. Tapping takes a little bit longer, as we tend to wait for feedback that the target has been selected. Panning is also a slightly longer action as it is typically a dragging action. During the studies users were essentially scrolling for a keyword.
The first two are quite discrete gestures and take a shorter amount of time to complete whereas the rest are more continuous gestures which will naturally take longer
But of course, there are certain limitations. For instance, our studies were carried out in a controlled setting and not in the wild. We developed a specific set of tasks; although they were designed to be generalisable, we could not capture all potential scenarios. Also, wearing force-sensing equipment can alter how users normally interact with phones and tablets.
We do intend to carry on with our research and explore other factors, such as device lag, which can be a common problem for slightly older handsets; different target icon/button sizes; and table-top or wall-mounted displays. We are also interested in exploring more complex gestures like multi-finger panning.
Another avenue of further work would be to carry out more trials to develop empirical models that can predict force interactions.
In summary we found that:
More force is generally applied on a tablet, compared to a phone, and also whilst walking.
There is high similarity in force profiles for different gestures such as typing and panning, even though they are completely different gestures used for different outcomes.
It was clear that the index finger was more consistent and the thumb creates more variable force, therefore it’s harder to predict thumb input-force.
This research demonstrates the first step towards understanding the forces we apply on mobile devices.
Here are some places where we believe that our results can be applied.
Our results can aid the design of force-sensitive mobile devices, so for example, being able to differentiate between a forceful zoom vs. normal zoom. Designers can then apply a different function to a forceful zoom.
Our results can also aid gesture recognition, through the understanding of force ranges, and also the general profile of gestures.
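As an illustration of how stored profiles could aid gesture recognition, a recognizer might compare an incoming normalized force profile against per-gesture templates. This is a sketch only: the template values below are invented for the example and are not the paper's measured profiles.

```python
def resample(profile, n=5):
    """Pick n evenly spaced samples from a normalized force profile."""
    step = (len(profile) - 1) / (n - 1)
    return [profile[round(i * step)] for i in range(n)]

def distance(a, b):
    """Mean squared distance between two equal-length force vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

# Hypothetical normalized force values at five time points.
TEMPLATES = {
    "tap": [0.0, 0.9, 1.0, 0.9, 0.0],  # prolonged peak
    "pan": [0.0, 0.5, 1.0, 0.5, 0.0],  # simple rise and drop
}

def classify(profile):
    """Return the template gesture closest to the observed profile."""
    forces = resample(profile)
    return min(TEMPLATES, key=lambda g: distance(forces, TEMPLATES[g]))

print(classify([0.0, 0.4, 0.6, 1.0, 0.6, 0.4, 0.0]))  # -> pan
```

A real recognizer would likely also use the force ranges reported in the paper (for example, a higher tolerance for tablet input) rather than shape alone.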
The findings of how different contexts can affect input-force can also help, for example, calibrate devices such that tablets have a higher force tolerance to mobile devices.
Finally, our findings could inform new types of devices, like flexible displays.