How People Really Hold & Touch (their phones)


Despite decades of research, and years of everyone carrying touchscreen handsets around, there are still plenty of myths, misinformation, and half-truths about how touchscreens work, how users actually interact with touch devices, and how best to design for touch.

Participants in this session will get research findings and other data that clarify and set aside misunderstandings about user behavior and touchscreen technology. You’ll learn the different types of touch interactions, building a solid base of knowledge you can then use to evaluate how behavior and interaction should influence design patterns and design choices.

  • What do we know about how people interact with touchscreen mobile devices? …
  • We think that…
  • 1) The iPhone is the perfect size for one handed use, so all right-thinking people carry an iPhone, and use it with one hand, in portrait, tapping with their thumb.
  • 2) That people’s fingers vary in width, so any touchscreen needs to account for that.
  • 3) But somehow also that a 44 point (Apple points here) box is the ideal touch target. And so on.
  • Wrong, irrelevant, and wrong. We make assumptions, use hearsay, and apply personal biases. We need to stop this. There's research. If you don't know, look around or ask. If the answer is not out there, do the research yourself.
  • I did some. I found that there wasn’t good research, but lots of assumptions about how people held devices, so I looked at 1,333 people holding their phones, and found this. People hold their phones in lots of different ways. Under half are one-handed, using their thumb.
  • So the traditional view of what holding a phone means, with easy, harder and impossible areas to touch…
  • Becomes a lot more complex. If you believe everything important is at the top of the screen, it might not be easily selectable. BUT… if you follow other guidelines and put dangerous or rarely used items in the hard-to-reach area at the top of the screen, that may not work either, because people shift their hands.
  • And even for one handed use, people move around to reach where they need.
  • And if that isn’t enough to drive you crazy, people shift. Between two handed for typing… and cradling for general tapping and scrolling, for example. I don’t have a simple answer yet, but think about people, their contexts, needs and expectations instead of assuming any one thing about how they will reach and tap.
  • Speaking of tapping, what you most need to remember is that there are THREE facets of touch, not one. Sorry, you cannot remember a single number and make sure touch targets are that size.
  • Visual targets are important. As much as no-affordance interfaces and secret gestures are a trendy way to insist you are making delightfully surprising experiences, making simple click targets makes everything just work. Visual targets must: Attract the user's eye. Be drawn so that the user understands they are actionable elements. Be readable, so the user understands what action they will perform. Be large and clear enough that the user is confident they can easily tap them.
  • Many of us are failing to do this. When you make a button, or a line item with nice separators or colored backgrounds, but make only the text the clickable item, you have failed. Even if the words look like links, the row is also perceived to be a link. The user sees a line item, or a button, and will try to tap the whole area, and you make sure they miss by making the target something smaller. Stop it.
  • How big should words, buttons and lines be? It depends: on how far away the device is from the user. For visual targets, no absolute size is ever correct.
  • Angular resolution matters, and that’s calculated based on the distance between the screen and the viewer’s eyeballs. Get your cameras out…
  • Because I did the math for you, and here are some guidelines. This is all already published in an article you can google, and I will share this deck so you don't have to take photos.
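The arcminute math behind these guidelines can be sketched in a few lines. The 3438 constant used for this sort of calculation is just arcminutes per radian (180 × 60 / π), so an object of size l at distance d subtends V ≈ 3438 · l / d arcminutes. A minimal sketch; the ~300 mm viewing distance in the example is my own assumption of a typical phone-holding distance:

```python
import math

# ~3438: arcminutes per radian, the constant in the visual-angle formula
ARCMIN_PER_RADIAN = (180 * 60) / math.pi

def visual_angle_arcmin(size_mm: float, distance_mm: float) -> float:
    """Visual angle (arcminutes) subtended by size_mm viewed from distance_mm.

    Small-angle approximation: V ~= 3438 * l / d.
    """
    return ARCMIN_PER_RADIAN * size_mm / distance_mm

def min_size_mm(angle_arcmin: float, distance_mm: float) -> float:
    """Smallest physical size that still subtends angle_arcmin at distance_mm."""
    return angle_arcmin * distance_mm / ARCMIN_PER_RADIAN

# Example: 2.1 mm text at an assumed ~300 mm phone distance
# subtends roughly 24 arcminutes.
print(round(visual_angle_arcmin(2.1, 300), 1))
```

The same inverse calculation is how a table of minimum visual sizes per device class falls out: pick a minimum comfortable angle, plug in the typical viewing distance for each device.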
  • When you see a typical mobile handset or tablet these days, what do you see? A screen. Right?
  • The phone sees a lot more about you. And these are just the sensors that have a reasonable chance of being used as input devices. The screen turns on or off based on proximity, or, just now with some phones, by actually seeing your face. This and some other phones let you use kinesthetic hand gestures in mid air. I didn't include the magnetometer, GPS and GLONASS, temperature, humidity, camera, radios, and many, many more.
  • These are increasingly going to be important. Maybe next year I'll bring some info on designing for kinesthetic gesture. But today we're going to talk about the touchscreen itself. And since it's the most common thing, we're talking about capacitive touch. Resistive is the one where you simply apply pressure, and a grid of conductive leads makes contact, so the device knows which point you touched. These are still being built, even for consumer devices, like tablets or seatback entertainment systems and so on. Capacitive touch uses the electrical properties of your body. Your finger acts as a capacitor whose presence in the system can easily be measured by little nodes, in a grid, on several layers between the display and the protective plastic or glass. But it is not perfect. There is math, and interference, and tradeoffs in thickness, weight, cost, and optical clarity that get in the way of increased precision.
  • A year or two ago, Motorola put a handful of devices in a little jig so they could precisely, robotically control the pressure, angle and speed of touch sensing. This is some of them. Even the much-loved iPhone is imperfect, with notable distortion at the edges, and actually a total inability to sense all the way to the edge on some sides. Look at the stairstep pattern on the Droid. That's a problem with the interpolation that predicts the precise position between the sensors. The pitch of the steps is clearly the grid size.
  • Touch targets. First, the Apple guideline of 44 points.
  • No. Never. Pretend it never happened. When it was 44 px that was a problem, but now that Apple makes many devices spanning a range of physical sizes, the obvious problem arises: pixels and points change size, but people's fingers do not have resolution. Our fingers are always the same size. You need to measure, specify, and when possible code, in physical sizes: inches, millimeters, points, picas, twips, or whatever is available and you are happy with.
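To see why a fixed point value cannot be a physical guideline, convert Apple points to millimeters for a given panel. The formula is trivial; the 326 ppi @2x and 132 ppi @1x densities below are illustrative values I picked for a Retina-era phone and an early iPad, so check real device specs before relying on them:

```python
MM_PER_INCH = 25.4

def apple_points_to_mm(pts: float, ppi: float, scale: float) -> float:
    """Physical size of an Apple-point dimension on a given screen.

    One Apple point = `scale` physical pixels; ppi is the panel density.
    """
    return pts * scale / ppi * MM_PER_INCH

# Illustrative (assumed) densities: the same 44 pt target is a
# different physical size on each panel.
phone_mm = apple_points_to_mm(44, ppi=326, scale=2)   # ~6.9 mm
tablet_mm = apple_points_to_mm(44, ppi=132, scale=1)  # ~8.5 mm
print(round(phone_mm, 1), round(tablet_mm, 1))
```

The roughly 1.5 mm difference is exactly the point: a constant in points is not a constant on fingers.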
  • Not incidentally, an Apple point is an insulting lie. There’s a physical unit of measure called a point, or typographers point, which is very convenient for this sort of stuff. It has NO relation to the new Apple point. I use typographers points here, so don’t get confused.
  • As it turns out, it's not really important how big our fingers are, except insofar as they obscure part of the screen, which is something else. Our finger squishes against the screen and only the part that gets flattened is detected. My research indicates this is pretty much the same for everyone. Children, for example, press really hard, so they have a larger relative contact patch. There is some variability based on task, too, as people can use their fingertips and press lightly.
  • And it hardly matters, because phones mostly don't pick up the size, and work exclusively with the centroid of the contact. No, nothing like this.
  • The centroid is just the geometric center of an area. The way the electrical conductivity of the capacitive touch screen works, the part that is always sensed is the centroid of that contact patch. What matters is the Circular Error of Probability or the pointing accuracy of people with their fingers. There’s a bit of a range here, depending on the user’s attention, care and the environment in which they operate. Not to mention the ability of the user themselves.
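A centroid is easy to compute. This toy sketch, with a made-up `readings` format standing in for whatever a real touch controller reports, takes the average of node positions weighted by signal strength:

```python
def centroid(readings):
    """Geometric center of a contact patch, weighted by signal strength.

    `readings` is a list of (x, y, strength) tuples, a hypothetical
    stand-in for capacitance values sampled from the sensor grid.
    """
    total = sum(s for _, _, s in readings)
    x = sum(x * s for x, _, s in readings) / total
    y = sum(y * s for _, y, s in readings) / total
    return (x, y)

# A symmetric patch of three nodes centers on the middle node:
print(centroid([(0, 0, 1.0), (1, 0, 2.0), (2, 0, 1.0)]))  # (1.0, 0.0)
```

However large or squished the contact patch, the reported coordinate collapses to this single point, which is why contact-patch size matters so much less than pointing accuracy.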
  • What really matters is interference. Why are you worried about touch targets and taking notes right now? To avoid accidental clicks. Interference is what happens when two or more targets are close enough together that they all fall into a single CEP. The user might hit any of them with a single selection. If you can only remember one number to check to assure your design is touch-friendly, make it this one. 8 mm if you have to, 10 mm if you possibly can. More is generally better if you have the room.
  • Defining it as spacing between buttons won't do it. Your link or button size is so variable that what you need is a guideline for interference alone. As you see, you can check for it digitally, if you set Photoshop or Fireworks to the right pixel density. Hint: it's NEVER 72 dpi. It's different for every device.
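Converting the interference guideline to pixels for a given pixel density is a one-liner. The `interferes` helper below is a hypothetical check I added for illustration, measuring center-to-center distance in pixels against the 8 mm minimum:

```python
MM_PER_INCH = 25.4

def mm_to_px(mm: float, ppi: float) -> float:
    """Physical millimeters to pixels at a given pixel density."""
    return mm / MM_PER_INCH * ppi

def interferes(center_a, center_b, ppi, min_mm=8.0):
    """True if two targets' centers (in px) are closer than min_mm.

    Interference is measured on center, per the guideline:
    8 mm minimum, 10 mm preferred.
    """
    dx = center_a[0] - center_b[0]
    dy = center_a[1] - center_b[1]
    return (dx * dx + dy * dy) ** 0.5 < mm_to_px(min_mm, ppi)

# On a 326 ppi screen, 8 mm is ~103 px, nowhere near a 72 dpi assumption.
print(round(mm_to_px(8, 326)))  # 103
```

The same conversion gives you the correct "pixel density" to set in Photoshop or Fireworks so an 8 mm circle overlay measures true on screen.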
  • Better is to work at device scale directly. Draw on printouts…
  • …or put your designs on the actual device screen and measure it directly to make sure. You can use the 8 and 10 mm circles from any old circle template you get at the art supply store (or these days, Amazon), but I made up my own little tool I keep in my pocket, because this is so important.
  • Since everyone loves actionable information, what's better than numbers? Here are all the numbers I had in the deck in one place, and it might even be up for a minute so you can all take photos, if I am not out of time yet. I will be posting this to Slideshare in the next few minutes, as well, and all the info I presented is published with sources in a couple of articles, so look it up or ask me for the links.
  • What about tablets?
  • For touch, technology and human factors, you’ll see I already gave you the numbers. They are basically the same. For grasping? The best data I can get are from surveys, which of course are filled with bias and the few I have seen have tiny (or unspecified) numbers of responses. I am not even sure how to get better data as these are used in the home more than anywhere. If designing for an existing user base, I’d observe use in the classroom or office or wherever they are used and work off that data. If it’s solid, try to share it with us all.
  • But how about using the information we do have to help design? We can make a lot of general decisions, to settle on adaptive techniques to make sure scale and orientation are well accounted for across the range of devices. But more tactically, how do we decide where to put items on the page? Let's take something ubiquitous and heavily used. The Back button. On iOS it's way up in the corner…
  • … where the touch charts indicate that it cannot be reached for your one-handed user, with their thumb.
  • And Back is well used. Apps vary, and we have less good data, but this chart shows how it’s the most used common function of browsers. The other big pies are things like clicking a link. My experience is that Back is used, a lot. In all platforms. But how? I don’t recall anyone griping that it’s hard to hit Back on their iPhone, or expressing pleasure that they can reach it when they switch to Android where it’s conveniently at the bottom. How?
  • Maybe this. Maybe they shift their grip so they can reach anyway. Maybe they use both hands, or switch to their left hand for a moment. Or maybe this video…
  • Or maybe they cradle with the other hand so they can reach with the thumb. Luke Wroblewski shared this video with me, as he’s less scared of taking video in public.
  • I am not yet sure that these colorful thumb-sweep areas matter as much in the end. This is data from just one study, with a game used to detect touch precision. It's one that informed the touch target and interference size data above. But lately I have looked harder at some data. This is interesting. I see no pattern (and neither did the researchers) except that "edges are harder to touch." Which might just be that edge sensing is bad on many devices, as I showed you earlier.
  • I would like to reveal I have a Grand Unified Theory of Touch Positioning. But… I don't yet. For now, I think it's a balance, or a process. Consider architecture and information design needs for hierarchy of information. Make sure there's nothing catastrophically bad ergonomically, like touch areas far from the edge on a touchscreen desktop (people tend to hold the edge of the device when using touch). Check for secondary physical effects: if you have people tapping regularly along the middle or top of the screen, are they missing out on information below? Carousels often fall into this trap, with text about the item in the carousel BELOW the image, where it cannot be seen past your finger. Be consistent: internally, of course; between your various products if possible; and, within reason, following OS conventions.
  • If you miss these addresses, just Google my name and you'll find me. Fill out your yellow forms, and tell everyone how wonderful the talk was! I'll be here tonight, and all day tomorrow, if anyone wants to talk more, about anything at all; just find me.

    1. How People Really Hold & Touch (their phones) @shoobe01 #mLearnCon
    2. What do we know?
    3. What do we know? • iPhone is sized for one hand
    4. What do we know? • iPhone is sized for one hand • The width of fingers varies
    5. What do we know? • iPhone is sized for one hand • The width of fingers varies • 44 points is the ideal target size
    6. What do we know? • iPhone is sized for one hand • The width of fingers varies • 44 points is the ideal target size
    7. (image slide)
    8. (image slide)
    9. (image slide)
    10. (image slide)
    11. (image slide)
    12. • Visual targets • Touch targets • Interference
    13. Visual targets must: • Attract the user's eye. • Look like actionable elements. • Afford the specific action. • Be trustworthy.
    14. (image slide)
    15. (image slide)
    16. V = (3438)(l) / d
    17. Minimum Visual Targets: Text: 4 pt / 1.4 mm (2.5-inch phone), 6 pt / 2.1 mm (3.5–5-inch phone), 8 pt / 2.8 mm (9–10-inch tablet). Icons: 6 pt / 2.1 mm, 8 pt / 2.8 mm, 10 pt / 3.5 mm.
    18. Capacitive Touch Screen
    19. Proximity • Accelerometer • Gyrosensor • Light color • Gesture • Cover sensor • Light level • Capacitive Touch Screen
    20. (image slide)
    21. (image slide)
    22. 44 points
    23. 44 points
    24. (1 pt = 1/72 inch)
    25. (image slide)
    26. Centroid?
    27. Touch Targets: Minimum 17 pt / 6 mm • Preferred 23 pt / 8 mm • Maximum 43 pt / 15 mm
    28. Interference (measured on center): Minimum 23 pt / 8 mm • Preferred 28 pt / 10 mm
    29. (image slide)
    30. (image slide)
    31. (image slide)
    32. Minimum Visual Targets: Text: 4 pt / 1.4 mm (2.5-inch phone), 6 pt / 2.1 mm (3.5–5-inch phone), 8 pt / 2.8 mm (9–10-inch tablet); Icons: 6 pt / 2.1 mm, 8 pt / 2.8 mm, 10 pt / 3.5 mm. Touch Targets: Minimum 17 pt / 6 mm, Preferred 23 pt / 8 mm, Maximum 43 pt / 15 mm. Interference (measured on center): Minimum 23 pt / 8 mm, Preferred 28 pt / 10 mm.
    33. Tablets?
    34. (image slide)
    35. (image slide)
    36. (image slide)
    37. (image slide)
    38. (image slide)
    39. (image slide)
    40. (image slide)
    41. Design for Touch: • Follow IA, information, hierarchies • Ergonomically feasible • Don't obscure information • Consistency • Platform conventions
    42. Contact me for consulting, design, to follow up on this deck, or just to talk: Steven, 816 210 0455, @shoobe01, shoobe01