Research Traps: 7 ways of thinking that keep you from doing great customer research


  1. Research Traps: 7 ways of thinking that keep you from doing great customer research. Wendy Castleman, Principal User Research Scientist. To be presented at the UPA Conference in Portland, OR, June 2009
  2. Who am I? Experience Research
  3. RESEARCH TRAPS
  4. Mental Shortcuts
  5. Mental shortcuts help us • Make quicker decisions to take action faster…
  6. Those shortcuts become research traps
  7. Awareness
  8. Habit • False Consensus • Illusory Correlation • Recency • Congruency • Availability • Confirmation
  9. Meet Elsie
  10. What do you think happened? I lived here. I moved here. maps.google.com
  11. Habit We tend to do things the way we usually do
  12. New project? Sure, I can run a usability test!
  13. Habit Trap Can lead us to do the wrong research Your favorite research method may not be the best way to learn what you need to know
  14. The best way to find that out would be a site visit…
  15. How to Avoid the Habit Trap • Look at every project as unique • Consider what you need to learn • Identify the best method
  16. • If Yes = guess most people would agree • If No = guess most people wouldn’t agree
  17. False Consensus We tend to attribute our beliefs, thoughts and behaviors to others
  18. Hi guys! Wanna be in a study?
  19. False Consensus Trap Test with the wrong participants Other people may not think like you…
  20. Junior League of Palo Alto
  21. Avoiding the False Consensus Trap • Focus on the customer – spend time watching and talking • Test with people who aren’t like you
  22. What is the rule? 2, 4, 6, 8, ___ Hypothesis: Each value must be 2 higher than the one before. How do you test this hypothesis? Actual Rule: Each value must be any number bigger than the one before.
  23. Congruence Bias We jump to conclusions by only looking at one approach
  24. Try out our new idea for an iPhone application!
  25. Congruence Trap Insufficiently inform the design Only trying one solution may miss a better one
  26. Try out each of these iPhone applications!
  27. Avoiding the Congruence Trap • Test several different solutions • Test out what shouldn’t work
  28. Which card do you turn over? T 6 L 4 Hypothesis: The back of every “6” is an “L”. Did you say “T”?
  29. Confirmation Bias We have a tendency to search for data that confirms our expectations
  30. As I suspected! 25% of users can’t do that task!
  31. Confirmation Trap Get an incomplete picture of the data Seeking to prove our ideas can lead to missed surprises
  32. Quicken used by small business
  33. Avoiding the Confirmation Trap • Look for surprises, instead of what you expect • Test out what shouldn’t work • Consider independent evaluation
  34. In American English are there more: Words that begin with the letter “K”? Or Words where “K” is the third letter? There are 3x as many words with “K” in the 3rd position
  35. Availability Heuristic We have a tendency to put too much weight on what comes easily to mind
  36. Availability Trap in Research …a lot of people had trouble with that task…
  37. Availability Trap Draw inaccurate conclusions What comes to mind most easily may not be the most important or most frequent finding.
  38. Hmm… I didn’t realize that happened so often…
  39. Avoiding the Availability Trap • Gather key usability metrics – (task success, specific error counts, time) • Don’t rely on your memory • Look at all of the data – Encourage others to do the same
  40. Last 3 movies Prior 3 movies
  41. Recency Bias We tend to put too much weight on what we saw most recently
  42. Recency Bias in Research
      Participant 1: Click “Done”
      Participant 2: Click “Continue”
      Participant 3: Click “Done”
      Participant 4: Click “Continue”
      Participant 5: Click “Cancel”
      “Most people clicked Continue”
  43. Recency Trap Draw inaccurate conclusions What you saw most recently may not be the most important or most frequent finding.
  44. Hmm… I didn’t realize that happened so often…
  45. How to Avoid the Recency Trap • Gather key usability metrics – (task success, specific error counts, time) • Don’t rely on your memory • Look at all of the data
  46. Happiness vs. Weight
  47. Illusory Correlation The tendency to find patterns where none exist
  48. This is the third guy who uses a laptop in his living room. Maybe all men use laptops in their living rooms
  49. Illusory Correlation Trap Draw inaccurate conclusions Things that co-occur may not be related.
  50. That’s the fourth man who has bought this version. I need to find out how many men buy this…
  51. Avoiding the Illusory Correlation Trap • Recognize the limitations of your research methods • Verify magnitude estimations and correlations with large-scale quantitative studies
  52. Habit • False Consensus • Illusory Correlation • Recency • Congruency • Availability • Confirmation
  53. Ways to avoid the traps… Planning 1. Look at every project as unique 2. Consider what you need to learn 3. Identify the best method 4. Test with people who aren’t like you 5. Test several different solutions 6. Test out what shouldn’t work
  54. Ways to avoid the traps… Conducting 1. Look for surprises, instead of what you expect 2. Gather key usability metrics 3. Consider independent evaluation
  55. Ways to avoid the traps… Analyzing 1. Don’t rely on your memory 2. Look at all of the data 3. Recognize the limitations of your research methods 4. Verify magnitude estimations and correlations with large-scale quantitative studies
  56. http://Deepunderstandings.blogspot.com QUESTIONS? Email me: Wendy_castleman@intuit.com

Editor's Notes

  • My name is Wendy Castleman. I am a wife and a mother. My background is psychology, but now I work as an experience design researcher at a company called “Intuit”… where I do research to help us design solutions that empower individuals and businesses to achieve their dreams.
  • In my time as a researcher, I’ve made lots of mistakes that have limited the impact of the research I’ve done. I’m here to talk with you about some common research traps, how to recognize them and avoid them.
  • It turns out that YOU and I and everyone else around here all have some built-in mental shortcuts that help us think efficiently.
  • This efficient thinking is really useful for our survival. Like, if you are being chased by a bear, you don’t have all the time in the world to try out different escape options to find the best one. You just need to try something right away…
  • But, these same shortcuts become barriers for research. In order to do great research to inform the design of delightful solutions, we need to break away from our natural biases and get beyond the obvious to the underlying truths.  This talk will focus on some of the most common mental barriers, or cognitive biases, that can interfere with doing great customer research.  
  • It doesn’t matter whether you are new to the field or very experienced… we are all at risk of falling for these traps. Fortunately, a raised awareness of the traps can help us successfully avoid them.
  • Today we are going to go over 7 research traps, understand what they are, why they happen and how to avoid them.
  • These are the 7 traps, but don’t worry… I’ll go through them one at a time.
  • Now, I’m not going to admit that I’ve let habit come into play when doing research… nor would I say that it hasn’t… So, drawing from Child Psychology Therapy Sessions, I’m going to use a puppet to tell the story of how some usability researchers may handle these biases. So, Elsie will show us examples throughout this talk.
  • Let me tell you a story. This is a picture of the street that I live on. For the past year, I lived here. I just bought the house across the street and moved here. Now, driving home after a late night at work… what do you think happened? YES! I drove to the wrong house. Why? Because I was going on auto-pilot… force of habit. I wasn’t thinking, I was just doing what I always do.
  • Habit comes into play with research. We often don’t really think about the type of research we should do, but instead do the same type of research that we usually do. So, Elsie spends a lot of time in the lab doing usability tests… and when someone comes to her with a new project, she just jumps straight to doing another usability test.
  • The problem is that all research methods are not the same. By doing what Elsie always does, a usability test, she might end up not learning the key type of information that the project team needs to know. Have you ever done a test that ended up not being that helpful? Maybe it was because you fell for the Habit trap.
  • With her new raised awareness, now Elsie asks the team a number of questions and figures out which method is most appropriate.
  • Ross and colleagues asked students to walk around campus with a sign saying ‘Eat at Joe’s’. Those who agreed said that 62% of other people would agree to carry the sign. Those who disagreed said that 67% would not carry the sign. Ross, L., Amabile, T. M., & Steinmetz, J. L. (1977). Social roles, social control and biases in social perception. Journal of Personality and Social Psychology, 35, 485-494.
  • Example for “False Consensus” is pretty much any product team I have worked on: the engineers and tax dev folks have a tendency to think they are just like the customer and know the problem. They get all wrapped up in the UI design, when they are clearly not accountants! Not sure how you can turn that into a story without incriminating people, though. :)
  • There are lots of ways False Consensus can come into play with usability… for example, conducting a cognitive walkthrough and assuming that your insights are the same as a target user. However, another way we fall for this trap is by recruiting participants who are not quite the target audience. Ever invite friends and family to participate in a study? Do your friends and family think like you do? You might have fallen for the False Consensus trap.
  • Quicken usability testing was conducted with first-time PC users, including the Palo Alto Junior League in 1984.  Intuit’s ambitious goal:  to have complete PC novices up and running on Quicken within 15 minutes.  (Courtesy of Virginia Boyd.) 
  • Because of the confirmation bias, we tend to fall victim to another bias… the congruence bias. Take this example. You see a sequence of numbers and are tasked with identifying the rule. Your hypothesis is that these are a sequence of even numbers, increasing by 2. If you fill in the next value to determine the accuracy of your hypothesis, you are likely to put in the number 10. But this is a direct test. It is possible that 10 would be okay but your hypothesis is not true. If you test the indirect hypothesis, you might choose 9. Because, if your hypothesis is true, then 9 should be FALSE.
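  To make the direct-versus-indirect distinction concrete, here is a minimal Python sketch; the rule functions and test sequences are illustrative assumptions, not part of the original exercise.

```python
# A sketch of the 2, 4, 6, 8 exercise: a direct test can only confirm the
# guess, while an indirect test can actually falsify it.

def actual_rule(seq):
    # The real rule: each value is any number bigger than the one before.
    return all(a < b for a, b in zip(seq, seq[1:]))

def hypothesized_rule(seq):
    # The guess: each value is exactly 2 higher than the one before.
    return all(b - a == 2 for a, b in zip(seq, seq[1:]))

direct_test = [2, 4, 6, 8, 10]   # fits the guess, so it cannot tell the rules apart
indirect_test = [2, 4, 6, 8, 9]  # violates the guess; the informative probe

for seq in (direct_test, indirect_test):
    print(seq, "guess:", hypothesized_rule(seq), "actual:", actual_rule(seq))

# The direct test passes under both rules. The indirect test fails the guess
# but passes the actual rule, which is what reveals the guess is wrong.
```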
  • The team is coming up with a new iPhone app and asked Elsie to find out what people think about the idea and how to make it better. She tests the application… and suggests usability improvements… what’s wrong with that? Well…
  • Ideally, Elsie will test multiple designs in a pseudo-usability test to gauge direction. Some might not turn out to solve a big enough problem for her customers (X). Some might be best combined into one (O and arrow).
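  One practical detail worth adding when testing several designs is presentation order: rotating which design each participant sees first keeps any single concept from always benefiting from going first. A minimal counterbalancing sketch, with made-up design names and participant count:

```python
# Hypothetical counterbalancing sketch: rotate which design each participant
# sees first so order effects don't favor any single concept.
designs = ["Concept A", "Concept B", "Concept C"]

for p in range(1, 7):  # six hypothetical participants
    start = (p - 1) % len(designs)
    order = designs[start:] + designs[:start]
    print(f"Participant {p}: {order}")
```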
  • Turn the T & the 6, not the 6 & the L. If there is a 6 on the other side of the L, that doesn’t tell you anything. If there is an L on the other side of the 6, that is consistent with your hypothesis, but doesn’t tell you if you are right. If there is an L on the other side of the 4, that doesn’t mean anything either (maybe all even numbers have Ls… that wouldn’t mean that your hypothesis was wrong, or maybe some 6s have Ls and some have Rs). Only turning over the T (and finding a 6) or the 6 (and finding a letter other than L) could show that your hypothesis definitely isn’t right.
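  Enumerating the possible hidden faces makes that logic easy to verify. A small sketch, assuming each card has a letter on one side and a number on the other; the candidate letter and number sets are invented for illustration:

```python
# Which visible faces are worth flipping to test "the back of every 6 is an L"?
LETTERS = ["L", "R", "T"]  # assumed letters that could appear
NUMBERS = ["4", "6"]       # assumed numbers that could appear

def falsifies(letter, number):
    # The rule is broken only by a card pairing a 6 with a non-L letter.
    return number == "6" and letter != "L"

for face in ["T", "6", "L", "4"]:
    if face in LETTERS:
        outcomes = [falsifies(face, n) for n in NUMBERS]  # hidden side is a number
    else:
        outcomes = [falsifies(l, face) for l in LETTERS]  # hidden side is a letter
    print(face, "can falsify the rule:", any(outcomes))

# Only "T" and "6" print True: nothing behind the "L" or the "4" can break
# the rule, so flipping them can confirm the hypothesis but never test it.
```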
  • Elsie goes looking for the data that supports her hypotheses…
  • Soon after the Quicken launch, Scott and his team started doing surveys where they asked a number of demographic questions. One of the things he learned was that 48% of customers said they used Quicken in an office or both at home and in an office. This made no sense to anyone on the team, as it was clearly intended for personal home users, so they ignored it. They repeated the study 18 months later. That time they found that 49% said they used it in an office. So they ignored it again. But the anomaly kept gnawing at him, and a year later Scott started calling and visiting these customers to figure out why…
  • Someone is asked to estimate the proportion of words that begin with the letter “R” or “K” versus those words that have the letter “R” or “K” in the third position. Most English-speaking people could immediately think of many words that begin with the letters “R” (roar, rusty, ribald) or “K” (kangaroo, kitchen, kale), but it would take a more concentrated effort to think of any words where “R” or “K” is the third letter (street, care, borrow, acknowledge); the immediate answer would probably be that words that begin with “R” or “K” are more common. The reality is that words that have the letter “R” or “K” in the third position are more common. In fact, there are three times as many words that have the letter “K” in the third position. Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124-1131.
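  You can check the claim yourself with a few lines against a word list. The path below is a common Unix location but is an assumption (it varies by system), and the exact counts depend on the dictionary used:

```python
# Rough check of the availability example against a local word list.
with open("/usr/share/dict/words") as f:
    words = [w.strip().lower() for w in f if len(w.strip()) >= 3]

starts_with_k = sum(w.startswith("k") for w in words)
k_in_third = sum(w[2] == "k" for w in words)
print(f"start with 'k': {starts_with_k}, 'k' in third position: {k_in_third}")
```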
  • Have you ever conducted a study with multiple people, then been asked to summarize what you found without having the time to go back and analyze all of your data? If so, you too have probably fallen victim to the availability trap.
  • Tracking and counting can help make this more objective. Even if it’s just taking a couple of minutes to tally up what you recall finding…
  • List the following: the last 3 movies you saw in 2008, and the prior 3 movies you saw. Which was easier?
  • Elsie does a usability test… and sees which button people click on a particular screen. Immediately after the last participant, the product manager asks her how the study went and what they saw on that screen. Elsie says, “Most people clicked Continue”. However, as you can see… that just isn’t true. She’s fallen victim to the recency trap. This is especially likely when you have more participants. Common on site visits too, where time elapses between sessions and you have a greater chance of forgetting.
  • Again, the key to avoiding this trap is to take a moment and review your counts…
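  That review can be as simple as a tally. A minimal sketch using the click data from slide 42 (collections.Counter is standard-library Python):

```python
# Tally what participants actually did instead of trusting memory.
from collections import Counter

clicks = ["Done", "Continue", "Done", "Continue", "Cancel"]  # one entry per participant
print(Counter(clicks).most_common())
# [('Done', 2), ('Continue', 2), ('Cancel', 1)]: the claim "most people
# clicked Continue" doesn't survive the counts.
```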
  • A study was done with teenaged girls. They were asked to describe the relationship between weight and happiness. What do you think they found? The teenagers thought there was a negative correlation between weight and happiness. In truth, there is no correlation at all.
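  Before claiming a relationship like that, it is worth computing one. A minimal sketch with invented numbers; statistics.correlation requires Python 3.10+:

```python
# Check a suspected weight/happiness relationship on made-up data before
# claiming it exists.
from statistics import correlation

weight = [110, 120, 130, 140, 150, 160]
happiness = [5, 7, 4, 7, 5, 6]       # self-reported scores, invented for illustration
r = correlation(weight, happiness)   # Pearson's r
print(round(r, 2))                   # ~0.09: essentially no linear relationship
```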
  • Elsie is noticing patterns… that’s fine, but if she actually claims the relationship to be true, she can lose credibility if it ends up that it isn’t.
  • Elsie’s instincts are right on here… notice a pattern and seek to discover if other data supports it.
  • So those are the seven traps. Now that you know about them…
  • Don’t fall for these traps!
  • And how to avoid them…
  • And how to avoid them…
  • And how to avoid them…
