
Measuring the Wrong Thing: Data-Driven Design Pitfalls

We’ve come a long way from Douglas Bowman’s infamous Google lament about having to test 41 shades of blue. Today, using data to inform and evolve designs has become the standard at large companies, and sophisticated web analytics and A/B testing tools are available to more of us than ever before. But in our eagerness to leverage the power of quantitative data, could we be measuring the wrong things? And if so, would we even know it? I'll examine a few common pitfalls I've encountered when gathering and using data for product design, and how they can impact your projects. I'll also share strategies any designer can use to apply data more effectively: to improve their designs, gain more influence with business stakeholders, and ultimately improve the products our customers use.

Written text of presentation: http://www.jenmatson.com/blog/measuring-the-wrong-thing-data-driven-design-pitfalls/

  1. Measuring the Wrong Thing: Data-Driven Design Pitfalls. Jen Matson, @nstop
  2. Hi, I’m Jen Matson. • Senior UX Designer at Amazon • Designing & building web sites since 1994 • Unabashed data junkie
  3. [Diagram: from Ideas and/or data by Cennydd Bowles]
  4. Case Study #1: The Meaning of A Click (or Tap)
  5. Case Study #1: The Meaning of A Click (or Tap) Company: Movie listings site Project: Create a mobile-optimized view of the movie detail page
  6. Case Study #1: The Meaning of A Click (or Tap) Average Review Movie Showtimes
  7. Case Study #1: The Meaning of A Click (or Tap) Average Review Movie Showtimes
  8. Case Study #1: The Meaning of A Click (or Tap) Average Review Movie Showtimes Result: Mistaking one thing for another Causes: • Too focused on quantitative data • Clickable UI elements too closely grouped
  9. Case Study #1: The Meaning of A Click (or Tap) Average Review Movie Showtimes Potential impact: • Frustrated users due to “broken” UI • Drawing the wrong conclusions about what users want/like • Building more features based on those conclusions
  10. Case Study #1: The Meaning of A Click (or Tap) Average Review Movie Showtimes How to fix: • Gather qualitative data (customer feedback) along with quantitative • Make time for usability testing, and subsequent design/dev cycle prior to launch
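One way to make case study #1’s fix concrete is to instrument taps per control, so the analytics record which element the user actually hit instead of lumping nearby targets together. The sketch below is illustrative only: trackEvent(), the control names, and the 44px threshold are assumptions, not the site’s actual code.

```typescript
// Hypothetical tap instrumentation for the mobile movie detail page.
// trackEvent() stands in for whatever analytics client the site uses.
declare function trackEvent(name: string, props: Record<string, string>): void;

const MIN_TAP_TARGET_PX = 44; // common touch-target guideline

function instrumentTap(el: HTMLElement, controlName: string): void {
  // Flag controls too small to tap reliably; mis-taps on crowded, undersized
  // targets are exactly what made the click data misleading here.
  const rect = el.getBoundingClientRect();
  if (rect.width < MIN_TAP_TARGET_PX || rect.height < MIN_TAP_TARGET_PX) {
    console.warn(`${controlName} is below the ${MIN_TAP_TARGET_PX}px tap-target guideline`);
  }

  el.addEventListener('click', () => {
    // Record the specific control, so taps meant for showtimes can't be
    // silently counted as interest in reviews.
    trackEvent('tap', { control: controlName, page: 'movie-detail-mobile' });
  });
}
```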
  11. Case Study #2: Throwing Stuff Against the Wall
  12. Case Study #2: Throwing Stuff Against the Wall Company: Mobile service provider site Project: Redesign the help portal to offer personalized content
  13. Case Study #2: Throwing Stuff Against the Wall Help
  14. Case Study #2: Throwing Stuff Against the Wall Help
  15. Case Study #2: Throwing Stuff Against the Wall Help
  16. Case Study #2: Throwing Stuff Against the Wall Help Result: False positive Causes: • Choosing a metric (clicks) with only a loose connection to user need • Poor communication between teams
  17. Case Study #2: Throwing Stuff Against the Wall Help Impact: • Irrelevant content leads to user confusion, lack of trust • Failure to improve help relevance due to bad data feedback loop
  18. Case Study #2: Throwing Stuff Against the Wall Help How to fix: • Audit relevance of help, match to real user attributes • Use real user events to power suggestions • Unify project teams
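For case study #2, “use real user events to power suggestions” could look like the sketch below: help topics are chosen from recent account events rather than from whichever help links happen to collect the most clicks. The event kinds and topic slugs are invented for illustration.

```typescript
// Sketch: derive help suggestions from what actually happened on the account,
// not from click popularity. All names below are hypothetical.
type AccountEvent =
  | { kind: 'device_activated'; daysAgo: number }
  | { kind: 'bill_increased'; daysAgo: number }
  | { kind: 'plan_changed'; daysAgo: number };

function suggestHelpTopics(events: AccountEvent[]): string[] {
  const topics = new Set<string>();
  for (const e of events) {
    if (e.daysAgo > 30) continue; // only recent events are likely to reflect a current need
    switch (e.kind) {
      case 'device_activated': topics.add('setting-up-your-new-phone'); break;
      case 'bill_increased':   topics.add('understanding-changes-to-your-bill'); break;
      case 'plan_changed':     topics.add('whats-included-in-your-new-plan'); break;
    }
  }
  return [...topics];
}
```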
  19. Case Study #3: Unclear Cause and Effect
  20. Case Study #3: Unclear Cause and Effect Company: TV manufacturer site Project: Redesign the search engine for the support section to make content easier to find
  21. Case Study #3: Unclear Cause and Effect User tasks: 1. Find article (search) 2. Read article 3. Use solution or tool found in article to solve problem
  22. Case Study #3: Unclear Cause and Effect User tasks: 1. Find article (search) Content: Findable 2. Read article Content: Consumable 3. Use solution or tool found in article to solve problem Content: Actionable
  23. Case Study #3: Unclear Cause and Effect Search Results
  24. Case Study #3: Unclear Cause and Effect Search Results Contact
  25. Case Study #3: Unclear Cause and Effect Search Results Help Article Tool Contact
  26. Case Study #3: Unclear Cause and Effect Search Results Help Article Tool Contact
  27. Case Study #3: Unclear Cause and Effect Search Results Result: Unclear impact Causes: • Choosing to measure only what we were already set up to measure • Lack of data to ensure business goals are aligned with project work
  28. Case Study #3: Unclear Cause and Effect Search Results Impact: • Data gathered from product launch not useful in helping to prioritize future features • Further defer updates to content and tools due to lack of data
  29. Case Study #3: Unclear Cause and Effect Search Results How to fix: • Work with the product manager on goals and project definition before they’re finalized • Use customer journey and task mapping to highlight data collection needs
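Case study #3’s task map (findable, consumable, actionable) also suggests what to instrument: one event per step of the support journey, joined by a session id, so you can see where the funnel breaks instead of only counting search-result clicks. trackEvent() and the step names below are assumptions, not the team’s actual schema.

```typescript
// Sketch: log each step of the support journey so cause and effect can be
// traced end to end. trackEvent() is a stand-in for the real analytics client.
declare function trackEvent(name: string, props: Record<string, string>): void;

type SupportStep =
  | 'search_performed'    // content is findable
  | 'article_viewed'      // content is consumable
  | 'solution_attempted'  // content is actionable
  | 'contact_initiated';  // fallback when the content didn't solve the problem

function logSupportStep(step: SupportStep, sessionId: string, articleId?: string): void {
  trackEvent('support_journey', {
    step,
    sessionId, // lets individual steps be joined into one funnel per visit
    ...(articleId ? { articleId } : {}),
  });
}
```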
  30. What else?
  31. Understand your company culture
  32. Learn more about what you can measure, and how
  33. Learn more about what you can measure, and how: hovers, data inputs, clicks, scroll depth, site path, mouse path, dwell time
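Two of the signals listed above, scroll depth and dwell time, can be captured with nothing more than standard browser APIs. The sketch below reports them to a hypothetical /analytics endpoint when the page is hidden; treat it as a starting point, not a complete measurement plan.

```typescript
// Rough sketch: track the deepest scroll position and total time on page,
// then report both on pagehide. The /analytics endpoint is hypothetical.
let maxScrollDepth = 0; // 0..1, deepest point of the page the user has seen

window.addEventListener('scroll', () => {
  const total = document.documentElement.scrollHeight;
  if (total <= 0) return;
  const depth = (window.scrollY + window.innerHeight) / total;
  maxScrollDepth = Math.max(maxScrollDepth, Math.min(depth, 1));
});

const pageLoadedAt = Date.now();

window.addEventListener('pagehide', () => {
  // sendBeacon survives page unload more reliably than fetch/XHR here.
  navigator.sendBeacon('/analytics', JSON.stringify({
    maxScrollDepth: Math.round(maxScrollDepth * 100) / 100,
    dwellTimeMs: Date.now() - pageLoadedAt,
  }));
});
```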
  34. Use what you learn to improve your designs and increase your influence
  35. Add new questions to your arsenal
  36. Add new questions to your arsenal: What data do we have to support this? How will we get data to validate this?
  37. Thank you. Jen Matson @nstop Photo credits (in order of appearance): https://www.flickr.com/photos/72764087@N00/9990024683/ https://www.flickr.com/photos/coolmel/5469163/ https://www.flickr.com/photos/37182073@N06/5142618640/ https://www.flickr.com/photos/notemily/4765937286 https://www.flickr.com/photos/gwdexter/1401789875
