
Experimenting Your Way to a Better Product (Front-Trends 2017)

We all want to create the best possible product for our users, one that also meets our business goals as strongly as possible. But how do we know whether we’re really doing that? A/B testing is a tool and a process that can validate your design and development decisions with data, helping you stay focused on your customers and succeed in solving their problems.

In this presentation, you’ll learn why A/B testing is important and why “expert” opinions about UX and visual design are so often wrong. I’ll talk about how Booking.com has been using A/B testing and other user research to optimize its products for a decade, and the dos and don’ts I’ve learned from running hundreds of experiments there over the past three years. You’ll leave with ideas for when and how to use data to improve your own work and add value to your business.


  1. Experimenting your way to a better product. Zoe Mickley Gillenwater, Senior UX Designer
  2. One-size-fits-all A/B testing
  3. “A/B testing when it’s done well is really about the democratization of allowing your organization to impact its customer positively.” Stuart Frisby, Director of Design, Booking.com
  4. Why do A/B testing
  5. Avoid HiPPOs (the Highest Paid Person’s Opinion)
  6. Avoid bias and ego
  7. Users drive the product direction
  8. 9 out of 10
  9. Helping customers helps the business
  10. “[Booking.com’s] utilization of A/B testing drives higher conversion across its entire platform, resulting in conversion levels 2-3x the industry average.” Evercore Equity Research, http://bkng.it/1jypoK5
  11. How to A/B test
  12. Our A/B testing process: 1. Make observations 2. Formulate user-centered hypothesis 3. Create and run experiment 4. Evaluate results of experiment 5. Accept or reject hypothesis
  13. Make observations: gather data, qualitative and quantitative
  14. “Data is fuel for hypotheses. So the more informed you are across your entire organization, the more valuable your hypotheses are gonna be and the more successful you’re gonna be in testing them.” Stuart Frisby, Director of Design, Booking.com
  15. Formulate user-centered hypothesis: Why / What / Who / Outcome
  16. Why are you doing this? What’s wrong with the current state? What problem are you trying to solve for your users? State the evidence.
  17. What are you changing? The implementation, before and after
  18. Who is going to be affected? Which users, under what conditions, in which spots? How are you going to track only these users?
  19. Outcome: What do you expect to happen to users? How you measure the success or failure of your hypothesis. Primary and supporting metrics, increase or decrease
  20. Why / What / Who / Outcome. Based on user testing where many guests struggled to find hotels with butlers, we believe that adding a butler filter to search results for users filtering by 5-star properties will help them find butler-enhanced properties more easily and make them excited to book. We will know this when bookings increase.
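A common way to answer the “who” question from slide 18 in code is deterministic bucketing: hash the user id together with the experiment name so assignment is sticky without storing any state, and gate it behind the eligibility rule from the hypothesis. This is a generic sketch, not Booking.com’s actual tooling; the experiment name, the 50/50 split, and the 5-star eligibility check are illustrative.

```python
import hashlib
from typing import Optional

def bucket(user_id: str, experiment: str, variant_pct: int = 50) -> str:
    """Deterministically assign a user to 'base' or 'variant'.

    Hashing the user id together with the experiment name means the
    same user always sees the same side of this test, while separate
    experiments bucket users independently of one another.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "variant" if int(digest[:8], 16) % 100 < variant_pct else "base"

def assign(user_id: str, filters: set) -> Optional[str]:
    # Per the hypothesis above, only users filtering by 5-star
    # properties are in scope; everyone else is never tracked.
    if "5-star" not in filters:
        return None
    return bucket(user_id, "butler-filter-v1")  # hypothetical experiment name

print(assign("user-123", {"5-star", "wifi"}))  # "base" or "variant"
print(assign("user-456", {"wifi"}))            # None: not in the experiment
```

Tracking only assigned users keeps the metrics scoped to exactly the population named in the hypothesis.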
  21. Create and run experiment: focused changes, run for the right amount of time
  22. Create the smallest change possible. Limit your investment and risk, since most tests fail. Easier to fail fast, learn, and iterate. Small, isolated changes can still have a big impact across the site.
  23. Use a power calculator: http://bit.ly/2rvLCl6
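A power calculator like the one linked above does roughly this: given a baseline conversion rate, a minimum detectable effect, a significance level, and desired power, it works out how many users each variant needs. A minimal sketch using the standard two-proportion sample-size formula; the rates in the example are illustrative, not Booking.com’s.

```python
from statistics import NormalDist

def sample_size_per_variant(p1: float, p2: float,
                            alpha: float = 0.05, power: float = 0.80) -> int:
    """Users needed in each arm to detect a shift from rate p1 to p2
    with a two-sided test (standard two-proportion formula)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # e.g. 1.96 for alpha = 0.05
    z_beta = z.inv_cdf(power)            # e.g. 0.84 for 80% power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return int(numerator / (p2 - p1) ** 2) + 1

# Detecting a 5% relative lift on a 10% baseline takes roughly
# 58,000 users per variant: small effects need a lot of traffic.
print(sample_size_per_variant(0.10, 0.105))
```

Running the test for the time this calculation implies, and not stopping early the moment a metric twitches, is part of “run for the right amount of time.”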
  24. A B
  25. Evaluate results of experiment: learn from your data
  26. Metrics combined tell a story. New hypothesis: users filter by butlers, get zero results, then change dates to try to get more results with butlers. But maps are more powerful than butlers, so move butlers away from maps. Metrics: bookings, map usage, zero results, date changes.
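Evaluating a finished test usually starts with a significance check on the primary metric. A minimal two-sided, pooled two-proportion z-test in stdlib Python; the conversion counts below are made up for illustration.

```python
from statistics import NormalDist

def two_proportion_p_value(conv_a: int, n_a: int,
                           conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion
    rates, using the pooled two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical result: 10.0% vs 10.2% conversion, 60k users per arm.
p = two_proportion_p_value(6000, 60000, 6120, 60000)
print(f"p-value: {p:.3f}")  # ~0.25 here: not significant, no winner yet
```

A p-value on one metric is only the start, though; as the slide above says, the supporting metrics (map usage, zero results, date changes) are what turn a number into a story.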
  27. YOU KEEP USING THAT METRIC. I DO NOT THINK IT MEANS WHAT YOU THINK IT MEANS.
  28. Focus on what matters
  29. “I have not failed 10,000 times. I have not failed once. I have succeeded in proving that those 10,000 ways will not work. When I have eliminated the ways that will not work, I will find the way that will work.” Thomas Edison
  30. Accept or reject hypothesis. Remember that concept != execution
  31. “A negative or neutral result doesn’t necessarily mean ‘no.’ [It] can also possibly mean ‘not quite right’ or ‘not quite yet.’ The more you test, the more you’ll be able to spot when ‘no’ actually means ‘no.’” Erin Weigel, Principal Designer, Booking.com
  32. Why might solid concepts fail? Timing, performance, tracking, design details... Full list in Erin’s article: http://bit.ly/2qxwyo3 Erin’s presentation: http://bit.ly/2rso8O5
  33. What not to do
  34. Don’t test something just because you can: Google’s 41 shades of blue
  35. Don’t test the same thing over and over. Or: how to make a negative experiment seem positive
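Rerunning an experiment that truly does nothing until one run happens to come up “significant” is exactly how a negative experiment gets made to look positive. A quick simulation of repeated A/A tests (both arms identical, so every “win” is a false positive) shows why; all parameters are illustrative.

```python
import random
from statistics import NormalDist

def aa_test_p_value(n: int = 20000, p: float = 0.10) -> float:
    """Run one A/A test: both arms share the same true rate p,
    so any 'significant' result is a false positive."""
    conv = [sum(random.random() < p for _ in range(n)) for _ in range(2)]
    pooled = sum(conv) / (2 * n)
    se = (pooled * (1 - pooled) * (2 / n)) ** 0.5
    z = (conv[1] / n - conv[0] / n) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

random.seed(1)
runs = [aa_test_p_value() for _ in range(10)]
# With alpha = 0.05 each identical rerun has a 5% false positive
# rate, so across 10 reruns the chance of at least one "win" is
# 1 - 0.95**10, i.e. about 40%.
print(sum(p < 0.05 for p in runs), "of 10 reruns look 'significant'")
```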
  36. Don’t cherry-pick metrics after running the experiment: Texas Sharpshooter Fallacy, confirmation bias, HARKing
  37. Texas Sharpshooter Fallacy
  38. Don’t A/B test without enough traffic for significant results. “No data” is better than “wrong data.”
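One way to check whether you have “enough traffic” before launching is to invert the power calculation and ask what the smallest lift your volume can even detect is. A rough sketch; the approximation assumes both rates sit near the baseline, and the numbers are illustrative.

```python
from statistics import NormalDist

def min_detectable_lift(n_per_variant: int, baseline: float,
                        alpha: float = 0.05, power: float = 0.80) -> float:
    """Approximate smallest absolute change in conversion rate a test
    of this size can reliably detect (two-sided, rates near baseline)."""
    z = NormalDist()
    z_total = z.inv_cdf(1 - alpha / 2) + z.inv_cdf(power)
    return z_total * (2 * baseline * (1 - baseline) / n_per_variant) ** 0.5

# With only 2,000 users per arm on a 10% baseline, you can only
# detect swings of about +/- 2.7 percentage points. Anything subtler
# is invisible, and a page with that little traffic may be better
# served by the qualitative methods on the next slide.
print(f"{min_detectable_lift(2000, 0.10):.4f}")
```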
  39. Other ways to validate your assumptions: user complaints/comments (app reviews, social media mentions, CS tickets, Usabilla); guerrilla street/cafe testing; surveys (email, on-site); unmoderated remote usability testing; intercept testing; panels/focus groups; diary studies; lab usability testing; participatory design; contextual research/shadowing. “5 ways to listen to your customers” by Tomasz Pieta, Senior UX Designer, Booking.com: http://bit.ly/2qt4JyB
  40. Don’t accept every result at face value: Twyman’s Law
  41. What to do if you get surprising results: run the same test over again; test the same concept somewhere else; validate with other data (Google Analytics, qualitative)
  42. “The experiment tool [A/B testing tool showing data] is our user’s body language. It only tells us a percentage of what they are thinking or wanting.” Erin Weigel, Principal Designer, Booking.com
  43. It’s about company culture
  44. Question assumptions. Validate opinions. Democratize design.
  45. Embrace unpredictability
  46. Accept failure
  47. Solve user problems
  48. Questions? zomigi.com / @zomigi. We’re hiring! workingatbooking.com / @planetbooking
