
Building a Testing Playbook by Andrew Richardson

240 views


A testing playbook combines the best practices of testing and optimization with communication strategies, education, and gaining buy-in from your client. Andrew Richardson, Senior Director of Analytics at Delphic Digital, provides a peek behind the curtain to reveal how Delphic prioritizes tests; recruits, trains, and staffs up a testing practice; and moved from A/B to multivariate testing. Come with an open mind, walk away with a Testing Playbook Template you can put to use at once.

Published in: Marketing


  1. I CAN’T EAT LUNCH YET, THIS GUY KEEPS TALKING…
  2. This is your BRAIN: an attention span of 8 seconds *https://advertise.bingads.microsoft.com/en-us/insights @analyticandrew #eMetrics
  3. Agenda: IS MY CONVERSION RATE GOOD OR BAD? | WHY DO I NEED TO TEST? | BUT NO ONE WANTS TO DO TESTING AT MY ORGANIZATION | PLATFORM, PEOPLE, PROCESS | WHAT DO I TEST AND HOW DO I TRACK IT? | TIPS AND SUGGESTIONS FOR GETTING STARTED | TAKEAWAYS @analyticandrew #eMetrics
  4. (Agenda recap) @analyticandrew #eMetrics
  5. HOW DO YOU COMPARE? Conversion rate benchmarks by industry (*Unbounce Conversion Benchmark Report, March 2017): Higher Education 2.6%, Real Estate 2.8%, Health 2.8%, Legal 3.2%, Home Improvement 3.3%, Business Services 3.4%, Travel 5.0%, Business Consulting 5.0%, Credit & Lending 5.5%, Vocational Studies & Job Training 6.0%
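The benchmark comparison above can be made concrete: a one-proportion z-test (normal approximation) shows whether your observed rate differs meaningfully from an industry figure, given your traffic. A minimal stdlib-only sketch; the function name and visit/conversion counts are illustrative, not from the talk.

```python
# Rough check: is your observed conversion rate meaningfully different
# from a fixed industry benchmark? One-proportion z-test, normal approx.
import math

def z_test_vs_benchmark(conversions, visitors, benchmark_rate):
    """Return (z, two-sided p-value) for observed rate vs a fixed benchmark."""
    p_hat = conversions / visitors
    # Standard error under the benchmark rate
    se = math.sqrt(benchmark_rate * (1 - benchmark_rate) / visitors)
    z = (p_hat - benchmark_rate) / se
    # Two-sided p-value via the normal CDF (Phi(x) = 0.5*(1+erf(x/sqrt(2))))
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Example (made-up data): 90 conversions from 2,500 visits vs the 2.8% Health benchmark.
z, p = z_test_vs_benchmark(90, 2500, 0.028)
```

With small traffic volumes the same observed rate can be statistically indistinguishable from the benchmark, which is exactly why "is my rate good or bad?" needs a sample size attached.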
  6. Never stop testing, and your advertising will never stop improving. - David Ogilvy -
  7. (Agenda recap) @analyticandrew #eMetrics
  8. Your ideas see DAYLIGHT
  9. No more HiPPO
  10. Up to 5x higher ROI than PPC
  11. (Agenda recap) @analyticandrew #eMetrics
  12. Not enough TRAFFIC
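The "not enough traffic" objection can be quantified: a standard sample-size estimate tells you how many visitors per variant you need before a test can detect a given lift. A rough sketch assuming 95% confidence and 80% power (rounded z-values); the baseline rate and lift are made-up example numbers.

```python
# Estimate visitors needed per variant to detect a relative lift over a
# baseline conversion rate (two-proportion test, normal approximation).
import math

Z_ALPHA = 1.96   # two-sided 95% confidence
Z_BETA = 0.84    # 80% power

def sample_size_per_variant(baseline, lift):
    """Visitors per variant to detect a relative `lift` over `baseline` rate."""
    p1 = baseline
    p2 = baseline * (1 + lift)
    p_bar = (p1 + p2) / 2
    num = (Z_ALPHA * math.sqrt(2 * p_bar * (1 - p_bar))
           + Z_BETA * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p2 - p1) ** 2)

# Example: baseline 3% conversion, hoping to detect a 20% relative lift.
n = sample_size_per_variant(0.03, 0.20)
```

Running the numbers first tells you whether a page has enough traffic for an A/B test at all, or whether you should test bigger changes (larger lifts need far fewer visitors).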
  13. We have NO IDEA what to test
  14. (Agenda recap) @analyticandrew #eMetrics
  15. Nothing should be OFF LIMITS
  16. T-shirt size your TESTING
  17. Testing is often about CHANGING policy & perception
  18. (Agenda recap) @analyticandrew #eMetrics
  19. OK, I am ALL IN. Now what?
  20. Tool Selection: BUILD or BUY
  21. MarTech 2011: 7 TOOLS
  22. MarTech 2017: 175 TOOLS
  23. THE PEOPLE: the most important thing
  24. THE PEOPLE: the most important thing. BUT!
  25. THE PEOPLE: the most important thing. One: focus efforts where they matter most. Two: avoid conflict, put opinions to the test. Three: empower people to optimize & be data-driven. Four: align teams around shared goals. Five: collaborate around hypothesis generation, test ideas & execution.
  26. PROCESS IS CRUCIAL FOR SUCCESS
  27. 1. RESEARCH 2. PRIORITIZE 3. EXPERIMENT 4. ANALYZE, LEARN, REPEAT
  28. RESEARCH (1): “If A/B tests are used in lieu of research, the variations are essentially guesses. We can improve outcomes of A/B tests by incorporating UX research to: improve cause identification; develop more realistic hypotheses; identify more opportunities for experimentation.” Nielsen Norman Group: www.nngroup.com
  29.–33. RESEARCH (1) [image-only slides]
  34. A theory can be proved by experiment; but no path leads from experiment to the birth of a theory.
  35. RESEARCH (1): Issue (what is the conversion problem?), Cause Theory (what do you think is causing the problem?), Variation Hypotheses (what do you propose changing?), Goals (what outcomes are expected?)
  36. (repeat of slide 35)
  37. RESEARCH (1) [image-only slide]
  38. RESEARCH (1): User experience to improve optimization testing. Defining user intent and objections (why did you visit? were you successful? if not, why?). Exposing interface flaws (changing colors vs. changing form fields). Measuring findability (tree testing, site search, heatmaps). Cleaning up design variations (reduce friction, maximize testing).
  39. RESEARCH (1): Bug tracking, Brainstorming, Site Search, Web Analytics, Site Surveys, Chat
  40. PRIORITIZE (2): TEST (where you put stuff for testing); INSTRUMENT (fixing, adding, improving tagging, events, etc.); HYPOTHESIZE (found something, but no clear solution); JUST DO IT (JDI) (no-brainers); INVESTIGATE (need to dig further, ask more questions). Craig Sullivan: @optimiseordie
  41. PRIORITIZE (2) [image-only slide]
  42. PRIORITIZE (2): Craig Sullivan Hypothesis Kit. One: because we saw (data/feedback). Two: we expect that (change) will cause (impact). Three: we’ll measure this using (data metric). https://medium.com/@optimiseordie/hypothesis-kit-2-eff0446e09fc
  43. PRIORITIZE (2): Craig Sullivan Advanced Kit. One: because we saw (qual and quant data). Two: we expect that (change) for (population) will cause (impact(s)). Three: we expect to see (data metric(s) change) over a period of (x business cycles). https://medium.com/@optimiseordie/hypothesis-kit-2-eff0446e09fc
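Since the kits above are fill-in-the-blanks templates, a team could keep wording consistent by generating the statement programmatically. A hypothetical sketch only; the function name, field names, and example values are illustrative and not part of Sullivan's kit.

```python
# Render a hypothesis in the "because we saw / we expect / we'll measure"
# shape so every test in the backlog reads the same way.
def hypothesis(observation, change, impact, metric):
    return (f"Because we saw {observation}, "
            f"we expect that {change} will cause {impact}. "
            f"We'll measure this using {metric}.")

# Example with made-up inputs:
h = hypothesis("high cart abandonment on mobile",
               "a one-page checkout",
               "more completed orders",
               "checkout conversion rate")
```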
  44.–47. EXPERIMENT (3) [image-only slides]
  48. ANALYZE, LEARN, REPEAT (4): Insert random cycle chart here
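For the analyze step, the workhorse calculation is usually a two-proportion z-test comparing control against variant. A minimal stdlib-only sketch; the conversion counts are invented example data, not results from the talk.

```python
# Two-proportion z-test (pooled) for a finished A/B test.
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for variant B vs control A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Example: control 200/10,000 (2.0%) vs variant 260/10,000 (2.6%).
z, p = ab_test_z(200, 10000, 260, 10000)
```

Whatever the tool reports, re-running the arithmetic yourself is a cheap sanity check before a "winner" goes into the learn-and-repeat loop.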
  49. (Agenda recap) @analyticandrew #eMetrics
  50. What to test: Headlines; Sub-headlines; Paragraph text; Testimonials; Call-to-action text; Call-to-action button; Links; Images; Content near the fold; Social proof; Media mentions; Awards and badges
  51. Segment DESKTOP from MOBILE
  52. Don’t SWEAT the small stuff
  53. (Agenda recap) @analyticandrew #eMetrics
  54. audience.ask(questions); @analyticandrew #eMetrics
  55. Never stop testing, and your advertising will never stop improving. - David Ogilvy -
