Fail and Win: Why a Failed Test Isn’t a Bad Thing

Caleb Whitmore, CEO, Analytics Pros
Ryan Lillis, Strategic Optimization Consultant, Optimizely

Here's something you don't expect to hear at a CRO conference: most A/B tests don't produce a variation that's better than what you already have.

If all you're doing is running an A/B test, viewing select metrics, and giving a "thumbs up" or "thumbs down," you won't have a successful optimization program — even if you happen upon a few "winners."

But you don't have to run your optimization program this way.

A/B testing done right allows you to draw winning insights from "losing" tests that have the power to genuinely affect your business.

Caleb Whitmore, Founder and CEO of Analytics Pros, shows how to build a 360-degree view of your data, one that combines your analytics engine with your testing platform to surface genuine insights about the effects of your tests.

You'll learn a holistic approach to testing that goes way beyond "winners" and "losers."

Fail and Win: Why a Failed Test Isn’t a Bad Thing

  1. Fail and Win: Why a Failed Test Isn't a Bad Thing. Caleb Whitmore, Founder & CEO, Analytics Pros, @CalebWhitmore
  2. Hi, I'm @CalebWhitmore
  3. I climbed a mountain… once.
  4. I work @AnalyticsPros
  5. And get to work with some awesome clients!
  6. Why Test?
  7. Why Test? TESTING… • Should focus on improving an outcome • Is the shortest distance between you and improvements • Is done best with rapid iteration and lots of data
  8. Some things may seem obvious to test…
  9. Bad Form
  10. Good Form
  11. But you should still test them!
  12. Testing depends on data
  13. Optimizely Data
  14. Google Analytics Data
  15. Not all tests are created equal
  16. Some tests will win
  17. Some tests will fail
  18. Is winning a success?
  19. Is failing a loss?
  20. Testing is a Success When: • A process is followed • An outcome is measured • A complete picture is seen
  21. When Things Shake, What Moves?
  22. When Things Shake, What Moves?
  23. Creating a 360 Degree View • A test will impact more than what you intended • You must measure the impact across your spectrum of user interaction • Measure your test with your existing infrastructure
  24. Creating a 360 Degree View: What happens to… • The "key" conversion point • Engagement on the test page • Length of the entire visit • Total pageviews/interactions in the visit • Secondary conversion goals – view product, view content, start checkout, download a file, etc.
  25. Here's an example
  26. Example Experiment: ASSUMPTIONS • Auto-play repels visitors (audio without warning) • Thus, bounce rates are high and more people leave • Turn off auto-play and more people will stay and engage
  27. Optimizely Shows a Winner
  28. Bounce Rate Improved
  29. Purchase Rate Tanked!
  30. DO THIS
  31. Set up a 360 Degree View of Tests • Tip: your Digital Analytics platform has this view already!
  32. Optimizely Automated Integrations
  33. Pipes data straight into GA
  34. Is this a good outcome?
  35. What about this?
  36. See the full picture
  37. Capture Experiment Data • Tie experiment data into your Tag Management and Digital Data frameworks
  38. Leverage Built-in Reporting • Create lots of Goals in Optimizely – pages, clicks, revenue, etc.
  39. QUESTIONS? @CalebWhitmore of @AnalyticsPros
  40. Fail and Win: Why a Failed Test Isn't a Bad Thing. Caleb Whitmore, Founder & CEO, Analytics Pros, @CalebWhitmore
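
The auto-play example in slides 26-29 is the crux of the talk: Optimizely called a winner on bounce rate while the purchase rate fell. Below is a minimal TypeScript sketch of that kind of side-by-side check; the numbers and field names are entirely illustrative and are not taken from the talk.

// Compare each variation across several metrics instead of a single
// "winning" metric, as in the auto-play example (slides 26-29).
interface VariationStats {
  name: string;
  sessions: number;
  bounces: number;
  purchases: number;
}

// Illustrative numbers only.
const variations: VariationStats[] = [
  { name: "Original (auto-play on)", sessions: 10000, bounces: 6200, purchases: 310 },
  { name: "Variation (auto-play off)", sessions: 10000, bounces: 5100, purchases: 240 },
];

for (const v of variations) {
  const bounceRate = (100 * v.bounces) / v.sessions;
  const purchaseRate = (100 * v.purchases) / v.sessions;
  console.log(`${v.name}: bounce rate ${bounceRate.toFixed(1)}%, purchase rate ${purchaseRate.toFixed(2)}%`);
}
// A variation can "win" on bounce rate and still lose on purchase rate;
// that only shows up when both metrics are in view.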
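Slides 31-33 point to Optimizely's automated integrations, which pipe experiment data straight into Google Analytics. If you were wiring something similar up by hand on Universal Analytics (analytics.js), one pattern is to attach the experiment and variation to a custom dimension; the dimension index and label format below are assumptions, not the official integration.

// Send the experiment/variation label to Google Analytics so reports such as
// bounce rate, session length, and goal completions can be segmented by variation.
declare const ga: (...args: unknown[]) => void; // analytics.js command queue on the page

export function sendVariationToGA(experimentName: string, variationName: string): void {
  // "dimension5" is an assumed index; it must match a custom dimension
  // configured in the GA property settings.
  ga("set", "dimension5", `${experimentName}: ${variationName}`);
  // A non-interaction event attaches the dimension to a hit without
  // affecting bounce rate.
  ga("send", "event", "experiment", "exposure", experimentName, { nonInteraction: true });
}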
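Slide 37 suggests tying experiment data into your Tag Management and Digital Data frameworks. Here is a sketch of that idea for a Google Tag Manager-style data layer; the event name and keys are illustrative, not an official schema.

// Push the active experiment and variation into the data layer so downstream
// tags (GA or other analytics) can pick them up and segment every metric by variation.
declare global {
  interface Window {
    dataLayer?: Record<string, unknown>[];
  }
}

export function recordExperimentExposure(experimentName: string, variationName: string): void {
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({
    event: "experiment_exposure", // hypothetical event name
    experimentName,               // e.g. "homepage-autoplay"
    variationName,                // e.g. "autoplay-off"
  });
}

// Usage (values would come from your testing platform's client-side API):
// recordExperimentExposure("homepage-autoplay", "autoplay-off");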
