Understand A/B Testing in 9 use cases & 7 mistakes

More info here: http://buff.ly/OWUPQw

  1. Many thanks to:
  2. Growth Hacking Meetup - Episode 4 - 9 A/B Testing Use Cases
  3. But first… some A/B Testing mistakes startups make ALL the time
  4. Mistake 1: A/B tests are ended too early
 Statistical significance (a large enough sample size, etc.) is what tells you Version A is actually better than Version B. (A worked significance check follows the deck.)
  5. As an optimizer, your job is to figure out the truth. You have to put your ego aside.
  6. Mistake 2: Tests are not run for full weeks
 Conversion rates can vary greatly depending on the day of the week. Always test a full week at a time. (A rough test-duration estimate follows the deck.)
  7. Mistake 3: Ignoring external factors
 Is it Christmas? External factors definitely affect your test results. When in doubt, run a follow-up test.
  8. Mistake 4: Testing random ideas without a hypothesis
 You’re wasting precious time and traffic. Never do that.
  9. By the way, testing button colors (“green vs. orange”) is not the essence of A/B testing. 
 It’s about understanding the target audience.
  10. Use your traffic on high-impact stuff. 
 Test data-driven hypotheses.
  11. Mistake 5: They give up after the first test fails
 Run a follow-up test, learn from it, and improve your hypotheses.
  12. “I have not failed 10,000 times. 
 I have successfully found 10,000 ways that will not work.” - Thomas Alva Edison, inventor
  13. Mistake 6: They’re ignoring small gains
 Only 1 out of 8 A/B tests drives a significant change. (A compounding example follows the deck.)
  14. Mistake 7: They’re not running tests at all times
 Every single day without a test is a wasted day. Testing is learning.
  15. Now, for the second part…
 9 Surprising A/B Test Results to Stop You Making Assumptions
  16. Test 1: Which Copy Increased Trial Sign-Ups?
  17. Result: In this test, Version B increased sign-ups by 38%, a big rise, even though you would think Version A was the better design.
  18. Why does Version B work? 
 Simply because the copy is better. Lesson: landing pages don’t have to be pretty to be effective.
  19. Test 2: Which Landing Page Got 24% More Lead Generation Form Submissions? Picture vs No picture
  20. Result: Surprisingly, Version A got the 24% increase in submissions, simply by removing the image from the page.
  21. This is a great example of why you should confirm your assumptions with quantitative testing.
  22. Test 3: Which Radically Redesigned Form Increased B2B Leads By 368.5%?
  23. Result: Version A is an obvious winner, but it’s not just the big red button that makes the difference: Version A keeps things really tight and uses grouping to visually shrink the impact of the form.
  24. When designing your landing page, don’t overestimate your user’s tolerance, goodwill, and patience.
  25. Test 4: Does Matching Headline & Body Copy to Your Initial Ad Copy Really Matter?
  26. Result: Version B looks as if it should be better: the headline copy is snappier and the sub-head clearer. But in tests, Version A increased leads by 115%.
  27. Why? Version A was designed to tie in with the ads that drive users to the page. 
 Lesson: making the elements of the sales funnel work together increases efficacy.
  28. Test 5: Which Landing Page Increased Mortgage Application Leads by 64%?
  29. Result: Video can be a very effective tool for communicating lots of information in a compact form. But the presence of video couldn’t save Version B; Version A increased leads by 64%.
  30. Test 6: Does Adding Testimonials to Your Homepage Increase Sales?
  31. Result: It would seem that this would have very little effect, but in practice this small change increased sales by 34% – a big margin. Why is this? Having ‘social proof’, even in this basic form, humanizes the conversion experience.
  32. Test 7: Does Social Proof Increase Conversions? ‘Big Brand Prices’ vs. Consumer Ratings
  33. Result: ‘Social proof’ is a powerful tool that can have a demonstrable effect on conversion outcomes. But in this case, customers were just looking for the cheapest offer. Asking them to consider an additional piece of information – customer satisfaction – made them back away from the conversion.
  34. Test 8: Does an Email Security Seal Help or Hinder Lead Generation Form Completions?
  35. Result: Actually, users were put off by seeing the seal in this context, assuming that they were about to pay for something.
  36. Test 9: Which Page Got an 89.8% Lift in Free Account Sign Ups?
  37. Result: Version B wins: it has three bullet points, as opposed to words in speech bubbles. Removing that distraction and reducing the risk of users navigating away from the page worked better.
  38. Whether you are right or wrong: 
 First, make sure you get the basics right before you start testing, and second, always be testing, because unless you test, you can never be absolutely sure.
  39. You are Welcome to Join the Adventure: hey@thefamily.co
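
A note on Mistake 1 (tests ended too early): the sketch below shows one common way to check statistical significance for an A/B test, a two-proportion z-test in Python. The visitor and conversion counts are hypothetical, not figures from the deck.

    # Two-proportion z-test: is the gap between Version A and Version B real or noise?
    from math import sqrt, erfc

    def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
        """Return (z, two-sided p-value) for the difference between two conversion rates."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)                # pooled rate under "no difference"
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error of the difference
        z = (p_b - p_a) / se
        p_value = erfc(abs(z) / sqrt(2))                        # two-sided normal tail probability
        return z, p_value

    # Hypothetical traffic: 10,000 visitors per variant.
    z, p = two_proportion_z_test(conv_a=400, n_a=10_000, conv_b=460, n_b=10_000)
    print(f"z = {z:.2f}, p = {p:.4f}")    # p is below 0.05 here, so the lift is unlikely to be noise

If the p-value stays above your threshold (0.05 is the usual choice), the observed lift may still be noise and the test is not done.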
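
A note on Mistakes 1 and 2 (stopping early, partial weeks): before launching, you can estimate how many visitors a test needs and therefore how long it has to run. The baseline rate, expected lift, and daily traffic below are assumptions for illustration, using the standard normal-approximation sample-size formula at 95% confidence and 80% power.

    # Rough sample-size and duration estimate for a two-variant test.
    from math import ceil

    def visitors_per_variant(baseline, relative_lift, z_alpha=1.96, z_power=0.84):
        """Approximate visitors needed per variant (95% confidence, 80% power)."""
        p1 = baseline
        p2 = baseline * (1 + relative_lift)
        variance = p1 * (1 - p1) + p2 * (1 - p2)
        return ceil((z_alpha + z_power) ** 2 * variance / (p2 - p1) ** 2)

    # Hypothetical inputs: 4% baseline conversion, hoping to detect a 10% relative lift.
    n = visitors_per_variant(baseline=0.04, relative_lift=0.10)
    daily_visitors_per_variant = 500        # hypothetical traffic after the 50/50 split
    days = ceil(n / daily_visitors_per_variant)
    print(f"{n} visitors per variant -> about {days} days")

Whatever number comes out, round the duration up to whole weeks so that every day of the week is represented equally (Mistake 2).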
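
A note on Mistakes 6 and 7 (ignoring small gains, not always testing): small wins compound when you test continuously. The testing cadence and the 5% lift per winning test below are assumptions, but the arithmetic shows why “small” gains add up.

    # Compounding small wins over a year of continuous testing (hypothetical numbers).
    tests_per_year = 96                      # roughly two tests per week
    win_rate = 1 / 8                         # "only 1 out of 8 A/B tests drives a significant change"
    lift_per_win = 0.05                      # a modest 5% gain per winning test
    wins = round(tests_per_year * win_rate)  # about 12 winning tests
    cumulative = (1 + lift_per_win) ** wins - 1
    print(f"{wins} wins -> {cumulative:.0%} cumulative lift")   # roughly 80% over the year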
