This document discusses common problems with A/B testing and offers alternatives. It notes that many companies test pointless metrics that have no impact on revenue, use sample sizes too small to give reliable results, and stop tests too soon. Instead, it recommends user testing with 5-15 users to identify usability issues, analysing site copy, design, and speed, and running tests over multiple business cycles. The key message: only use A/B testing if you have enough traffic, and focus tests on metrics that directly influence revenue.
4. Reason #1: Pointless tests
"visitors spent 67% more time on my website after I changed my headline"
"23% more people click the red button"
"42% increase in email open rates when I included my brand in the subject line"
"adding more images got me 18% more downloads"
5. That's all interesting, but:
So what if you got more clicks?
What difference does it make if
someone spends 1 minute or 10 minutes
on your site?
Why does it matter how many people
open your email?
8. Before running a test,
use this checklist:
- What are we testing?
- What improvement are we trying to achieve?
- What difference will it make to our revenue?
- How will we know if it's been a success?
- What will we do if we don't find a winner?
(don't worry, the majority of tests won't give you a clear winner)
10. "People don't think how they feel, they
don't say what they think, and they
don't do what they say"
David Ogilvy
People are unpredictable. So the
more people you include in your
sample, the more confident you
can be that your results aren't
down to chance.
11. How to find out
how many you need
Use a sample size
calculator, like:
http://www.evanmiller.org/ab-testing/sample-size.html
12. Let's take a closer look at those figures.
Imagine your current page converts at 20%...
and you want your A/B test to detect a 5% improvement...
The calculator shows how many people need to have seen each version:
that's 50,510 people in total.
These figures mean you can be 95% sure of your result.
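As a cross-check on those calculator figures, here's a minimal sketch of the standard two-proportion sample-size formula in Python. The inputs are assumptions matching the example above: a 20% baseline conversion rate, a lift to 21% (a 5% relative improvement), 95% significance and 80% power (the defaults most calculators use). Exact results vary slightly between calculators.

```python
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_per_variant(p1, p2, alpha=0.05, power=0.80):
    """Visitors needed per variant for a two-proportion z-test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_beta = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

# 20% baseline, hoping to detect a lift to 21%:
n = sample_size_per_variant(0.20, 0.21)
print(n)  # roughly 25,600 per variant, i.e. around 51,000 visitors in total
```

Note how the result grows with the inverse square of the lift you want to detect: that's why spotting small improvements needs so much traffic.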
14. That's ok... you can still
optimise your website.
Just not through A/B testing.
Otherwise you risk making business decisions
based on the wrong data.
And you could end up removing what's
working on your website,
and keeping what isn't.
16. User testing
Get real live feedback
from real live humans
1) recruit 5–15 users
(they will catch 95% of the problems on your site)
2) set some tasks, like:
- show them your homepage for 8 seconds and then ask
them if they know what the site offers
- ask them to complete some tasks (find a category,
compare products, make a purchase – that sort of thing)
3) get their feedback
(look for common themes or phrases they use, see where
they had problems – and fix them)
17. Copy analysis checklist
Go through every page and make sure each one answers these questions:
- What is this website about?
Does the copy explain, in a way that shows a benefit to the reader and gives reasons to stick around?
- Can I trust it?
Are there testimonials, reviews, credit card logos, security logos?
- Is it clear what I can do here?
Am I meant to read, watch a video, make an enquiry, give my email address?
18. Design analysis checklist
Go through every page and make sure
each one answers these questions:
- What is the most important thing on this page?
Make sure the most important stuff (eg headline,
image of happy customer, list of benefits) stands
out.
- Is everything where I expect to see it?
The menu at the top, search box top-right, the
most important information high up the page, so
users don't have to scroll.
- Are there any distractions?
Do the pictures and layout help the user, or do they just look good without guiding them?
19. Record your visitors
Set up video recording and watch how your visitors interact with your site, using a tool like Hotjar.
20. Use Hotjar to learn:
- where your users click
- what they look at
- what is ignored or missed
- where users have problems on the page
- how far down the page they scroll
[Hotjar clickmap and scroll map screenshots - colours get cooler where fewer people scroll]
21. Test site speed
40% of users leave a site if it takes
longer than 3 seconds to load
Every second added to page load time
= 7% drop in conversions
If your site is slow, expect to
slide down the Google rankings.
22. Want to find out if
your website is slow?
Plug in your URL to tools like:
- https://tools.pingdom.com
(you can also dig into the Site Speed reports in Google Analytics.
These take you beyond the averages, and show the specific pages
that are slow)
- https://gtmetrix.com
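For a quick first check from your own machine, you can time the server's response in a few lines of Python. This is only a rough sketch: it measures time to the first byte of the response, not the full page load that the 3-second figure refers to, so treat the tools above as the real measurement.

```python
import time
import urllib.request

def time_to_first_byte(url, timeout=10):
    """Seconds until the server returns the first byte of the page body.

    Note: this is NOT full page load time (no images, CSS, or JavaScript),
    so it will understate what your visitors actually experience.
    """
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        response.read(1)  # wait for the first byte to arrive
    return time.perf_counter() - start

# elapsed = time_to_first_byte("https://example.com")
# print(f"{elapsed:.2f}s")
```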
25. If your repeat visitors arrive and see your
test page, they're likely to be curious and
explore more than usual.
This is dangerous.
26. That's the 'novelty effect' in action
At the start of an A/B test, results are always erratic.
Wait for your visitors to get used to your test version.
Then your results will settle down.
And you can start uncovering the treasure hidden within your data.
27. - Which seasons or times of the year are
traditionally busy?
28. 'When is payday?'
If your target audience is people with salaried jobs, they're more likely to be shopping around the end of the month.
31. Before deciding on a winner,
work out your business cycle.
Then run tests
over 2 cycles.
32. Know what's going on in your industry and how this
might affect your test results:
- Black Friday
An increase in visits from bargain hunters
- Seasonal trends
People shopping for Christmas or summer holidays
- Competitor goes bust
You then get an increase in visits from people who
may not have visited your site before
- Communication breakdown
Another department in your organisation publishes
a press release or starts a social media campaign
which leads to a sudden increase in visits
33. Don't worry if you don't find a winner
The majority of tests don't give a clear winner.
That's fine. You'll still find out what doesn't have an effect.
So you can try something else next time.
"I haven't failed. I've just found 10,000 ways that won't work."
Thomas Edison
34. And relax.
Thanks for staying until the end.
Here's a summary:
- Do A/B tests if you have enough traffic
- If you don't, do a 'best practice' analysis
- Carry out user testing
- Run tests for 2 business cycles
- Don't stop the test too early
- Don't worry if you don't find a winner
35. We've just scratched the surface.
So I hope you've found something useful among it all.
If you have any questions about the slides,
or you want me to go more in-depth about any of it,
get in touch via LinkedIn:
https://uk.linkedin.com/in/stevealphabet