SearchLove London 2018 - Dom Woodman - A year of SEO split testing changed how I thought SEO worked
If you asked a UX professional whether users prefer one image or two on a blog post, they'd tell you to test it — trying to second-guess users is foolish.
Yet for many companies, SEO has no testing at all, just endless reams of best practice and hand-waving. Last year I changed roles and got the chance to treat SEO differently, running over 50 tests across different websites. This session will give an insight into what worked, and just as importantly, what didn't.
If you don't have intent, bells and whistles fail.
5-star rich snippets increased traffic by 16% on the right site with the right intent. When the intent wasn't there, they appeared, but did nothing.
You don't need to argue when you can test.
When you have a solid testing framework and can build things quickly and easily, testing is easier than arguing and removes arguments from a relationship.
The same changes have different effects on different sites.
Really can’t emphasize this one enough.
Get a testing framework.
Most tests fail or are null.
Having a framework will help you move faster and find those wins.
Testing will improve your relationships. You'll spend less time arguing, and it creates a culture of experimentation.
You probably have some beliefs about what works or doesn't work that are simply wrong by chance. Re-test them.
Making changes to sections of pages
● Making SEO changes with tag managers
● Cloudflare edge workers
General useful posts on testing frameworks & velocity
● Hypothesis framework
● Running a weekly growth meeting
Do it yourself
How does split testing work?
● How does SEO split testing work?
● Pinterest - Demystifying SEO with experiments
● Etsy - SEO title tag testing
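The posts above explain the core idea: unlike user-facing A/B tests, SEO split tests bucket *pages* into control and variant groups (crawlers can't be cookied), make the change only on the variant pages, and compare traffic between groups. A minimal sketch of deterministic page bucketing — the function name, salt, and example URLs are hypothetical, not from any of the tools linked above:

```python
import hashlib

def bucket(url: str, salt: str = "seo-test-1") -> str:
    """Deterministically assign a page URL to control or variant.

    Hashing (url + salt) means the split is stable across deploys,
    and a different salt gives each experiment an independent split.
    """
    digest = hashlib.sha256((salt + url).encode()).hexdigest()
    return "variant" if int(digest, 16) % 2 == 0 else "control"

# Hypothetical set of template pages to split:
urls = [f"https://example.com/product/{i}" for i in range(1000)]
groups = [bucket(u) for u in urls]
print(groups.count("variant"), groups.count("control"))
```

In practice the groups should also be balanced on pre-test traffic, not just count, before the change goes live.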
Measuring SEO split tests
● Google's original causal impact paper
● A DIY tool for measuring SEO split tests
● A walkthrough of the R CausalImpact package