7. Does anyone in this room believe that the Earth
doesn't revolve around the Sun?
8.
9. The Earth (and everything
in the solar system,
including the Sun) revolves
around our system's
gravitational barycenter,
which is only sometimes
near the center of the Sun.
11. In 2009, Conversion
Rate Experts built us a
new landing page, and
increased our
subscribers by nearly
25%. What did they do?
Via CRE's Case Study
12. One of the most
commonly cited facts
about CRE's work is the
"long landing page."
13. The Crap Skeptic The Good Skeptic The Great Skeptic
Let's change our
landing page to
be a long one
right now!
We should A/B
test a long
landing page in
our conversion
funnel.
How do we know
page length was
responsible? What
else changed?
14. The Crap Skeptic The Good Skeptic The Great Skeptic
"I do believe sadly it's
going to take some
diseases coming back to
realize that we need to
change and develop
vaccines that are safe."
"Listen, all magic is
scientific principles
presented like 'mystical
hoodoo' which is fun,
but it's sort of
irresponsible."
"The good thing about
science is that it's true
whether or not you
believe in it."
15. In fact, we've changed our
landing pages numerous
times to shorter versions and
seen equal success. Length, it
would seem, was not the
primary factor in this page's
success.
17. Assumes one belief-reinforcing
data point is evidence enough
Doesn't question what's truly
causal vs. merely correlated
Doesn't seek to validate
18. Doesn't make assumptions about
why a result occurred
Knows that correlation isn't
necessarily causal
Validates assumptions w/ data
19. Seeks to discover the reasons
underlying the results
Knows that correlation
doesn't imply causality
Thoroughly validates, but doesn't
let imperfect knowledge stop
progress
23. Via Wordstream's What is a Good Conversion Rate?
Do Those Who Test More Really Perform Better?
24. Hmm... There's no correlation
between those who run more
tests across more pages and
those who have higher
conversion rates. Maybe the
number of tests isn't the right
goal.
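The "no correlation" observation on this slide is easy to sanity-check in code. Below is a minimal sketch of computing the Pearson correlation between tests run and conversion rate; all numbers are invented for illustration, since Wordstream's raw data isn't reproduced in the deck.

```python
# Minimal sketch: is there a linear relationship between how many A/B
# tests a team runs and its conversion rate? Numbers are made up;
# substitute your own analytics export.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

tests_run = [2, 5, 8, 12, 20, 35]                 # tests per quarter (made up)
conversion_rate = [2.1, 1.8, 2.4, 2.0, 2.2, 1.9]  # percent (made up)

r = pearson_r(tests_run, conversion_rate)
print(f"r = {r:.2f}")  # values near 0 suggest no linear relationship
```

An r near zero matches the "no correlation" reading above, though a flat r can still hide non-linear patterns, so a scatterplot is worth a look too.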
26. CONVERSION DECISION (it's a complex process)
Influencing factors: Trust, Word of Mouth, Likability, Design, Associations,
Amount of Pain, CTAs, UX, Effort Required, Process, Historical Experiences,
Social Proof, Copywriting, Timing, Discovery Path, Branding, Price
27. How do we know where our
conversion problems lie?
28. Ask Smart Questions to the Right People
Potential Customers Who Didn't Buy:
- Professional, demographic, & psychographic characteristics
- What objections did you have to buying?
- What would have made you overcome them?
Those Who Tried/Bought But Didn't Love It:
- Professional, demographic, & psychographic characteristics
- What objections did you have; how did you overcome them?
- What would have made you stay/love the product?
Customers Who Bought & Loved It:
- Professional, demographic, & psychographic characteristics
- What objections did you overcome; how?
- What do you love most? Can we share?
29. We can start by targeting
the right kinds of
customers. Trying to please
everyone is a recipe for
disaster.
30. Our tests should focus on
overcoming the
objections of the people
who best match our
customer profiles.
38. Via Visual Website Optimizer
A/B Test Results
They found that the version without the secure icon
had over a 400% improvement in conversions
compared to the version with the image.
[Note: results ARE statistically significant]
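For context on what "statistically significant" means here: a common way to check an A/B result like this one is a two-proportion z-test on the conversion counts. This is a hedged sketch with invented visitor and conversion numbers, since the case study's raw counts aren't reproduced in the deck.

```python
# Sketch of checking whether an A/B difference is statistically
# significant via a two-proportion z-test. All counts are invented.
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value) for the difference
    between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = two_proportion_z(conv_a=60, n_a=1000, conv_b=30, n_b=1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 is the usual significance bar
```

Note that significance only says the difference is unlikely to be chance on that site with that audience; it says nothing about whether the same change generalizes to your site.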
39. We need to remove the
security messages on our
site ASAP!
43. Via Kayak's Most Interesting A/B Test
A/B Test Results
"So we decided to do our own experiment about this
and we actually found the opposite, that when we
removed the messaging, people tended to book less."
- Vinayak Ranade, Director of Engineering for Mobile, KAYAK
66. There's a lot of nuance, but we
can certainly see how
messages sent at certain times
reach different sizes and
populations of our audience.
67. Comparing a tweet or share sent
at 9am Pacific against tweets
and shares sent at 11pm Pacific
will give us misleading data.
68. But, we now know three things:
#1 - When our audience is online
#2 - Sharing just once is suboptimal
#3 - To be a great skeptic (and
marketer), we should attempt to
understand each of these inputs with
similar rigor
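Input #1 above (when our audience is online) is straightforward to estimate from past share data. A hypothetical sketch that buckets click counts by posting hour; the timestamps and click figures are invented, and in practice you'd export (timestamp, clicks) pairs from your analytics.

```python
# Bucket past shares by posting hour to estimate when the audience
# engages. Sample data is made up for illustration.
from collections import defaultdict
from datetime import datetime

shares = [
    ("2014-03-01T09:15:00", 120),  # 9am share (made up)
    ("2014-03-01T23:05:00", 14),   # 11pm share (made up)
    ("2014-03-02T09:40:00", 95),
    ("2014-03-02T11:00:00", 80),
]

clicks_by_hour = defaultdict(list)
for ts, clicks in shares:
    clicks_by_hour[datetime.fromisoformat(ts).hour].append(clicks)

for hour, vals in sorted(clicks_by_hour.items()):
    print(f"{hour:02d}:00  avg clicks = {sum(vals) / len(vals):.1f}")
```

Even this crude bucketing makes the 9am-vs-11pm comparison problem from the previous slide visible: the two hours reach audiences of very different sizes.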
69. Do they work? Can we make
them more effective?
Share Buttons
82. Testing a small number of the
most impactful social button
changes should produce
enough evidence to give us a
direction to pursue.
83. Buzzfeed & OKTrends share several
unique qualities:
1) They have huge amounts of
social traffic
2) Social shares are integral to their
business model
3) The content they create is
optimized for social sharing
84. Unless we also fit a number of
these criteria, I have to ask again:
Is this the most meaningful test
we can perform right now?
85. BTW - it is true that testing social
buttons can coincide with a lot of
other tests (since it's on content
vs. the funnel), but dev resources
and marketing bandwidth
probably are not infinite
86. Does it still work better
than standard link text?
Anchor Text
87. Psh. Anchor text links
obviously work. Otherwise
Google wouldn't be
penalizing all these sites for
getting them.
88. It has been a while since we've
seen a public test of anchor text.
And there's no way to know for
sure how powerful it still is.
89. Testing in Google is very, very hard.
There are so many confounding
variables that we'd have to choose our
criteria carefully and repeat the test
multiple times to feel confident of any
result.
90. Test Conditions:
1) Three word, informational keyword phrase with relatively light
competition and stable rankings
2) We selected two results ("A" and "B"), ranking #13 ("A") and
#20 ("B") in logged-out, non-personalized results
3) We pointed links from 20 pages on 20 unique, high-DA, high-trust,
off-topic sites at both "A" and "B"
91. A) We pointed 20 links from 20
domains at this result with anchor
text exactly matching the query
phrase
B) We pointed 20 links from the
same 20 pages as "A" to this URL
with anchor text that did not contain
any words in the query
[Graphic: rankings ladder from #11 to #20]
94. While both results moved up the
same number of positions, it's
almost certainly the case that #13
to #9 was against more serious
challengers, and thus anchor text
would seem to make a difference.
That said, I'd want to repeat this a
few times.
95. Princess Bubblegum and I are in
agreement. We should do the test
at least 2-3 more times keeping as
many variables as possible the
same.
96. Early Results from a Second Test:
1) Three word, informational keyword phrase with relatively light
competition and stable rankings
2) We selected two results ("A" and "B"), ranking #20 ("A") and
#14 ("B") in logged-out, non-personalized results
3) We pointed links from 20 pages on 20 unique, high-DA, high-trust,
off-topic sites at both "A" and "B"
97. B) We pointed 20 links from 20
domains to this URL with anchor
text that did not contain any words
in the query
A) We pointed 20 links from the
same pages/domains at this result
with anchor text exactly matching
the query phrase
[Graphic: rankings ladder from #11 to #20]
99. Good thing we
tested!
This is looking more
conclusive, but we
should run at least
one more test.
Anchor text =
rankings. Stick a
fork in it!
100. Does it influence Google's non-
personalized search rankings?
Google+
101. Good discussion about Google+ correlations in this post
Google+ is just too damn high.
102. From a comment Matt Cutts left on the blog post:
"Most of the initial discussion on this thread seemed to take from the blog
post the idea that more Google +1s led to higher web ranking. I wanted to
preemptively tackle that perception."
103. To me, that's Google working really hard to NOT say "we don't use any data
from Google+ (directly or indirectly) at all in our ranking algorithms." I would
be very surprised if they said that.
104. Google explicitly SAID +1s
don't affect rankings. You
think theyâd lie so blatantly?
As if.
105. The correlations are surprisingly
high for something with no
connection. There have been
several tests showing no result,
but if all it takes is a Google+ post,
let's do it!
106. First, remember how hard it is to
prove causality with a public test
like this. And second, don't let
anything but consistent, repeatable,
provable results sway your opinion.
109. 42 minutes later, after ~30
shares, 40 +1s, and several
other G+ accounts posting
the link, the target moved up
to position #23
[Graphic: rankings ladder from #21 to #26]
114. Could Google be donking up the test?
Sadly, it's impossible to know.
115. GASP!!! The posts did move
the result up, then someone
from Google must have
seen it and is messing with
you!!!
116. Sigh... It's possible that Jenny's
right, but impossible to prove. We
don't know for sure what caused
the initial movement, nor can we
say what's causing the weird
personalized results.
117. More testing is needed, but how
you do it without any potential
monkey wrenches is going to be a
big challenge.
That said, remember this:
119. If I were Google, I wouldn't use
Google+ activity by itself to rank
anything, but I would connect G+
to my other data sources and
potentially increase a page's
rankings if many pieces of data
told a story of engagement &
value for visitors.