The Secret to Successful Experimentation
Robin Pam
Product Marketing, Optimizely
robin.pam@optimizely.com
Nick So
Director of Strategy, WiderFunnel
nick.so@widerfunnel.com
“Our decisions are not made
in the old waterfall way that
required a long cycle time.”
Michelle Peluso, CMO
On IBM’s Agile Marketing culture
[Chart: experimentation velocity (0 to 10,000 experiments) vs. maturity, progressing from Experimentation Hero to Experimentation Program to a Culture of Experimentation]
IBM transforms into a culture of experimentation
Early win: 112% increase in sign-ups for Watson Community
Running at scale: 20X increase in number of experiments and widespread adoption
So, how do you actually achieve
experimentation maturity?
A quick story
A client’s journey from just getting started to an ongoing, scalable, organization-wide experimentation program.
The situation when we started
▪ High-volume website with a one-person optimization team.
▪ Mish-mash of technologies for digital experimentation:
Ad server, Optimizely, and Adobe Analytics.
▪ ‘Hunt and peck’ optimization planning.
▪ 4-8 experiments per month.
▪ Little confidence in test results.
▪ Unknown optimization opportunity.
They needed:
A growth program that produces a never-ending
stream of profitable insights.
Here’s where DMV.org is now
▪ The right tech – with Adobe and Optimizely integrated.
▪ The right process – the Infinity Optimization Process.
▪ The right team – integrated DMV.org and WiderFunnel teams.
▪ The right velocity – now running 500+ experiments per year.
▪ The right results – revenue doubled four years in a row.
So… how do I get there?
Source: https://www.gartner.com/technology/research/methodologies/hype-cycle.jsp
Common mistakes:
▪ Using landing page templates
▪ Following conversion “best practices”
▪ Choosing tools before strategy / tool-driven testing
▪ Running no experiments
▪ Starting personalization without a personalization strategy
▪ Misunderstanding statistical significance
▪ Polluting test results with improper technical setup
▪ Little cognitive bias training
▪ No awareness of Design of Experiments (DOE)
The most common mistake:
Implementing “gut feeling” ideas without
framework thinking or experiment validation.
Hype phase
▪ Tool-triggered testing
▪ Ad-hoc testing
▪ Low-hanging fruit
▪ ‘Best practices’ testing
▪ Isolated testing team
▪ Concluding tests before 7 days or
running for 6+ weeks
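The last habit, concluding tests before 7 days or letting them run for 6+ weeks, usually traces back to not sizing the test before launch, the same root cause as the “misunderstand statistical significance” mistake above. The deck does not prescribe a calculation, but as a rough illustration, here is a minimal Python sketch using the standard two-proportion normal approximation; the baseline rate and target lift are hypothetical numbers, not figures from the deck.

    import math

    def visitors_per_variation(baseline_rate, min_relative_lift,
                               z_alpha=1.96, z_beta=0.84):
        """Rough visitors needed per variation for a two-proportion A/B test.

        Standard normal-approximation formula; z_alpha = 1.96 is a two-sided
        5% significance level, z_beta = 0.84 corresponds to 80% power.
        """
        p1 = baseline_rate
        p2 = baseline_rate * (1 + min_relative_lift)
        variance = p1 * (1 - p1) + p2 * (1 - p2)
        n = (z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2
        return math.ceil(n)

    # Example: 3% baseline conversion rate, aiming to detect a 10% relative lift.
    print(visitors_per_variation(0.03, 0.10))  # ~53,000 visitors per variation

Comparing an estimate like this against weekly traffic shows whether a test can realistically reach significance in a sensible window, which is why both ending tests inside a week and letting them drag on for 6+ weeks are flagged here.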
▪ Initial wins from low-hanging fruit
▪ Excitement to scale rapidly
▪ Interest and hype from other
departments
▪ Reporting results based on
secondary conversion metrics
▪ Logjam of ideas
▪ No idea where or what to test first
▪ Inability to scale
▪ Diminishing wins and ROI
“We had been struggling with doing optimization
ourselves internally and with other partners but
getting no meaningful results. We needed help
from someone who could bring a structured
approach to our optimization efforts.”
Avoid the trough of disillusionment:
The importance of framework thinking.
The Infinity Optimization Process
Yin: Creative, Inspired, Fuzzy, Qualitative
Yang: Logical, Proven, Solid, Quantitative
The LIFT Model
The right Value Proposition
communicated with Clarity
in a context of Relevance
avoiding Anxiety and Distraction
and adding high Urgency.
LIFT Analysis
▪ CLARITY: Headline is long, wrapping to 3 lines.
▪ RELEVANCE: “Try the Demo” is not referenced in the headline on the demo page.
▪ DISTRACTION: Unnecessary copy persuading visitors to “try the demo”.
▪ CLARITY: Details of what the demo includes are hidden within the block of copy.
▪ URGENCY: No mention of “instant access”.
▪ CLARITY: No mention of “Free” on the page.
LIFT Analysis
▪ DISTRACTION: “Try the Demo” CTA still at the top of the page.
▪ DISTRACTION: Unnecessary text block.
▪ CLARITY: First form field is not auto-focused.
▪ CLARITY: Are all fields required?
▪ CLARITY: “Submit” is the worst possible CTA button copy.
▪ RELEVANCE: Inconsistent CTA design compared to the previous step.
Winning merged page design
How do I decide where to start?
It’s as easy as PIE
PIE Prioritization Framework
Potential: How much improvement can be made?
Importance: How much impact can these improvements have on our primary business goal?
Ease: How easy would it be to actually make these improvements?
“Zone” prioritization: Where to experiment
Lift Zone       Potential   Importance   Ease   PIE Score
Lift Zone #1        8            7         7       7.3
Factors per dimension:
Potential: Web analytics; Heuristic & first impression analysis; Voice of customer
Importance: Revenue impact; Experimentation velocity; Cost of acquisition
Ease: Technical; “Political”
Hypothesis prioritization: What to experiment on
Hypothesis       Potential   Importance   Ease   PIE Score
Hypothesis #1        3            7         9       6.3
Factors per dimension:
Potential: Evidence from Digital Analytics; Evidence from User Testing & Research; Based on Persuasion Principles
Importance: Conversion KPI; Revalidates Past Insight; Informs Business Insight
Ease: Technical; “Political”
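In both prioritization tables the PIE score is simply the average of the three ratings: 8, 7 and 7 average to 7.3, and 3, 7 and 9 average to 6.3. Below is a minimal sketch of ranking a backlog this way; the first two entries reuse the ratings from the tables above, and “Hypothesis #2” is a hypothetical entry added only for illustration.

    from dataclasses import dataclass

    @dataclass
    class Candidate:
        name: str
        potential: float   # how much improvement can be made?
        importance: float  # impact on the primary business goal
        ease: float        # how easy would it be to make the improvement?

        @property
        def pie_score(self) -> float:
            # PIE score = average of the three ratings, as in the tables above
            return round((self.potential + self.importance + self.ease) / 3, 1)

    backlog = [
        Candidate("Lift Zone #1", potential=8, importance=7, ease=7),   # 7.3
        Candidate("Hypothesis #1", potential=3, importance=7, ease=9),  # 6.3
        Candidate("Hypothesis #2", potential=6, importance=8, ease=4),  # hypothetical
    ]

    # Highest PIE score first: that is where to experiment first.
    for candidate in sorted(backlog, key=lambda c: c.pie_score, reverse=True):
        print(f"{candidate.name}: {candidate.pie_score}")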
Business Prioritization
Inputs:
▪ Unique visitors / users
▪ Identified conversion goal
▪ Conversion rate
▪ Value per conversion
Business Prioritization
Inputs:
▪ Unique impressions
▪ Identified conversion goal
▪ Conversion rate
▪ Value per conversion
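The deck lists these inputs without an explicit formula. One common way to combine them, shown here as an assumption rather than the deck’s own method, is expected impact = traffic × conversion rate × value per conversion, scaled by the relative lift you expect; all numbers in the sketch below are hypothetical.

    def expected_monthly_impact(unique_visitors, conversion_rate,
                                value_per_conversion, assumed_relative_lift):
        """Rough dollar impact per month for an optimization area.

        Combines the inputs listed above with an assumed relative lift; the
        weighting is an illustration, not a formula stated in the deck.
        """
        baseline_value = unique_visitors * conversion_rate * value_per_conversion
        return baseline_value * assumed_relative_lift

    # Example: 200,000 visitors/month, 2% conversion rate, $40 per conversion,
    # assuming a 5% relative lift -> $8,000/month of expected impact.
    print(expected_monthly_impact(200_000, 0.02, 40, 0.05))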
Achieving organizational experimentation maturity
Foundations for Experimentation Maturity
▪ Optimization is a shared company priority, with dedicated resources.
▪ Rigorous process guides ideation, prioritization, execution & analysis.
▪ Optimization strategy and metrics are directly linked to key company metrics.
▪ Right tech stack and set development process for testing & implementation.
▪ Defined system for idea gathering and results sharing across the organization.
▪ Each experiment is a valued resource (win or lose), providing quality insights and results.
Checklist: Identify impactful optimization success metrics
▢ Conversion metrics have a direct impact on revenue or on major business goals.
▢ There is a dollar value attributed to a successful conversion.
▢ Conversion metrics are prioritized in order of importance to the business.
▢ Conversion “success” is clearly defined.
▢ Overall optimization program success is linked to operational metrics.
Checklist: Idea & Hypothesis Gathering
▢ A system enabling employees to submit hypotheses and test ideas.
▢ Set objective evidence factors for hypothesis prioritization.
▢ List of agreed-upon conversion metrics.
▢ Hypothesis prioritization framework in place and understood.
Checklist: Results Sharing & Dissemination
▢ Executive summary reports and quarterly
reviews.
▢ Accessible real-time data reports.
▢ Posted results reports or progress graphs
displayed openly.
▢ Internal results and insights database.
▢ Cross-departmental results & insights
presentations.
“Experimentation is important because
change is the only constant. Whether
it’s consumer behavior or new technology,
things are always dynamic in our world.
You can never assume that what worked
yesterday will work tomorrow.”
- Gretchen Gary, Product Manager
Thank You!
