Last year the growth team at Box revisited a design from their Plans page, a page they had been excited to redesign in the past but whose redesign had failed to show the measurable improvements needed in key business metrics.
With a new hypothesis and experiment implementation, the team found that the redesign not only delivered more sign-ups, it also significantly improved their annual recurring revenue (ARR). In this webinar, Box's Renny Chan shares the team's philosophy of revisiting failed tests with a fresh perspective.
In this webinar we share:
- A new process for evaluating failed tests in your program
- How to think about the win/loss rate of your program
- New ideation tactics that will inform the way you create new experiments
5. Our Company & Our Solution

OUR COMPANY
• Digital experience optimization: digital products, commerce & campaigns
• Up to 5X increase in yield: revenue, share of wallet, funnel conversion, risk mitigation, ops efficiency
• Partner of choice: work with leading global enterprises & “digital disruptors,” including 26 of the Fortune 100

OUR SOLUTION
• Digital experimentation platform: next-gen “test and learn” system
• Replaces digital guesswork with evidence-based optimization
• Speeds innovation & optimization: single platform for marketing & product teams
• Best-in-class stats & machine learning
• Consumer-grade usability
• Enterprise program management & professional services
6. Experimentation is the Next Great Business Transformation

“Our success is a function of how many experiments we do per year, per month, per week, per day.”

“Instead of saying ‘I have an idea,’ what if you said ‘I have a new hypothesis, let’s go test it.’”

“Our company culture encourages experimentation and free flow of ideas.”

“One of the things I’m most proud of, and I think what is the key to our success, is this testing framework we’ve built.”

Jeff Bezos, Larry Page, Mark Zuckerberg, Satya Nadella

“The Surprising Power of Online Experiments”
7. Optimizely X Unlocks the Experimentation Best Practices of the World’s Greatest Digital Companies

• 10x more experiments: consumer-grade usability and open data integration
• Maximum yield of business value: UX and feature-level experiments and personalization at every digital touchpoint
• Enterprise-wide management & governance: captures, governs and shares ideation, analyses & results
• World’s most trusted outcomes: best-in-class Stats Engine, fast time to results via ML
• Accelerates digital innovation: speeds dev ops & deployment, de-risks continuous feature delivery, and ensures success of new features; unifying flagging & experiments enables controlled testing of new features while maintaining high performance

[Platform diagram: an ideation-to-sharing loop (Ideate, Manage, Store, Govern, Analyze, Share) surrounded by platform capabilities (open data integration, security & compliance, Stats Engine, Stats Accelerator, consumer-grade usability, APIs & developer tools, feature flags, open-source SDKs) and products (cross-channel Full Stack Experimentation, Personalization, Recommendations, Web Experimentation) spanning products, commerce & campaigns]
8. We’re Proud to Work With Great Global Enterprises

26 of the Fortune 100 have chosen Optimizely to drive their digital experience.
9. Our Products and Services Take You on Your Experimentation Journey

A five-level maturity model, plotting velocity/volume of experiments against business value:
LEVEL 1: Executional Start
LEVEL 2: Foundational Growth
LEVEL 3: Cross-Functional Advancement
LEVEL 4: Operational Excellence
LEVEL 5: Culture of Experimentation
11. Turning failed tests into big wins
Renny Chan, Sr. Manager, Growth & Monetization
12. Agenda
/ Who am I?
/ Reviewing failed tests
/ How to get team buy-in?
/ Implementation and results
/ Team structure at Box
/ Generating new ideas
/ Key takeaways
13. What am I doing at Box?
Where I came from and what I’m doing now

• Previous experience running A/B testing at Intuit Canada and at BigCommerce
  o Intuit Canada didn’t have an A/B testing roadmap
  o BigCommerce had no A/B testing framework
  o Built momentum around testing at both companies
• Currently lead Growth & Monetization at Box
14. Why review old tests?
Always take time to go back and look at how old tests performed

• Evaluate and try to understand why the test failed
  Some questions to ask: Bad implementation? Bad audiences? Desktop vs. mobile? Or a wrong hypothesis?
• Determine if the test hypothesis is still valid
• Build a case for running a new iteration of the test
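The review questions above amount to a simple checklist: re-run a test only when the hypothesis still holds but the execution was flawed. A minimal sketch of that checklist as a data structure; the `FailedTestReview` class and its fields are illustrative, not Box's actual process:

```python
from dataclasses import dataclass

@dataclass
class FailedTestReview:
    """One failed test under review, mirroring the questions above."""
    name: str
    hypothesis: str
    clean_implementation: bool   # Was the implementation sound?
    right_audience: bool         # Were the right audiences targeted?
    segments_reviewed: bool      # Were desktop vs. mobile results split out?
    hypothesis_still_valid: bool

    def worth_rerunning(self) -> bool:
        # Re-run when the hypothesis still holds but at least one
        # execution check failed.
        execution_was_clean = (self.clean_implementation
                               and self.right_audience
                               and self.segments_reviewed)
        return self.hypothesis_still_valid and not execution_was_clean

# Roughly how the Box pricing-page test described later would look:
review = FailedTestReview(
    name="Show Business Plus on pricing page",
    hypothesis="A 4th, higher-value plan card increases ARR",
    clean_implementation=False,  # buried inside a full page redesign
    right_audience=True,
    segments_reviewed=True,
    hypothesis_still_valid=True,
)
print(review.worth_rerunning())  # True: valid hypothesis, flawed execution
```

The point of the structure is to force an explicit answer to each review question before anyone argues for a re-run.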
15. Why did we re-run our test?
Reasoning and case

• Our pricing page used to display only 3 plan cards
  A 4th plan, Business Plus, wasn’t shown
  It was a victim of a poorly implemented test: it was included in a complete page redesign, so there were a significant number of changes all at once
• The hypothesis was still valid
• We wanted to ensure the new test was clean
• Business Plus was being sold by our sales agents at a pretty good rate
16. How to convince the team?
Re-running failures can be tough

• Explain the factors surrounding why the test failed previously
• Present data supporting why re-testing would be a good idea
• Understand and be empathetic to pushback
17. How did we do it? What did it look like? What did we measure?

• We took the current pricing page and added the Business Plus plan
• Hypothesis: by adding the Business Plus plan we would increase our ARR, since it’s a higher-value plan. We expected a decline in Business plan signups
• We ran this test for 1 month
• We saw a decrease in Business plan conversions, but the increase in Business Plus conversions more than made up for the difference
• We closed the test at 1.57% traffic-to-trial vs. 1.50% for Business only, and increased annual recurring revenue (ARR) by 11%
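A 1.57% vs. 1.50% traffic-to-trial result can be sanity-checked with a standard two-proportion z-test. A minimal sketch in Python; the visitor counts below are assumptions for illustration (Box did not share sample sizes), and note that at these assumed sizes a 0.07-point lift would not reach significance, which is why the real sample size behind a readout matters:

```python
import math

def two_proportion_ztest(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value

# Assumed counts: 100,000 visitors per arm (illustrative only).
# Control converts at 1.50%, variant (with Business Plus) at 1.57%.
z, p = two_proportion_ztest(conv_a=1500, n_a=100_000,
                            conv_b=1570, n_b=100_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # not significant at this assumed size
```

In practice an experimentation platform's stats engine does this (and more) for you; the sketch just shows what the raw comparison involves.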
18. Team structure at Box
Website A/B testing lives on the Digital Team

• A/B Testing PM: responsible for deployment and test creation
• Growth/Monetization Marketer: ideation and champion for various tests
• Engineering Team: makes the test a reality and deploys winners
• Analytics Team: test measurement
19. Benefits of the team structure at Box
Where I would start building a team

• Well-defined roles and responsibilities
  This has helped with testing velocity
• It’s great to have a team of SMEs who can execute in their areas of expertise
• Having a broader team has helped us come up with testing ideas I hadn’t considered before
20. Ideation
How to come up with new ideas

• Research, research, research
  Look at other companies in your industry and what they are testing
  Read behavioral psychology books on customer behavior
• Set up brainstorming sessions with various team members
  Outside team members bring different perspectives that can be new and interesting
• Spend time going through your own flow
  Really understand what makes you click on different CTAs
• Do user research
• Spend time being empathetic, learning about customer pain points and what improvements would make the experience better for them
22. Examples of ideas we’ve tested or are testing
A few of our ideas

• Pricing plan re-ordering

23. Examples of ideas we’ve tested or are testing (continued)

• Moving the “most popular” banner
24. Lessons learned
How this can help you with your testing program

• Do not provide readouts early in the testing process
• Run tests that are as radical as possible
  Prioritize big- and medium-impact tests; don’t run a bunch of small tests
• Try to make sure tests are as clean as possible
• Review tests and results with different teams
• Have fun!
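The first lesson, no early readouts, guards against the well-known "peeking" problem: checking a fixed-horizon significance test repeatedly inflates the false-positive rate. A small simulation sketch of an A/A test (no real difference between arms) checked after every traffic batch; all the traffic numbers below are made up for illustration:

```python
import math
import random

def p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test p-value."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 1.0
    z = (conv_b / n_b - conv_a / n_a) / se
    return math.erfc(abs(z) / math.sqrt(2))

random.seed(42)
RUNS, CHECKS, BATCH, RATE = 500, 10, 500, 0.015  # all made-up numbers

peeking_hits = final_hits = 0
for _ in range(RUNS):
    ca = cb = na = nb = 0
    stopped_early = False
    for _ in range(CHECKS):
        # Both arms draw from the same conversion rate: any "win" is noise.
        ca += sum(random.random() < RATE for _ in range(BATCH))
        cb += sum(random.random() < RATE for _ in range(BATCH))
        na += BATCH
        nb += BATCH
        if p_value(ca, na, cb, nb) < 0.05:
            stopped_early = True  # a peeker would have called a "winner"
    peeking_hits += stopped_early
    final_hits += p_value(ca, na, cb, nb) < 0.05

print(f"false positives with peeking:   {peeking_hits / RUNS:.1%}")
print(f"false positives, final readout: {final_hits / RUNS:.1%}")
```

The peeking rate comes out several times higher than the roughly 5% you get from a single readout at the planned end of the test, which is the statistical reason behind the "no early readouts" rule (sequential-testing methods exist precisely to make continuous monitoring safe).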