Experiment Design
Moving from ideas to business
Dylan Evans
Senior Design Lead for Digital User Experience
November 24th 2016
© Philips 2016
Twitter @veluuria
Where I started…
Moving from fields to offices
There is no such thing
as a failed experiment...
…only experiments with
unexpected outcomes.
Buckminster Fuller
Why?
On Experimentation
In a lot of ways building a company is
like following the scientific method. You
try a bunch of different hypotheses, and
if you set up the experiments well, then
you kind of learn what to do…
We invest in this huge testing
framework. At any given point in time,
there’s not just one version of Facebook
running in the world. There’re probably
tens of thousands of versions running
because engineers here have the power
to try out an idea and ship it to maybe
10,000 people or 100,000 people. And
then they get a readout.
As humans, we:
1. Overestimate the probability of success
2. Inflate the impact of a success
3. When successful, assume we know what made it successful
If we’re so
biased,
how do we
inform the
decision to
invest?
Validated
learning…
… is a process by which we learn by
experimenting with an idea and
measuring it to validate its effect.
By using validated learning
we can create the best
business recommendation
based on how it touches:
• the consumer
• the business
A point about learning…
• Unconscious Incompetence: ignorance is bliss
• Conscious Incompetence: I'm not doing it right!
• Conscious Competence: I can do it if I try
• Unconscious Competence
To learn, we must experiment
Experimenting…
…is about finding what works best, and quickly.
…creates the opportunity to explore multiple directions, choose the best and refine it.
Experimenting helps…
Discover fatal flaws early, before all the time and money have been spent
Experimenting helps…
Reduce the risk of keeping things untested and…
Experimenting helps…
Overcome opinion-led decision making by HiPPOs (Highest Paid Person's Opinion)
How?
Each experiment
tests a falsifiable
hypothesis.
Focus on speed
for faster
validated learning.
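To make "falsifiable" concrete, a hypothesis can be written as a measurable claim with a target and a deadline, so the experiment can prove it wrong. A minimal sketch follows; the field names and example figures are invented for illustration, not taken from the deck.

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    """A falsifiable claim with a measurable pass/fail condition."""
    belief: str          # what we believe to be true
    metric: str          # how we will measure it
    target: float        # the threshold that confirms the belief
    deadline_days: int   # how long we give the experiment

# Hypothetical example - the belief and numbers are placeholders.
h = Hypothesis(
    belief="Visitors will sign up for the new proposition",
    metric="landing-page sign-up rate",
    target=0.05,         # at least 5% of visitors sign up
    deadline_days=14,
)

def is_confirmed(observed_rate: float, hyp: Hypothesis) -> bool:
    """The hypothesis is falsified whenever the observed rate misses the target."""
    return observed_rate >= hyp.target
```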
Experiment Design
Why we need to experiment
• As humans, we:
  1. Overestimate the probability of success
  2. Inflate the impact of a success
  3. When successful, assume we know what made it successful
• Changing environment
Typical graph for development
[Chart: amount vs. time, from planning to launch - the options for success fall while the cost of change rises]
Costs increase, options decrease
The first principle is that you
must not fool yourself
- and you are the easiest
person to fool.
Richard Feynman
Image from https://grasshopperherder.com/assumption-vs-hypothesis-to-the-death/
hypothesis
Pronunciation:
/hīˈpäTHəsəs/
NOUN
A supposition or proposed
explanation made on the
basis of limited evidence as
a starting point for further
investigation
assumption
Pronunciation:
/əˈsəm(p)SH(ə)n/
NOUN
A thing that is accepted
as true or as certain to
happen, without proof.
Break assumptions down into hypotheses and experiments to gain coverage
[Diagram: Assumption 1 breaks down into Hypotheses 1-4, which are tested by Experiments 1-3]
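One way to check the coverage this breakdown gives you is to confirm that every hypothesis is tested by at least one experiment. A minimal sketch, with purely illustrative names:

```python
# Map each experiment to the hypotheses it covers (illustrative names only).
experiments = {
    "Experiment 1": {"Hypothesis 1", "Hypothesis 2"},
    "Experiment 2": {"Hypothesis 3"},
    "Experiment 3": {"Hypothesis 3", "Hypothesis 4"},
}
hypotheses = {"Hypothesis 1", "Hypothesis 2", "Hypothesis 3", "Hypothesis 4"}

covered = set().union(*experiments.values())
uncovered = hypotheses - covered
print("Uncovered hypotheses:", uncovered or "none - the assumption is fully covered")
```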
It’s child's play
Children iterate
towards a solution
through trial and error
Your
journey
Incrementing vs. Iterating
The Experimental Design iterative learning loop: Idea → Business
1. Specify hypothesis: state what we believe to be true and required for the offer's success.
2. Design experiment & actions to be taken with output: choose a methodology, define the measure, and set a target and actions.
3. Run test and take actions: run the test, capture the outcomes and follow the actions to be taken.
Prioritise! What will kill the proposition first?
If the hypothesis is confirmed, we persevere and GO.
If falsified, we pivot and change the proposition.
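The loop can be read as plain control flow: specify the hypothesis, agree the decision rule up front, run the test, then persevere or pivot. A minimal sketch, assuming the experiment returns a single measured value (all names and numbers are illustrative):

```python
def run_learning_loop(hypothesis: str, target: float, run_experiment) -> str:
    """One pass of the loop: specify, design the decision rule, run, act on the outcome."""
    # 1. Specify the hypothesis (what must be true for the offer to succeed).
    print(f"Testing: {hypothesis}")

    # 2. Design the experiment: the measure, the target, and the actions for each outcome.
    measured = run_experiment()

    # 3. Run the test and follow the pre-agreed actions.
    if measured >= target:
        return "GO: persevere with the proposition"
    return "PIVOT: change the proposition and re-test"

# Hypothetical usage with a stubbed experiment that 'measures' a 4% conversion rate.
decision = run_learning_loop(
    "5% of visitors will sign up for the beta",
    target=0.05,
    run_experiment=lambda: 0.04,
)
print(decision)
```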
Experiment along the journey
Check if the performance matches the promise
Idea → Business: many, many iterations
Exp 1: Video · Exp 2: Advert (A/B) · Exp 3: Web Page (A/B) · Exp 4: Paper Prototype
Sequence experiments to test the future experience
1. Placed Facebook ads
2. Clicked through to landing page
3. Sign-up
4. Recruited for beta product test
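Sequencing the experiments like this produces a funnel, and the step-to-step conversion rates show where the performance stops matching the promise. A small sketch with made-up counts:

```python
# Hypothetical counts for each step of the sequenced experiments.
funnel = [
    ("Saw Facebook ad", 100_000),
    ("Clicked through to landing page", 2_500),
    ("Signed up", 300),
    ("Recruited for beta product test", 40),
]

# Conversion rate from each step to the next.
for (prev_name, prev_n), (name, n) in zip(funnel, funnel[1:]):
    print(f"{prev_name} -> {name}: {n / prev_n:.1%}")
```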
Prototypes are the engine of experimental design
1. Making stuff deepens our thinking and drives ACTION.
2. Having stuff means we can share with people who will "get it" more easily.
3. We can iterate and make better development decisions.
"If you want to make something great, start making." (Tom Kelley, IDEO)
The purpose of a
Minimum Viable
Proposition (MVP) is to
maximize learning per
euro we spend*.
* using customers and not testers
MVPs are prototypes that customers believe to be real
[Diagram: prototypes, MVPs and 1.0 propositions plotted by our perception (learning tools vs. propositions on the market) and consumer perception ("test" products vs. real)]
Experts can guide you to the right experiment design
[Map of methods arranged from low fidelity (faster) to high fidelity (slower), spanning "buy" and "use" tests, mixing classic methods with recent digital-enabled methods]
Methods include: Wizard of Oz, Concierge, Imposter Judo, Analog/Physical, Dry-Wallet, High Hurdle, Video Trailer, Crowd-Funding, Pitch, Landing page, Qualitative interviews / groups, Survey, Web Prototype, Paper Mockup / POP, Keynote / InVision Prototype, Online Ads, TypeForm Landing, Native OS Prototype, Role Play, Quantitative product usage tests, Technical / sensory testing, BASES / Concept test, and Shop alongs.
Example
journey
Experiment along the journey
Check if the performance matches the promise
Idea → Business: many, many iterations
Exp 1: Video (concept video)
When we involve
real customers, we
need to engage
them through
storytelling
1. Create a story
2. Make the video
3. Fix the video
Too complex? A simple structure is your friend: e.g. try a four-act structure.
• Confrontation/tension
• Enlightenment/Change
• Exploration
• Closure
Experiment along the journey
Check if the performance matches the promise
Idea → Business: many, many iterations
Exp 1: Video · Exp 2: Advert (A/B) · Exp 3: Web Page (A/B)
A/B test: big directions
In the early stages, an A/B test is not one small variable vs. another small variable (aka optimisation).
We have to find and answer the big directions before we can optimise.
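Even when comparing big directions rather than tweaks, the result still has to be read against chance. A minimal sketch of a two-proportion z-test for comparing two variants; the visitor and conversion counts are hypothetical:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for 'variants A and B convert at the same rate'."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical numbers: landing page A vs. B, each shown to 1,000 visitors.
p = two_proportion_z_test(conv_a=60, n_a=1000, conv_b=85, n_b=1000)
print(f"p-value: {p:.3f}")  # below 0.05 suggests a real difference between the directions
```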
Choose the best channel for your communication
http://www.growthtribe.io/blog/brass-method-your-ultimate-guide-to-prioritising-which-customer-acquisition-channel-to-test-first/
Facebook isn’t always the right channel to reach
your audience.
Analytics
How many & when?
- e.g. Google Analytics
- e.g. Advert analytics
What did they do?
- e.g. Hotjar
n.b. The numbers never match
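Because different tools count differently, the absolute numbers from, say, an ad platform and your site analytics will disagree. A small sketch of tracking the gap between two hypothetical sources, so you can trust trends even when totals diverge:

```python
# Hypothetical daily visit counts reported by two different analytics tools.
ad_platform = {"2016-11-21": 540, "2016-11-22": 610, "2016-11-23": 480}
site_analytics = {"2016-11-21": 472, "2016-11-22": 538, "2016-11-23": 431}

for day in ad_platform:
    a, b = ad_platform[day], site_analytics[day]
    print(f"{day}: ad platform {a}, site analytics {b}, gap {abs(a - b) / a:.0%}")
# The absolute numbers disagree; what matters is that the gap stays roughly stable,
# so trends and comparisons between variants remain meaningful.
```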
Experiment along the journey
Check if the performance matches the promise
Idea → Business: many, many iterations
Exp 1: Video · Exp 2: Advert (A/B) · Exp 3: Web Page (A/B) · Exp 4: Paper Prototype
Concierge MVP
The team identified an advice model which appears to be very effective.
[Table of the advice model]
1st consult (skills): basic instruction one, basic instruction two, gauge skill level, introduction to new skill, terminology understanding, surprising instruction
2nd consult (habit): remedy one, remedy two, advanced instruction, error strategy, confront habit
3rd consult (last resort): talcum powder (messy but it works)
Paper
Prototype
[Image: two sketches, one labelled Good and one labelled Bad]
While the one on the left looks nicer, it is not usable.
Iterate through your sketches, throw lots away, keep the best stuff.
How useful?
Try http://popapp.in and see how quickly
you can get a prototype running.
Use this to test navigation & meaning very
early on. Use the results to iterate through
a web app with full analytics. Use this to
understand if a feature is worth building.
Your Team
Draw on expertise around you to:
Structure Effective Hypotheses
Identify the right Risks in the Idea - Elephant in the room
Ensure Stakeholders stay accountable to the Results
Create the ability to Run Quick, Iterative Experiments
Stop the experiment when you’ve collected the full data
Hire statistical knowledge (the sample size you must use to find a significant result; see the sketch after this list)
Run at the same times - compare apples with apples
Create control variables across experiments
Integrate traditional testing where relevant
Build your individual expertise
Use clear assessment criteria (Desirability, Feasibility, Viability)
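As a rough guide to the sample-size point above, the standard approximation for comparing two conversion rates at 95% confidence and 80% power can be sketched as follows; the baseline and target rates are hypothetical:

```python
from math import ceil

def sample_size_per_variant(p_base: float, p_target: float,
                            z_alpha: float = 1.96, z_power: float = 0.84) -> int:
    """Approximate visitors needed per variant to detect p_base -> p_target
    at 95% confidence and 80% power (z values are standard normal quantiles)."""
    variance = p_base * (1 - p_base) + p_target * (1 - p_target)
    effect = (p_target - p_base) ** 2
    return ceil((z_alpha + z_power) ** 2 * variance / effect)

# Hypothetical: to see a lift from a 5% to a 7% sign-up rate you need roughly...
print(sample_size_per_variant(0.05, 0.07), "visitors per variant")
```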
A TEAM
MARKETING PROFESSIONALS
DESIGNERS & STRATEGISTS
TECHNOLOGY SPECIALISTS
DEVELOPERS
SUBJECT MATTER EXPERTS
STAKEHOLDERS
Closing
After each experiment, there are always 4 possible outcomes:
1. GO - You are confident that your hypothesis is valid and have the evidence to show it.
2. Confirm - You are confident that your hypothesis is correct but you need more evidence to support it.
3. Pivot - You believe in the vision but you need to re-visit the means & path to get to the vision.
4. STOP - You no longer believe that this offer is right for you to pursue at this time.
Experiment Design
Key Takeaways
Experiments must be:
• KPI driven
• Simple
• As close to reality as possible
• Prioritized
• Statistically significant
• Allowed to fail!
Your approach & mindset
Build the whole journey.
Speed is your ally. Perfection comes later.
Your first experiment is a rehearsal.
Fail fast, learn fast.
The only
way to
proceed
is to get
your feet
wet
Thank you
