-
1.
Show me the MVP!
Natalie Warnert – #dpm2016
@nataliewarnert
-
3.
Natalie Warnert
Agile Coach & Trainer
www.nataliewarnert.com
@nataliewarnert
-
4.
“A Minimum Viable Product is that version of a new product which allows a team to collect the maximum amount of validated learning about customers with the least effort.”
– Eric Ries, Lean Startup
-
5.
THE MVP
What is a Minimum Viable Product?
• Building just enough to learn and validate a hypothesis
• Learning, not optimizing
• Find a plan that works before running out of resources
• Provide enough value to justify charging
-
6.
LEAN USER EXPERIENCE
What is Lean UX?
• Interaction design
• Constant measurement and learning loops
• Business, development, customer, and UX unification
• Less focused on deliverables, more on data
-
7.
THE POINT
*The difference between BUILDING the right thing and LEARNING the right thing
-
10.
WHERE TO START
Problem/Solution fit – Do I have a problem worth solving?
Product/Market fit – Have I built something people want?
Scale – How do I accelerate growth and maximize learning?
Ideas are cheap! Acting on them is expensive $$
Learning over growth – don’t scale
-
11.
“Customers don’t care about your solution. They care about their problems.”
– Dave McClure, 500 Startups
-
14.
UNDERSTAND THE PROBLEM
• What is the customer’s problem?
• How does it fit into the business model?
What is the customer hiring your product to do?
-
16.
DEFINE THE SOLUTION
Smallest possible solution to speed up learning
Build only what is needed (MVP)
Pick bold outcomes to validate learning
Business outcomes over solution
More relevant product recommendation + in-cart experience = increased
17.
@nataliewarnert
#dpm2016 most Plan A’s
don’t work
-
19.
VALIDATE QUALITATIVELY
What are our customers doing?
Continuous feedback loop with customers
Get out of the building – or get customers in the building!
-
20.
“A startup can focus on only one metric. So you have to decide what that is and ignore everything else.”
– Noah Kagan, AppSumo
-
22.
VALIDATE QUANTITATIVELY
Pirate metrics (AARRR):
Acquisition
Activation
Retention
Revenue
Referral
What is your one metric to rule them all? What are you trying to validate (hypothesis)?
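The pirate-metrics funnel above can be read as a chain of stage-to-stage conversion rates, which is how the "one metric" is usually picked. A minimal sketch; the stage counts are illustrative placeholders, not data from the talk:

```python
# Pirate metrics (AARRR): compute stage-to-stage conversion through the funnel.
# Counts are illustrative placeholders, not real data from the talk.
funnel = [
    ("Acquisition", 10000),  # visitors who arrived
    ("Activation",   2500),  # had a good first experience
    ("Retention",    1200),  # came back
    ("Revenue",       300),  # paid
    ("Referral",       90),  # brought someone else
]

# Pair each stage with the one before it and report the drop-off.
for (stage, count), (_, prev) in zip(funnel[1:], funnel):
    rate = count / prev
    print(f"{stage:<11} {count:>6}  ({rate:.1%} of previous stage)")
```

Whichever stage your hypothesis targets becomes the one metric to watch; the others stay as context.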
-
23.
BUT RISKS!
Product risk – Getting it right
Customer risk – Right path
Market risk – Viable business
-
24.
BALANCING NEEDS
Run a Business vs. Satisfy the Customer
-
26.
BUILD MODEL
Build right thing
Build thing right
Build it fast
-
27.
LEARNING MODEL
Build right thing → Learning
Build thing right → Focus
Build it fast → Speed
29.
@nataliewarnert
#dpm2016
WHERE IS THE LEARNING?
Requiremen
ts
Dev QA Releas
eLittle learning
some learning Most learning
-
30.
“You stand to learn the most when the probability of the expected outcome is 50%; that is, when you don’t know what to expect.”
– Lean Analytics
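The Lean Analytics point has an information-theoretic reading: the expected information from a binary experiment (its Shannon entropy) peaks when the outcome probability is 50%, and drops to zero when you already know the answer. A small sketch to illustrate; this framing is an interpretation, not from the talk itself:

```python
import math

def bits_of_learning(p: float) -> float:
    """Shannon entropy (in bits) of a binary outcome with success probability p."""
    if p in (0.0, 1.0):
        return 0.0  # outcome is certain: nothing to learn
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# Entropy is highest at p = 0.5 and falls off toward certainty on either side.
for p in (0.05, 0.25, 0.5, 0.75, 0.95):
    print(f"p={p:.2f}: {bits_of_learning(p):.3f} bits")
```

An experiment you are sure will succeed (or fail) teaches you almost nothing; the 50/50 ones are the ones worth running.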
-
31.
SHOW ME THE MVP!
1. Declare assumptions and hypotheses
2. Create an MVP
3. Run an experiment to prove or disprove the hypothesis
4. Gather customer feedback (qual and quant) and research alternatives
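The "run an experiment to prove or disprove the hypothesis" step is often an A/B test on a single conversion metric. A minimal two-proportion z-test sketch, stdlib only; the counts and variant labels are hypothetical, not results from the talk:

```python
import math

def ab_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-proportion z-test: z-score against H0 'variant B converts the same as A'."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # conversion rate under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: control cart vs. in-cart recommendation experience.
z = ab_test(conv_a=200, n_a=10000, conv_b=260, n_b=10000)
print(f"z = {z:.2f}")  # |z| > 1.96 -> significant at the 5% level
```

This is where "is it statistically significant?" from the qualitative interviews gets a quantitative answer, once something real has shipped to measure.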
-
32.
WHAT DID WE LEARN?
Reduce scope
Shorten the cycle time to feedback
Get out of the deliverables business
-
33.
QUESTIONS?
Running Lean – Ash Maurya
Lean UX – Jeff Gothelf, Josh Seiden
TOM
3 min
Corrupted and confused with MMP
This is my definition
What are resources
It doesn’t mean half baked or buggy
What are table stakes?
6 min
Designing for the customer and with the customer – how they want to work, not how we want them to, hard shift to make, think testing a product you wrote
Deliverables – less over the wall and more collaborative
All of our traditional language is missing
8 min
9 min
This is where the investigation starts – don’t just keep the idea in your head but SHARE it… start with the problem
13 min
14 min
THIS COULD BE ANY RETAILER – ultimately bought, required, also viewed
BBY example here with CWBAB and it not working – started with customers wanting add on accessories
How that ties into the business model and problem of attach rates and conversion
Business problem and customer problem – our one hypothesis fits both
Halloween costume – snoopy book and not even the great pumpkin
So much data we don’t know what to do with it – so focused on this data but it was the wrong data
18 min
Bold outcomes – change something big over small. Page over button. Or in this case embedded in cart vs. just changing items
BBY example – we have our problem and our market fit. So what did we do with it.
Interruption of the data they are used to seeing
Besides CYP what else do we see in Carts? Buy it again…save for later...guests also bought – we have so much data we don’t know what to do with it.
Bad example - polaris and having what we were going to build so defined that it couldn’t pivot (large release one, op model/workstation) – total customization vs. Packages what the customer actually needed vs. What we thought they wanted SOLUTION and BDUF
What is the unique value proposition? This is somewhere in “solution” but we don’t know it...YET
25 min
Remember our plan A here was rich relevance data…
A – assumption that you know what the customer wants… and you know what happens when you assume…
walk the walls at polaris
Qualitatively – WHAT are they doing once we get them into cart – conversion from there is higher than browse…we know they are looking for more to buy...UX programs to see where hovers are, time spent, clicks, paper prototyping and contextual inquiry (modified)
When did we get out of the building? Went and did customer interviews and showed them what we were thinking (small margin) – is it statistically significant? No. Is it significant? Yes.
Also by walking through with everyone it’s easier to determine what is too high scope
30 min
Validate quantitatively – analytics and A/B testing –
How do we quantify what they are doing? Conversion rates, add rates, abandon rates…we wouldn’t have gotten here if we didn’t build something though. (large releases don’t work for this)
Validate quantitatively – analytics and A/B testing - one ring to rule them all
You can tell any story with numbers, so what are the important ones? This is why we also use qualitative. It tells more of the story
But we are addressing these risks up front by talking with customers, experimenting, and validating with data
Failing fast so oppty cost is less – talk about massive req and traditional stuff
35 min
So what did we learn????
So we have to shorten this cycle!
MVP != half-baked or buggy
Deliver enough value to justify charging (or running more experiments)
Prioritize hypotheses
Attach rate went up
Interaction in cart when the user has already showed more intent to buy
Language of CYP vs. required, vs. CWBAB vs. Suggested…
Try other products before we start to scale more – rich relevance data was expensive and didn’t want to “waste the money” – sunk cost