3. @andreasklinger
"Startup Founder"
"Product Guy"
What we will cover
- Why early stage metrics are different.
- Applicable methods & Lessons Learned.
(this is an excerpt of a 2h workshop - but with prettier slides ;) )
#scb13 - @andreasklinger
4. The Main Problem with Metrics in Early Stage:
- Product not ready or even wrong.
- Little to no useable data.
- Data points contradict each other.
- External traffic can easily mess up our insights.
- What is actionable?
- Are we on the "right" track?
6. Some startups have
ideas for a new product.
Looking for customers
to buy (or at least use) it.
Customers don't buy.
"early stage"
7. Product/Market Fit
(Chart: traction over time, with product/market fit marked.)
With early stage
I do not mean "X years".
I mean before product/market fit.
8. Product/market fit
Being in a good market
with a product that can satisfy
that market.
~ Marc Andreessen
9. Product/market fit
Being in a good market
with a product that can satisfy
that market.
~ Marc Andreessen
= People want your stuff.
11. Product/Market Fit
(Chart: traction over time; phases: Discovery, Validation, Efficiency, Scale.)
Steve Blank - Customer Development
12. Discovery: Find a product the market wants.
13. Discovery: Find a product the market wants.
Validation: Optimise the product for the market.
14. Discovery: Find a product the market wants. People in search of a new product start here.
Validation: Optimise the product for the market. Most clones start here.
15. Discovery + Validation: Product & Customer Development.
Efficiency + Scale: Scale Marketing & Operations.
16. Startups have phases
but they overlap.
17. 83% of all startups are in here (before product/market fit).
18. 83% of all startups are before product/market fit.
Most stuff we learn about web analytics is meant for the later phases (Efficiency/Scale).
20. What does this mean for my product?
Are we on the right track?
Web analytics is meant for channel (referral) optimization.
21. Use of Metrics in Early Stage
22. Use of Metrics in Early Stage
Focus on People
- Not Hits, Pageviews, Visits, Events
Validation of customer feedback
- saying vs doing
- e.g. did they really use the app?
- does the app do what they need it to?
Validation of internal opinions
- believing vs knowing
- e.g. "Our users need/are/do/try…"
Doublecheck + Falsify
23. Segment Users into Cohorts
Cohorts = Groups of people that share attributes.
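As a sketch of what a cohort split means in code: below, hypothetical users are grouped by signup month and a simple per-cohort retention share is computed (all data, and the choice of "signup month" as the shared attribute, are made up for illustration):

```python
from collections import defaultdict
from datetime import date

# Hypothetical users: (user_id, signup_date, dates they were active)
users = [
    (1, date(2013, 1, 5), [date(2013, 1, 6), date(2013, 2, 1)]),
    (2, date(2013, 1, 20), []),
    (3, date(2013, 2, 3), [date(2013, 2, 10)]),
]

# Cohort = group of people that share an attribute; here: signup month.
cohorts = defaultdict(list)
for user_id, signup, activity in users:
    cohorts[(signup.year, signup.month)].append((user_id, signup, activity))

# Per-cohort retention: share of users active at least once after signup.
for month, members in sorted(cohorts.items()):
    retained = sum(1 for _, signup, acts in members
                   if any(d > signup for d in acts))
    print(month, f"{retained / len(members):.0%}")
```

Comparing the same rate across cohorts (instead of one site-wide number) is what makes the metric actionable.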
26. Acquisition
Visit / Signup / etc
Activation
Use of core feature
Retention
Come + use again
Referral
Invite + Signup
Revenue
$$$ Earned
(c) Dave McClure
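The funnel above lends itself to simple stage-to-stage conversion rates. A minimal sketch (the stages are Dave McClure's AARRR; the counts are invented, not from the talk):

```python
# Hypothetical AARRR funnel counts for one week (made-up numbers).
funnel = [
    ("Acquisition", 5000),   # visits / signups
    ("Activation", 1500),    # used the core feature
    ("Retention", 600),      # came back and used it again
    ("Referral", 120),       # invited someone who signed up
    ("Revenue", 45),         # paid
]

# Each stage expressed as a conversion rate from the previous stage.
rates = {}
for (_, prev), (stage, count) in zip(funnel, funnel[1:]):
    rates[stage] = count / prev

for stage, rate in rates.items():
    print(f"{stage}: {rate:.0%}")
```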
28. Which metrics to focus on?
29. Short answer: Focus on Retention.
30. Long answer - it depends on two things:
- Phase of company
- Type of product (esp. Engine of Growth)
31. Long answer - it depends on two things:
Source: Lean Analytics Book - highly recommended
32. Short answer: Focus on Retention.
34. Because
Retention = f(user_happiness)
Crashpadder's Happiness Index
e.g. Weighted sum over core activities by hosts.
Cohorts by cities and time.
= Health/Happiness Dashboard
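A happiness index like that could be sketched as follows. The slide only says "weighted sum over core activities", so the activity names and weights below are invented for illustration:

```python
# Hypothetical core host activities and weights (not Crashpadder's real ones).
WEIGHTS = {"listing_updated": 1.0, "message_replied": 2.0, "booking_accepted": 5.0}

def happiness_score(activity_counts):
    """Weighted sum over a host's core activities."""
    return sum(WEIGHTS[activity] * count
               for activity, count in activity_counts.items())

# One host's activity counts for the period.
host = {"listing_updated": 3, "message_replied": 4, "booking_accepted": 1}
print(happiness_score(host))  # 3*1 + 4*2 + 1*5 = 16.0
```

Scoring each host this way, then averaging per cohort (e.g. by city and signup month), yields the health/happiness dashboard the slide describes.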
35. AARRR misses something.
And happiness is not everything.
36. CUSTOMER INTENT
Acquisition
Activation
Retention
Referral
Revenue
FULFILMENT OF CUSTOMER INTENT
38. Metrics are a horrible way to understand customer intent.
Customer intent = their "job to be done".
Products are bought because
they solve a "job to be done".
Learn about the Jobs to be Done framework.
Watch: http://bit.ly/cc-jtbd
39. Startups are obsessed with their solution
and ignore the customer's job/problem.
(Diagram: nested circles - Market > Job/Problem > Our Solution.)
41. Metrics are a horrible way to understand customer intent.
Great way: customer interviews.
But: we bias our people
when we ask them,
even if we try not to.
Reason: we believe our
own bullshit.
Watch: www.hackertalks.io
42. Metrics are a horrible way to understand customer intent.
OK way: smoke tests.
If interviews suggest
a new feature but you are
unsure about critical mass
(e.g. due to sample bias):
create smoke tests and
measure click conversion /
signups.
(Example smoke-test button: "Download Mobile Client")
Not for verification but falsification.
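A smoke test boils down to a click-conversion measurement against a floor you commit to in advance. A sketch (visitor counts and the threshold are made up; the point, per the slide, is falsification, not proof):

```python
# Hypothetical smoke-test log: the feature behind the button does not
# exist yet; a click only records intent.
shown, clicked = 1200, 18

conversion = clicked / shown
print(f"click conversion: {conversion:.1%}")

# Falsification floor decided before running the test (made-up value).
FALSIFICATION_FLOOR = 0.05
if conversion < FALSIFICATION_FLOOR:
    print("assumption falsified: not enough demand signal")
```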
45. Dig deeper - Good product centric KPIs:
Linked to assumptions of your product (validation/falsify)
Rate or Ratio (0.X or %)
Framework: AARRR
Comparable (To your history (or a/b). Forget the market)
Explainable (If you don't get it, it means nothing)
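"Rate or ratio" plus "comparable to your own history" can be sketched like this (the weekly activation rates are hypothetical):

```python
# A KPI as a ratio, compared against your own history - not the market.
history = [0.18, 0.21, 0.19, 0.24]   # activation rate, previous weeks
this_week = 0.31

baseline = sum(history) / len(history)
change = (this_week - baseline) / baseline
print(f"activation {this_week:.0%} vs own baseline {baseline:.1%} ({change:+.0%})")
```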
46. "Industry Standards"
Use industry averages as a reality check,
not as a benchmark.
- Usually very hard to get.
- Everyone defines stuff differently.
- You might end up with another business
model anyway.
- Compare yourself against your own history data.
47. Example Mobile App: Pusher2000
Trainer2peer pressure sport app (prelaunch "beta").
Revenue channel: trainers pay a monthly fee.
Two-sided => Segment AARRR for both sides (trainer/user).
Marketplace => Value = Transactions / Supplier.
Social software => DAU/MAU to see if activated users stay active.
Chicken/egg => You need a few very happy chickens for loads of eggs.
Week/week retention to see if a public launch makes sense.
Optimize retention: interviews with users that left.
Measure trainer happiness score.
Activated user: more than two training sessions.
Pushups / User / Week to see if the core assumption (people will do
more pushups) is valid.
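The DAU/MAU ratio mentioned above can be computed from a per-user activity log. A sketch with made-up data (DAU = users active on a given day, MAU = users active in the trailing 30 days):

```python
from datetime import date, timedelta

# Hypothetical activity log: user_id -> set of days they were active.
activity = {
    1: {date(2013, 9, d) for d in range(1, 29)},  # near-daily user
    2: {date(2013, 9, 3), date(2013, 9, 17)},     # occasional user
    3: {date(2013, 9, 10)},                        # one-off
}

def dau(day):
    return sum(1 for dates in activity.values() if day in dates)

def mau(day):
    window = {day - timedelta(days=i) for i in range(30)}
    return sum(1 for dates in activity.values() if dates & window)

day = date(2013, 9, 28)
print(f"DAU/MAU = {dau(day)}/{mau(day)} = {dau(day) / mau(day):.2f}")
```

A DAU/MAU near 1 means activated users come back almost daily; near 0 means they drift away between months.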
48. Dig Deeper - Dataschmutz
A layer of dirt obfuscating
your useable data.
Usually "wrong intent".
Usually our fault.
(~ sample noise we created ourselves)
49. Dataschmutz
A layer of dirt obfuscating
your useable data.
e.g. traffic spikes of the wrong
customer segment
(who have the wrong intent).
50. Dataschmutz example…
MySugr is praised as a "beautiful app".
=> Downloads.
=> Problem: not all are diabetic.
They focus on people who activated.
51. How to minimize the impact of Dataschmutz
Base your KPIs on wavebreakers.

Example (Birchbox):
WK | visitors | acquisition (visit) | activation (registration) | retention (twice a month) | referral (first photo share) | revenue (…)
 1 |  6000    | 66% / 4000          | 62.5%                      | 25%                       | 10%                          |
 2 | 25000    | 35% / 8750          | 65%                        | 23%                       | 9%                           |
 3 |  5000    | 70% / 3500          | 64%                        | 26%                       | 4%                           |
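Computing such a per-week funnel from raw counts is straightforward. A sketch (the counts are invented, loosely shaped like the Birchbox-style numbers; later stages hinge on earlier "wavebreakers", so one dirty traffic spike stays contained in its own week):

```python
# Weekly cohorts ("wavebreakers"): absolute counts per funnel stage.
weeks = {
    1: {"visitors": 6000,  "visit": 4000, "registration": 2500, "retained": 625},
    2: {"visitors": 25000, "visit": 8750, "registration": 5688, "retained": 1308},
}

for wk, c in weeks.items():
    acq = c["visit"] / c["visitors"]          # acquisition rate
    act = c["registration"] / c["visit"]      # activation rate
    ret = c["retained"] / c["registration"]   # retention rate
    print(f"WK{wk}: acquisition {acq:.0%}, activation {act:.1%}, retention {ret:.0%}")
```

Note how week 2's traffic spike (25000 visitors) tanks the acquisition rate but barely moves the downstream ratios - that is the Dataschmutz showing up where it belongs.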
52. Dataschmutz
Competitions create artificial incentive.
Competitions (before P/M fit)
are nothing but Teflon marketing.
People come. People leave.
Competition created "Dataschmutz":
"Would you use my app and might
win 1,000,000 USD?"
* Users had huge extra incentive.
* Marketing can hurt your numbers.
* While we decided on how to
relaunch we had dirty numbers.
53. Dig Deeper - Metrics need to hurt
54. Dig Deeper - Metrics need to hurt
If you are not ashamed of the KPIs in
your dashboard, then something is wrong.
Either you do not drill deep enough.
Or you focus on the wrong KPIs.
55. Dig Deeper - Metrics need to hurt
Example: Garmz/LOOKK
Great Numbers:
90% activation (activation = vote)
But they only voted for friends
instead of actually using the platform.
We drilled (not far) deeper:
Activation = Vote for 2 different designers. Boom. Pain.
56. User activation.
Some users are happy (power users)
Some never come again.
What differentiates them? It's their activities in their first 30 days.
How we think about Churn is wrong.
57. Example Twitter
How often did activated users
use Twitter in the first month:
7 times
What did they do?
Follow 20 people, followed
back by 10
Churn:
If Twitter doesn't get them to those 7 visits
in the first 30 days,
they will lose them forever.
It doesn't matter when a user
remembers to unsubscribe.
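The Twitter heuristic above (enough visits and follows within the first 30 days) could be encoded as a simple activation check; the thresholds and sample data here are illustrative, not Twitter's actual values:

```python
from datetime import date, timedelta

# Activation heuristic in the spirit of the slide: a user counts as
# activated if they visited >= 7 times and followed >= 20 people
# within 30 days of signup. Thresholds are illustrative.
def is_activated(signup, visit_dates, follows_in_first_month):
    first_month_visits = {d for d in visit_dates
                          if d <= signup + timedelta(days=30)}
    return len(first_month_visits) >= 7 and follows_in_first_month >= 20

signup = date(2013, 1, 1)
visits = [signup + timedelta(days=i * 3) for i in range(8)]  # 8 visits in 24 days
print(is_activated(signup, visits, follows_in_first_month=25))
```

Tracking this flag per signup cohort shows whether product changes actually move early activation, which is where churn is decided.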
58. Example Twitter
How did they get more people
to follow 30 people within
7 visits in the first 30 days?
They ran assumptions, created
features, and ran experiments!
Watch: http://www.youtube.com/watch?v=L2snRPbhsF0
61. Summary
- Use Metrics for Product and Customer Development.
- Use Cohorts.
- Use AARRR.
- Figure out customer intent through non-biasing interviews.
- Understand your type of product and its core drivers.
- Find KPIs that mean something to your specific product.
- Avoid Teflon marketing (e.g. campaigns pre-product).
- Filter Dataschmutz.
- Metrics need to hurt.
- Focus on the first 30 days of customer activation.
TL;DR: Use metrics to validate/doublecheck.
Use those insights when designing for/speaking to your customers.
62. Read on
Startup metrics for Pirates by Dave McClure
http://www.slideshare.net/dmc500hats/startup-metrics-for-pirates-long-version
Actionable Metrics by Ash Maurya
http://www.ashmaurya.com/2010/07/3-rules-to-actionable-metrics/
Data Science Secrets by DJ Patil - LeWeb London 2012
http://www.youtube.com/watch?v=L2snRPbhsF0
Twitter sign up process
http://www.lukew.com/ff/entry.asp?1128
Lean startup metrics - @stueccles
http://www.slideshare.net/stueccles/lean-startup-metrics
Cohorts in Google Analytics - @serenestudios
http://danhilltech.tumblr.com/post/12509218078/startups-hacking-a-cohort-analysis-with-google
Rob Fitzpatrick's Collection of best Custdev Videos - @robfitz
http://www.hackertalks.io
Lean Analytics Book
http://leananalyticsbook.com/introducing-lean-analytics/
Actionable Metrics - @lfittl
http://www.slideshare.net/lfittl/actionable-metrics-lean-startup-meetup-berlin
App Engagement Matrix - Flurry
http://blog.flurry.com/bid/90743/App-Engagement-The-Matrix-Reloaded
My Blog
http://www.klinger.io