9. We all care about Growth!
What we saw: Release/Launch. "Dammit, let's ship this!"
Acquisition, Activation, Retention, Revenue
We all care about Growth again!
10. First attempt (Product)
PM/PMM: "Any ETA on the new instrumentation?"
EM: "We have these features for the June release first. This will have to go on the backlog."
38. Second attempt (Growth vs. Product)
Growth PM: "We have this experiment we would like to run!"
Growth Engineer: "We have code ready we need to check in."
Product PM: "Great! Do you have resources to make it happen?"
Product Engineer: "Who are you? Let me see your code."
39. "Why are you in my code base?"
"Why do you get to take shortcuts?"
"We are already planning on doing this!"
"That's not how we work!"
49. Go back to the data
Focus on impact and ROI (ICE score: Impact, Confidence, Ease)
If successful, what's the projected impact?
Conversion, open rates, and notifications are good candidates
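The ICE prioritization mentioned above is usually just a rating exercise: score each candidate experiment 1-10 on Impact, Confidence, and Ease, then sort the pipeline by the combined score. A minimal sketch, with illustrative names and numbers (not from the talk):

```python
from dataclasses import dataclass

@dataclass
class ExperimentIdea:
    name: str
    impact: int      # 1-10: projected effect on the target metric
    confidence: int  # 1-10: how sure we are the effect is real
    ease: int        # 1-10: how cheap it is to build and ship

    @property
    def ice(self) -> float:
        # Average of the three ratings; some teams multiply instead.
        return (self.impact + self.confidence + self.ease) / 3

ideas = [
    ExperimentIdea("Onboarding email nudge", impact=7, confidence=6, ease=9),
    ExperimentIdea("Redesign pricing page", impact=9, confidence=4, ease=3),
]

# Highest ICE score first: that is the next experiment to run.
for idea in sorted(ideas, key=lambda i: i.ice, reverse=True):
    print(f"{idea.name}: ICE = {idea.ice:.1f}")
```

The scores are subjective, which is the point: ICE forces a quick, comparable estimate before any engineering time is spent.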
52. Changes need to be made inside the product
Don't pour more water into a leaky bucket
Improving Activation reduces CAC (Cost of Acquisition)
Better Activation drives retention and, ultimately, conversion to paid
When the time is right, reinvest the money saved in Acquisition
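The CAC claim above is plain arithmetic: if acquisition spend is fixed, every extra point of activation lowers the effective cost per *activated* user. A worked example with hypothetical numbers:

```python
# Hypothetical numbers to illustrate the CAC claim above.
spend = 10_000          # monthly acquisition spend, USD
signups = 2_000         # users acquired that month
cac = spend / signups   # $5.00 per raw signup

for activation_rate in (0.20, 0.30):
    activated = signups * activation_rate
    cac_activated = spend / activated
    print(f"activation {activation_rate:.0%}: ${cac_activated:.2f} per activated user")

# Raising activation from 20% to 30% cuts the effective cost per
# activated user from $25.00 to $16.67 with no extra ad spend.
```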
55. The Growth team now needs to focus on the entire funnel (ARRR)
Acquisition, Retention, Referral, Revenue
Split into mini-squads focused on each lever
A 90-day sprint focused on one metric to move
66. Experiment Pipeline (lever, hypothesis, ICE prioritization)
→ Experiment Plan (control vs. variant, baseline, metric)
→ Backlog (sprint planning, link to experiment)
→ Ship Meeting (stakeholder signoff)
→ Ship (soft launched)
→ Experiment Results Review (what did we learn)
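The pipeline stages above form a strict sequence, which makes them easy to model as a state machine: an experiment record advances one stage at a time and keeps its history. A sketch, with illustrative names (not the team's actual tooling):

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Stage(Enum):
    # The six stages of the pipeline, in order.
    PIPELINE = auto()        # lever, hypothesis, ICE prioritization
    PLAN = auto()            # control vs. variant, baseline, metric
    BACKLOG = auto()         # sprint planning, link to experiment
    SHIP_MEETING = auto()    # stakeholder signoff
    SHIPPED = auto()         # soft launched
    RESULTS_REVIEW = auto()  # what did we learn

@dataclass
class Experiment:
    name: str
    hypothesis: str
    stage: Stage = Stage.PIPELINE
    history: list = field(default_factory=list)

    def advance(self) -> None:
        # Move to the next stage in order; stop once the review is done.
        order = list(Stage)
        i = order.index(self.stage)
        if i == len(order) - 1:
            raise ValueError("experiment already reviewed")
        self.history.append(self.stage)
        self.stage = order[i + 1]

exp = Experiment("Insiders recruitment",
                 "Elevating active users boosts engagement")
while exp.stage is not Stage.RESULTS_REVIEW:
    exp.advance()
print(exp.stage)  # Stage.RESULTS_REVIEW
```

Forcing every experiment through the same stages is what makes the ship meeting and results review non-optional: nothing reaches "shipped" without a plan and a signoff on record.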
68. iOS: Spark Insiders Recruitment
Background:
We have a secret Facebook Group called Spark Insiders. Recruiting users has been a challenge, and most importantly recruiting truly active users.
Hypothesis:
We believe that by elevating our most active users to Spark Insiders we can build a system that recruits users automatically. This will result in higher-quality users with higher engagement in the FB group. We also believe that making these users Insiders will boost their return use, and that we can learn the traits of an engaged user.
KPIs: # of new Spark Insiders, group activity/engagement
75. iOS: Spark Insiders - Impact on shares, last 60 days
Control: 0.76 shares per user/day | Treatment: 0.97 shares per user/day (27% increase)
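The 27% figure is the relative lift of the treatment group over control, which the slide rounds down slightly:

```python
# Relative lift behind the "27% increase" result above.
control = 0.76    # shares per user/day, control group
treatment = 0.97  # shares per user/day, treatment group

lift = (treatment - control) / control
print(f"{lift:.1%}")  # 27.6%
```

A point estimate like this says nothing about statistical significance on its own; the sample sizes behind each rate are what make the comparison trustworthy.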
76. Understanding power users
What features do they use in the first week?
How often do they open the app?
What time of day do they use the app and on which days?
What sources were they acquired from?
Was it an ad, a promotional email to the chain’s customer base, or some other place?
What devices do they use?
What is their demographic background, including age, income, and more?
What other apps do they use?
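Questions like these typically turn into a cohort query over an event log. A sketch of the first two (first-week features used, distinct active days), assuming a hypothetical `events` table with `user_id`, `day`, and `feature` columns:

```python
import pandas as pd

# Hypothetical first-week event log: one row per feature use.
events = pd.DataFrame({
    "user_id": ["a", "a", "a", "b", "b", "c"],
    "day":     [0, 1, 2, 0, 6, 0],  # days since signup
    "feature": ["edit", "share", "share", "edit", "edit", "edit"],
})

first_week = events[events["day"] < 7]
per_user = first_week.groupby("user_id").agg(
    active_days=("day", "nunique"),   # distinct active days in week one
    features=("feature", "nunique"),  # breadth of feature usage
)

# One possible "power user" cut: active on 3+ days in week one.
power_users = per_user[per_user["active_days"] >= 3].index.tolist()
print(power_users)  # ['a']
```

The acquisition-source, device, and demographic questions join this cohort against other tables; the segmentation logic stays the same.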
77. Emails collected can be used as Facebook Lookalike Audiences
A Lookalike Audience is a way to reach new people on FB who are likely to be interested in your business because they're similar to your best existing customers.
Acquisition ads can be created targeting audiences "similar" to the power users you collected from within the app.
http://bit.ly/2shVlzf
78. Spark Web: Lapsed User (has not returned in the past 14 days)
Background:
Our current M1 retention is 25%. We have to learn from users who are not returning enough what we can do better for them.
Hypothesis:
We believe that we can retarget lapsed users and ask them what we can improve to make Spark better for them. We tried to do that manually; we now want to automate it to see if we can get feedback without any manual work.
KPIs: # of surveys completed, quality of feedback provided
80. Spark Web: Lapsed User (has not returned in the past 14 days): Results
Top 1: Users are happy with the product, just not using it at the cadence we expect.
Top 2: More product features.
Top 3: More control/flexibility/variety.
Top 4: Performance issues.
Top 5: Android.
81. Key quotes: Marketing
“There are tons of platforms and sites out there vying for attention. How can you stay top of mind for me? Online events like chats, featuring more user content and alerting them, etc. Haven't used you only because I forgot about you.”
82. Key quotes: Awareness
“Is this free if I own the Adobe Suite? Is there an educator discount? Do you love people that need your awesome program to make the world a better place? Do you offer affiliate programs so that I can work the cost of it off by making more awesome videos and telling people to use the program because it is the best. Bottomline I like the product just unable to afford it right now. Thank you”
83. Web: Lapsed User - Impact on shares, last 60 days
Control: 0.22 shares per user/day | Treatment: 0.22 shares per user/day (no impact)
85. Start with organic retention
It’s not about hacking
Hire the right people
Buy before build
A squad, not a team
Focus on one thing first
Move to product changes
Own the full funnel (ARRR)
Data does not tell you why
Have a process