Metrics - Lessons Learned - LeanCamp London 2012


The presentation Daniel Hill (@serenestudios) and I (@andreasklinger) gave at LeanCamp London 2012.


  1. METRICS: LESSONS LEARNED | LEANCAMP | @serenestudios | @andreasklinger
  2. Dan Hill, CTO of Crashpadder, @serenestudios | Andreas Klinger, CWTFO of LOOKK, @andreasklinger
  3. BEFORE PRODUCT MARKET FIT, FOCUS ON RETENTION
  4. BEFORE PMF, FOCUS ON RETENTION (actionable-metrics/) Summary: In Discovery/Validation, focus on Retention & Activation. In Validation/Growth, focus on Revenue and Referral.
  5. THE REASON: RETENTION = f(USER HAPPINESS)
  6. RETENTION = f(USER HAPPINESS) Actually, you don’t want to measure retention; you want to measure user happiness, especially before PMF. Retention is the best signal for user happiness. But if you can, narrow down your KPIs to show user happiness and the health of your product. IF YOU DO ONE THING ONLY: FOCUS ON USER HAPPINESS
  7. KPIs NEED TO BE YOUR HEALTH MONITOR, NOT YOUR PENIS PUMP
  8. KPIs NEED TO BE YOUR HEALTH MONITOR What numbers actually show you the (real) health of your product? What numbers could show user happiness? It’s not visits or one-time engagement. In Discovery/Validation, “visits” or even “engagement clicks” don’t mean much.
  9. IN DISCOVERY, MARKETING CAN HURT YOUR NUMBERS
  10. (WRONG) MARKETING STUNTS (CAN) HURT YOUR (ACTIONABLE) NUMBERS E.g. PR bursts or competitions: they created visits in Google Analytics, created user entries in the database, then disappeared. They lower your retention and activation (%) rates and defocus you from understanding your product. Example: LOOKK. We asked users to vote for designers in a competition. Effect: people voted for their friends and didn’t give a rat’s ass about LOOKK. Huge traffic spikes, huge “feelgood”, little learning and long-term effect. They just created “Dataschmutz”.
  11. DRILL DOWN TO REMOVE DATASCHMUTZ
  12. DRILL DOWN TO REMOVE DATASCHMUTZ* Which numbers really show the health of your business and are affected by your product changes, and how can you make them more “stable” against outside effects? Example: Base your activation and retention KPIs on registered users, not visitors, to remove traffic spikes. * Schmutz (noun, German/Yiddish) means “dirt”
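The “registered users, not visitors” idea can be sketched in a few lines of Python. This is a minimal illustration, not code from the talk; the event log, field names, and one-week retention window are all invented for the example:

```python
from datetime import date, timedelta

# Hypothetical event log; field names are illustrative, not from the talk.
events = [
    {"user": "a", "registered": True,  "day": date(2012, 5, 1)},
    {"user": "a", "registered": True,  "day": date(2012, 5, 9)},   # came back the next week
    {"user": "b", "registered": True,  "day": date(2012, 5, 2)},   # churned
    {"user": "c", "registered": False, "day": date(2012, 5, 1)},   # PR-burst visitor, never registered
    {"user": "d", "registered": False, "day": date(2012, 5, 1)},   # another drive-by visitor
]

def weekly_retention(events, week_start, registered_only):
    """Share of users first seen in [week_start, +7d) who return in the week after."""
    week_end = week_start + timedelta(days=7)
    next_end = week_start + timedelta(days=14)
    # First visit per user, optionally ignoring unregistered traffic.
    first_seen = {}
    for e in events:
        if registered_only and not e["registered"]:
            continue
        u = e["user"]
        if u not in first_seen or e["day"] < first_seen[u]:
            first_seen[u] = e["day"]
    cohort = {u for u, d in first_seen.items() if week_start <= d < week_end}
    returned = {e["user"] for e in events
                if e["user"] in cohort and week_end <= e["day"] < next_end}
    return len(returned) / len(cohort) if cohort else 0.0

print(weekly_retention(events, date(2012, 4, 30), registered_only=True))   # 0.5
print(weekly_retention(events, date(2012, 4, 30), registered_only=False))  # 0.25
```

Counting everyone, the two drive-by visitors drag retention down to 25%; counting only registered users, the same product shows 50%. That is the point of the slide: the second number moves with your product changes, the first moves with your traffic spikes.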
  13. THE METRICS FAIRY SAYS… METRICS NEED TO HURT
  14. METRICS NEED TO HURT If the metrics you focus on do not hurt you every time you look at them, and you are too embarrassed to show them around, they are either a) not narrowed down enough or b) nothing you should focus on.
  15. … AND “AARRR” SHALL BE YOUR SOUND OF PAIN Example: LOOKK. Nice to look at, useless to work with: ACQUISITION - user registered. ACTIVATION - user voted (e.g. 90%). RETENTION - user visited again. REFERRAL - … REVENUE - … We changed activation from “registered users that voted” to “registered users that voted for more than two designers”; everybody else is just voting for a friend. Now: AARRR = THE PAIN. THE PAIN = ACTIONABLE
  16. “AARRR” IS THE SOUND OF PAIN ACQUISITION: You need a way to reach the user (Dave is wrong, a visit is just fog), e.g. registered + confirmed email. ACTIVATION: The user has seen the real core of your product and used it (segment into user groups if multi-faceted). RETENTION: The user did an action that leads to value (if possible, close to money or the core action). REFERRAL: Not only shared, but invited users actually came. REVENUE: Money.
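The five stages above form a strict funnel: each user only counts in a stage if they passed every stage before it. A minimal sketch, with invented user records and field names standing in for the slide’s stricter definitions:

```python
# Hypothetical user records; every field name here is an assumption,
# mapping one flag to each AARRR stage from the slide.
users = [
    {"confirmed": True,  "core_used": True,  "came_back": True,  "invitee_came": True,  "paid": 9.0},
    {"confirmed": True,  "core_used": True,  "came_back": False, "invitee_came": False, "paid": 0.0},
    {"confirmed": True,  "core_used": False, "came_back": False, "invitee_came": False, "paid": 0.0},
    {"confirmed": False, "core_used": False, "came_back": False, "invitee_came": False, "paid": 0.0},
]

stages = [
    ("Acquisition", lambda u: u["confirmed"]),       # registered + confirmed email
    ("Activation",  lambda u: u["core_used"]),       # used the real core of the product
    ("Retention",   lambda u: u["came_back"]),       # value-creating action, again
    ("Referral",    lambda u: u["invitee_came"]),    # an invited user actually showed up
    ("Revenue",     lambda u: u["paid"] > 0),        # money
]

# Each stage filters the survivors of the previous one.
remaining = users
for name, passed in stages:
    remaining = [u for u in remaining if passed(u)]
    print(f"{name:<12} {len(remaining)}")
```

With this sample data the counts drop 3, 2, 1, 1, 1 down the funnel, which is exactly the shape you want on a dashboard: each number is a subset of the one above, so a dip always points at one specific stage.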
  17. TOO MANY METRICS ARE BAD FOR YOUR EYES FOCUS
  18. TOO MANY METRICS ARE BAD FOR YOUR EYES On your dashboards, only watch the metrics you are currently iterating on with your product. There are always numbers going up and down until you have a stable user flow; don’t go crazy because of that. Focus your product discovery around assumptions and align your metrics to that. Example: At LOOKK we removed revenue from our dashboards while focusing on newsfeed and community features.
  19. LESSONS LEARNED: TOOLS
  20. TOOLS TL;DR: All suck.
  21. TOOLS TL;DR: All suck… but are useful for certain parts:
  22. TOOLS TL;DR: All suck… but are useful for certain parts: Google Analytics = traffic analysis. Kissmetrics, Mixpanel = people analysis. Your own database = drilldown. Be consistent throughout all tools: when/where to measure a final goal, naming of goals, logging and naming of user types (e.g. User, Designer, Admin, Visitor).
  23. TOOLS: Google Analytics Good for referrals, optimizing online marketing, and a bird’s-eye view of the website. Sucks for funnel analysis (nothing ad-hoc) and for consistency (goals are easy to break). Don’t do funnel testing in GA; only track the last step. Dashboards rock, but don’t go too crazy with customization or it will break. Unfortunately single-player only (yet).
  24. TOOLS: KissMetrics Good for insights on conversions and quick reports on features/pages. Sucks (yet) at dashboards. Use it for funnel optimization, for quick flexible insights, and to find user groups. The “View this people” feature is genius. If you are a SaaS, use it for everything ;) Don’t expect too much, and also test Mixpanel.
  25. TOOLS: LOOKKButler SQL = quick, tables = cheap. You can drill down to your actionable KPIs, and build dashboards for team members. Sucks at the bird’s-eye view and is not ad-hoc flexible. Example questions: Who are the most popular designers if you only count people who vote for more than one designer? Who are our most influential users?
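The first question is exactly the kind of drilldown that plain SQL on your own database makes cheap. A sketch using SQLite from Python; the schema, table name, and sample votes are all invented for illustration, not LOOKK’s actual data model:

```python
import sqlite3

# In-memory database with a hypothetical votes table: who voted for which designer.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE votes (voter TEXT, designer TEXT)")
db.executemany("INSERT INTO votes VALUES (?, ?)", [
    ("u1", "alice"), ("u1", "bob"),    # u1 voted for two designers -> counts
    ("u2", "alice"), ("u2", "carol"),  # u2 counts too
    ("u3", "bob"),                     # u3 only voted for a friend -> ignored
    ("u4", "bob"),                     # same for u4
])

# Most popular designers, counting only voters who voted
# for more than one designer (the Dataschmutz filter).
query = """
SELECT designer, COUNT(*) AS real_votes
FROM votes
WHERE voter IN (
    SELECT voter FROM votes
    GROUP BY voter
    HAVING COUNT(DISTINCT designer) > 1
)
GROUP BY designer
ORDER BY real_votes DESC
"""
for designer, n in db.execute(query):
    print(designer, n)
```

On this sample data bob leads the raw vote count (three votes), but once single-designer voters are filtered out, alice leads with two “real” votes, which is the whole point of drilling down past the vanity number.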
  26. TOOLS: LOOKKButler Team dashboards are easy to do.