
Intro to Data Analytics with Oscar's Director of Product

Vasudev Vadlamudi, Director of Product at Oscar, went over the key types of quantitative analysis that B2C product managers use on the job: funnels, cohorts, and a/b testing. For each one, he explained when and why it is used and walked through examples.


Intro to Data Analytics with Oscar's Director of Product

  1. 1. Intro to Data Analytics with Oscar’s Director of Product
  2. 2. FREE INVITE Join 10,000+ Product Managers on Slack
  3. 3. Product Management Courses
  4. 4. Coding for Managers Courses
  5. 5. Data Analytics for Managers Courses
  6. 6. Tweet to get a free ticket for our next Event! Include @productschool and #prodmgmt at the end of your tweet.
  7. 7. Vasudev Vadlamudi Tonight’s Speaker
  8. 8. Introduction to Product Analytics Vasu Vadlamudi June 22, 2017
  9. 9. Who is this guy?
  10. 10. Overview: Why does this matter? Types of product analytics. Infrastructure!
  11. 11. Common problems “We keep shipping features and things are going OK, but I’m not really sure we know if things are going well.” “We keep basing our decisions on gut / OPO / HIPPOs. People don’t really trust the PMs.” “My (CEO, co-founder, eng lead, designer, dog) wants us to be more data-driven. What does that even *mean*?” How do you generate insights, then turn them into action?
  12. 12. Common problems “We keep shipping features and things are going OK, but I’m not really sure we know if things are going well.” “We keep basing our decisions on gut / OPO / HIPPOs. People don’t really trust the PMs.” “My (CEO, co-founder, eng lead, designer, dog) wants us to be more data-driven. What does that even *mean*?” How do you generate insights, then turn them into action?
  13. 13. Why does this matter?
  14. 14. Why does this matter?
  15. 15. Why does this matter for product managers? As a PM, you’re expected to: make good decisions; measure success (and failure); convince others! At all levels of the product: components within a feature; individual features; product roadmap; product vision.
  16. 16. Lots of ways to collect data. How do we choose the right ones? ● product KPIs and trends ● feature a/b tests ● feature postmortem analysis ● web/email surveys ● focus groups ● interviews ● usability testing ● prototype/beta testing (in-house or with users) ● app feedback
  17. 17. Lots of ways to collect data. How do we choose the right ones? ● product KPIs and trends ● feature a/b tests ● feature postmortem analysis ● web/email surveys ● focus groups ● interviews ● usability testing ● prototype/beta testing (in-house or with users) ● app feedback (today’s focus: funnels, cohorts, and a/b tests)
  18. 18. Overview: Why does this matter? Types of product analytics. Infrastructure!
  19. 19. Funnel analysis. What is it? Identifying a desired metric; breaking up the steps leading up to that metric; calculating conversion rate from each step to the next. Useful for: thinking through steps; identifying opportunities; things like install and purchase flows.
  20. 20. Funnel case study: measuring a viral campaign You are a PM on a ridesharing app. You are launching an in-app referral campaign. What are the steps in your funnel, and what are the metrics?
  21. 21. Funnel case study: measuring a viral campaign (see the funnel-conversion sketch after the slide list)
      Step                                  Metric                              Conversion from previous step   Total
      Total users                           Install base                        ---                             1M
      User is in app today                  DAU                                 10%                             100K
      User completes trigger, sees prompt   Prompt viewers                      10%                             10K
      User sent                             Unique senders x sends per sender   10% x 3                         3K sends
      Recipient taps the link               Unique clickers                     5%                              150
      Recipient signs up                    Signups attributable to campaign    30%                             45/day
  22. 22. Funnel analysis: things to watch out for: thinking above and below the funnel; interactions between steps; overoptimization; what is “good”?
  23. 23. Cohort analysis. What is it? Identifying a key metric; breaking up your users into groups; comparing the performance of those groups against each other. Useful for: retention analysis; managing the user lifecycle; identifying breaks / bugs. (See the cohort retention sketch after the slide list.)
  24. 24. Cohort analysis case study: DAU code red! You launched your app 2 weeks ago. It was growing daily, but growth has flatlined. Everyone is freaking out. What do you do?
  25. 25. Cohort analysis case study: DAU code red! Lots of ways to figure this out, but let’s start by looking at your cohorts... What do you see?
  26. 26. Cohort analysis case study: DAU code red! Lots of ways to figure this out, but let’s start by looking at your cohorts... D1 retention trending down! Why? D7-D8 dropoff is very high. Why? What can we do?
  27. 27. Cohort analysis: things to watch out for: mix shifts; time to get data; what is “good”?
  28. 28. A/B testing. What is it? Launching two (or more) versions of an experience to compare performance vs. key metrics. Useful for: understanding response to new features; optimizing algorithms; optimizing feature marketing.
  29. 29. A/B case study: new user drip notifications You want to send new users push notifications during their first 7 days, but your lead engineer thinks it will be spammy. You want to balance your company’s growth needs with UX, and you also want to convince the engineering team this is worthwhile. What do you do?
  30. 30. A/B case study: new user drip notifications
      Timeline: Install, D1, D3, D5, D7
      Variant A (25% of new users): no notifications
      Variant A’ (25% of new users): no notifications
      Variant B (25% of new users): PN #2
      Variant C (25% of new users): PN #1, PN #2, PN #3
  31. 31. A/B case study: new user drip notifications (see the significance-test sketch after the slide list)
      Variant                                          D3 retention   D7 retention   D14 retention
      Variant A (25% of new users, control)            30.1%          18.1%          13.0%
      Variant A’ (25% of new users, control)           29.9%          18.1%          13.1%
      Variant B (25% of new users, “nudge”)            37.9%          20.4%          15.1%
      Variant C (25% of new users, “drip campaign”)    38.1%          22.0%          13.5%
  32. 32. A/B case study: feature prompt Control (original) Variant New feature XYZ is here! Do you want to try it out now? Try it Cancel New feature XYZ is here! Do you want to try it out now? Try it Cancel This is an example; no actual users were harmed in the making of this slide
  33. 33. A/B testing case study #2 ● CTR increased from 15% to >19% ● this is huge! we’re gonna be rich! ● where else can we apply this learning? Also an example A/B case study: feature prompt
  34. 34. A/B testing case study #2 Also an example A/B case study: feature prompt
  35. 35. A/B testing: things to watch out for: dev cost; sample size; user and QA complexity; what does this data really mean?
  36. 36. Overview: Why does this matter? Types of product analytics. Infrastructure!
  37. 37. Product analysis isn’t free. Even simple questions require previous investments. Q: How many daily active users do we have? A: It depends... How do we define “daily”, “active”, and “users”? How do we track sessions? Where does tracking happen? How do we know we can trust this data? Where do we send this data? (See the DAU sketch after the slide list.)
  38. 38. Key infrastructure: both technology AND process. Defining tracking taxonomy and data model; implementing tracking hooks + QA process; event-based vs page-based; logged-in vs not; defining KPIs; identifying and ingesting comps; data storage; monitoring / alerting. Product analytics is an upfront investment AND an ongoing tax on team velocity. Don’t overdo it! (See the tracking-taxonomy sketch after the slide list.)
  39. 39. How does analytics evolve with your product*? (*YMMV)
      Stages: Prelaunch, Just launched, Finding product/market fit, Growth, Optimization
      Key questions: Will this be good? What’s busted? What do users like? What’s generating business value? How do we scale sustainably? How do we maximize returns everywhere?
      What’s tracked: n/a +big 3 metrics +key funnels +engagement metrics +core funnels +user acq funnels +retention +a/b test results +ALL THE TESTS
      Data infrastructure: n/a database MVP for alerting +some a/b test infra +data warehouse(s) +full monitoring +full a/b test infra
      Finding insights: qualitative, fighting fires (ad-hoc), +define analysis upfront
      Sharing insights: ad hoc, +regular review process
  40. 40. Example tracking taxonomy
  41. 41. What’s a data warehouse?
  42. 42. Questions?
  43. 43. Part-time Product Management Courses in New York
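
To make the funnel math from slides 19-21 concrete, here is a minimal Python sketch that reproduces the referral-campaign numbers. The step list comes from the slide; the funnel_conversion helper and the printed layout are illustrative, not part of the deck.

```python
# Funnel-analysis sketch for the referral campaign on slides 19-21.
# Step counts mirror the slide; the helper itself is illustrative.

funnel_steps = [
    ("Install base", 1_000_000),
    ("DAU", 100_000),
    ("Prompt viewers", 10_000),
    ("Sends (1K senders x 3 each)", 3_000),
    ("Unique clickers", 150),
    ("Signups from campaign", 45),
]

def funnel_conversion(steps):
    """Print step-to-step and cumulative conversion rates (top step is trivially 100%)."""
    top = steps[0][1]
    prev = top
    for name, count in steps:
        step_rate = count / prev
        cumulative = count / top
        print(f"{name:<30} {count:>10,}   step: {step_rate:7.1%}   cumulative: {cumulative:9.4%}")
        prev = count

funnel_conversion(funnel_steps)
```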
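
For the cohort case study on slides 23-26, the retention grid the slides describe can be computed with a short pandas sketch. The column names (user_id, install_date, activity_date) are assumptions; the deck's own worked example lives in the spreadsheet linked in the editor's notes.

```python
# Cohort retention sketch (slides 23-26): group users by install date and
# compute day-N retention. Column names are assumptions, not from the deck.
import pandas as pd

def retention_table(events: pd.DataFrame) -> pd.DataFrame:
    """events: one row per (user_id, install_date, activity_date) observation."""
    events = events.copy()
    events["day_n"] = (events["activity_date"] - events["install_date"]).dt.days

    cohort_sizes = events.groupby("install_date")["user_id"].nunique()
    active = (
        events.groupby(["install_date", "day_n"])["user_id"]
        .nunique()
        .unstack(fill_value=0)
    )
    # Divide each cohort row by its size to get D0, D1, D2, ... retention rates.
    return active.div(cohort_sizes, axis=0)

# Usage with toy data:
# df = pd.DataFrame({
#     "user_id": [1, 1, 2, 3],
#     "install_date": pd.to_datetime(["2017-06-01"] * 3 + ["2017-06-02"]),
#     "activity_date": pd.to_datetime(["2017-06-01", "2017-06-02", "2017-06-01", "2017-06-02"]),
# })
# print(retention_table(df))
```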
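
For the drip-notification test on slides 30-31, the editor's notes warn that the variance is small and the sample may be tiny, so a significance check matters before acting on the differences. Below is a minimal two-proportion z-test sketch applied to the D7 retention rates; the per-variant sample size is a hypothetical assumption, not a figure from the deck.

```python
# Two-proportion z-test sketch for the A/B results on slide 31.
# The per-variant sample size (n) below is an assumed, illustrative number.
from math import sqrt
from statistics import NormalDist

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Return (z, two-sided p-value) for the difference between two proportions."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

n = 25_000  # assumed users per variant
# D7 retention from slide 31: control (Variant A) 18.1% vs drip campaign (Variant C) 22.0%
z, p = two_proportion_z(int(0.181 * n), n, int(0.220 * n), n)
print(f"z = {z:.2f}, p = {p:.4f}")
```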
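
Slide 37's point that even "how many daily active users do we have?" depends on earlier definitions can be made explicit in code. This is a sketch under assumed definitions (Pacific-time calendar days, a hand-picked set of qualifying actions, logged-in user ids), not how Oscar actually computes DAU.

```python
# DAU sketch for slide 37: every choice below (timezone, qualifying actions,
# identity field) is a definition you have to make explicitly, and all of
# them here are assumptions for illustration.
from datetime import datetime
from zoneinfo import ZoneInfo

QUALIFYING_ACTIONS = {"open_app", "view_plan", "send_message"}  # assumed "active" actions
TZ = ZoneInfo("America/Los_Angeles")                            # "daily" = Pacific calendar day

def dau(events, day):
    """events: iterable of dicts with 'user_id', 'action', 'ts' (an aware UTC datetime)."""
    active_users = set()
    for e in events:
        local_day = e["ts"].astimezone(TZ).date()
        if local_day == day and e["action"] in QUALIFYING_ACTIONS:
            active_users.add(e["user_id"])  # "users" = unique logged-in user ids
    return len(active_users)

# Usage:
# events = [{"user_id": "u1", "action": "open_app",
#            "ts": datetime(2017, 6, 22, 15, 0, tzinfo=ZoneInfo("UTC"))}]
# print(dau(events, datetime(2017, 6, 22).date()))
```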
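
Slides 38 and 40 call for defining a tracking taxonomy and data model up front, before instrumenting. Here is a minimal sketch of what such a taxonomy and tracking hook could look like; every event name, property, and the track function itself are illustrative assumptions, not Oscar's actual schema.

```python
# Tracking-taxonomy sketch (slides 38 and 40): agree on event names and their
# required properties first, so the data stays queryable later. All names here
# are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timezone

# The taxonomy: event name -> required properties.
EVENT_SCHEMA = {
    "referral_prompt_viewed": {"user_id", "surface"},
    "referral_sent":          {"user_id", "channel", "recipient_count"},
    "signup_completed":       {"user_id", "attribution_source"},
}

@dataclass
class Event:
    name: str
    properties: dict
    ts: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def track(name: str, **properties) -> Event:
    """Validate an event against the taxonomy before sending it downstream."""
    required = EVENT_SCHEMA.get(name)
    if required is None:
        raise ValueError(f"Unknown event: {name}")
    missing = required - set(properties)
    if missing:
        raise ValueError(f"{name} is missing properties: {missing}")
    return Event(name, properties)  # a real hook would enqueue this to your pipeline

# Usage:
# track("referral_sent", user_id="u1", channel="sms", recipient_count=3)
```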

Editor's Notes

  • Thanks for coming!





  • We currently have a slack community of 10,000
  • For those of you new here to our meetup, Product School offers 8-week, part-time courses on how to be a product manager.
  • We also offer a program called Coding for Managers. This course is for professionals without a technical background who want to learn how to code, build a better rapport with engineering teams, and increase visibility with hiring managers.
  • If you'd like a free ticket to our next event, be sure to tweet a picture of your presenter using @productschool or check in to the event on Facebook. Following the presentation, please come show me and I'll take your email to send your free ticket.
  • Etugo Nwokah
  • Feature level: build-measure-learn (small feature / huge feature level => make better features)
  • scientific method (rethink your mental model of the world ex. retrain your userbrain)
    Ok, so why does this matter FOR ME?


  • Feature level: build-measure-learn (small feature / huge feature level => make better features)
    But also at roadmap level (measurement allows you to create EOs => make better ROI trade-offs)
    side impact: fastest, easiest, lowest-drama way to resolve disagreements (important as a PM)
  • Funnels
    Cohort
    Ab tests
  • Ask audience. Whiteboard them out.
  • -how many of you missed the first step?

    -can someone see prompt more than once per day?
    -what if 2 senders send to same person? Should we add “unique recipients”?

    Where/how could we improve this?

    What is “good enough”?
  • Above the funnel: if we really wanted to, we could extend all the way up to “# of sentient beings in the universe”... what’s the right stopping point?
  • breaks/bugs includes changes in your userbase
  • Explain what this is.

    https://docs.google.com/spreadsheets/d/1UGSWT4iDj1A5eYxNCf78Bgvl1qhMkF_CCskbEmnmcqI/edit#gid=1372512304

  • https://docs.google.com/spreadsheets/d/1UGSWT4iDj1A5eYxNCf78Bgvl1qhMkF_CCskbEmnmcqI/edit#gid=1372512304

    Installs are going up slowly. Good.
    However, D1 retention is going down over time (started at 60%, now 30%): the cohorts are getting worse.
    And from D7 to D8 there is a sharp dropoff of almost 20 points (2,000 bps). Why? Maybe retention was pumped up by a D1-D7 email/PN drip campaign, but after that we stop? What do we do?
  • Again, “what is good”?
    Good: 40/20/10 rule for d1/d7/d30 retention in a freemium app. How else could you get comps?
  • Ask to design an A/B test
  • Small variance. You have tiny sample size
    When you send PNs, they work to bring people in! Both notification variants (the “nudge” and the “drip campaign”) are better on the day the PN is sent
    But over time the drip campaign performs worse. Why?
    maybe the drip campaign actually drove people to come in and turn off PN permissions or even come in and delete the app, so measurement AFTER the campaign shows it’s worse!
    We should dig into the UX of the notifications to see if there are improvements to be made.
  • -design your experiments to validate not just business hypotheses, but also user ones!
  • -design your experiments to validate not just business hypotheses, but also user ones!
    -more broadly, this is a combination of art + tactics
  • Daily: midnight -> 11:59:59 PM Pacific? GMT? Will some of these be rolling, so we’ll want to analyze on a 24h rolling period?
    Active: logged in? logged in and did activities X, Y or Z but not A, B, and C? (true story - step tracking example).
    Users: unique users only? Logged-in only? If we don’t have logged-in tracking and rely on fingerprinting, what if we think we’ve seen this person before but we’re not sure?

    Where does tracking happen: server? Client? Both? ...and which one do we choose to believe if they vary?
    How do we know we can trust this tracking: both when we built it, AND also today when we’re staring at it
    Where do we send this data: Do we just store it in a table directly or do we pre-process the data a bit? Additionally, should we store it in a data warehouse?
    Access = complex SQL query; simple SQL query to grab the data => analyze in Excel; build a simple Splunk dashboard; send it to a visualization/exploration tool like Mixpanel or Tableau or Looker or Periscope; build a custom dashboard

    ...and this doesn’t even COVER whether this is good, bad, what direction is it going in, why, how fast…
    Also doesn’t cover what happens when your app, tracking, or definitions change… how do you compare today’s DAU to last week’s/month’s/year’s DAU on the same day?!?
  • There’s a LOT of A/B infra you could build: sorting, ramp up/down, blacklist/whitelist, testing in staging, analytics...
  • “Just launched” big 3 metrics ex = installs, dau, rev/dau

    1. This is very personal. YMMV. For example, if your company is really advanced (or has several product lines already) you could re-use their infrastructure and pull some of the advanced stuff into earlier stages. Zynga was great at this.

    2. Ideally you’ll think through how not to hammer your production databases when doing analysis, ex. by having a second read-only database for analytics queries. So if you crash that, you haven’t locked your production tables...

    3. Why are the colors out of order?

    Colors roughly correspond to my emotional state during a launch. Prelaunch yellow = high energy and exploratory. Growth and optimization green = a lot of the crazy risk is gone, now it’s more about executing. Shit’s on fire during launch, there’s a reason that one’s red...
  • -simplifies queries for novice SQL users (many PMs!)
    -increases query speed
    -allows for more data exploration
  • Thanks for coming!




