
How Spotify Uses Big Data For Fast Product Iterations | Wouter de Bie - Spotify

Presented during Social Media Week Rotterdam, September 26, 2014.
In this talk, Wouter de Bie looks at how Spotify uses data and Big Data technologies to iterate quickly on the Spotify product. Some of the questions he’ll try to answer are “Why is fast product iteration important for us?”, “How does data tie into this?” and “What is it we do to achieve this?”

Published in: Data & Analytics

How Spotify Uses Big Data For Fast Product Iterations | Wouter de Bie - Spotify

  1. Big Data for fast product iterations to drive user growth. Wouter de Bie, Big Data Architect (@xinit / wouter@spotify.com). September 26, 2014.
  2–4. Big Data for fast product iterations to drive user growth (deck title only)
  5. Big Data • 40 million Monthly Active Users • 20+ million tracks • 57 countries • 2 TB of compressed data from users per day • 70 TB of data generated in Hadoop each day • 700-node Hadoop cluster • 28 PB of storage • 8,500–12,000 jobs per day
  6. Big Data for fast product iterations to drive user growth
  7. Why fast iteration? • We were the first ones to do free streaming • Not any longer… • We need to build the best product • But nobody has done this before… What does “best” mean?
  8. How do we develop our product? Think it • Build it • Ship it • Tweak it
  9. Support with data and analytics • Think it: understand consumers, behavior, opportunities, what to test • Build it: build prototypes, ensure the right metrics / tracking is in place • Ship it: manage roll-out, evaluate test groups, understand problems • Tweak it: understand what works, measure, evaluate, learn, optimize
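
As a loose illustration of the “ensure the right metrics / tracking is in place” step, here is a minimal sketch of what client-side event tracking can look like; the event names, fields, and the track_event helper are hypothetical, not Spotify's actual logging schema.

    # A minimal, hypothetical event logger: each user action becomes one JSON
    # line that could later be shipped to a log collector and aggregated in
    # Hadoop. Field names are illustrative, not Spotify's real schema.
    import json
    import time
    import uuid

    def track_event(user_id, event_name, properties=None):
        """Serialize one usage event as a JSON line."""
        return json.dumps({
            "event_id": str(uuid.uuid4()),
            "user_id": user_id,
            "event": event_name,
            "timestamp": int(time.time()),
            "properties": properties or {},
        })

    if __name__ == "__main__":
        # Example: a user finished the sign-up flow while in test variant B.
        print(track_event("user-123", "signup_completed", {"variant": "B"}))
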
  10. A/B testing • Select a test group of (< all) users • Select a control group (e.g. all remaining users) • Let the test group try a new version of a feature • Gather metrics about the test and control groups • Compare the groups and roll out the new feature if the test group performs better
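
The “select a test group” step is typically implemented by deterministically bucketing users, so the same user always sees the same variant. The talk doesn't show Spotify's implementation; the following is just a minimal sketch of that idea, with a made-up experiment name and a 10% test allocation chosen for the example.

    # Minimal sketch of deterministic A/B assignment: hash the user ID plus an
    # experiment name into 100 buckets so assignment is stable across sessions.
    # The experiment name and 10% allocation below are invented for illustration.
    import hashlib

    def assign_group(user_id, experiment="new_feature", test_pct=10):
        """Return 'test' or 'control' for a user, stable for a given experiment."""
        digest = hashlib.sha1(f"{experiment}:{user_id}".encode()).hexdigest()
        bucket = int(digest, 16) % 100
        return "test" if bucket < test_pct else "control"

    if __name__ == "__main__":
        for uid in ("user-1", "user-2", "user-3"):
            print(uid, assign_group(uid))
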
  11. What did we find?
  12. Ad frequency: Song Song Song Song Ads Song Song Song Song vs. Song Ad Song Song Ad Song Song Song Ad Song Song
  13. Result: no difference
  14. Personalized email subjects
  15. Result: 200% higher CTR
  16. Radio
  17. Result: constantly improving radio experience
  18. Sign-up flow (sign-up button)
  19. Layout of sign-up button: 100% of users split into Group A, control (50% of users) and Group B, test (50% of users); the performance-over-time chart (“Download”, “Okay, listen to music”) shows a massive boost of ca. 3x for the test group
  20. Result: 3x more sign-ups!
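
To make the “compare the groups” step from the A/B-testing slide concrete, here is a rough sketch of how a 3x difference in sign-up conversion between control (A) and test (B) could be checked; the user counts and conversion numbers below are invented for the example, not Spotify's figures.

    # Rough sketch: compare sign-up conversion in control (A) vs. test (B) with
    # a two-proportion z-test. All counts below are made up for illustration.
    import math

    def compare_conversion(conv_a, n_a, conv_b, n_b):
        """Return (relative uplift of B over A, z-score of the difference)."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        return p_b / p_a, (p_b - p_a) / se

    if __name__ == "__main__":
        uplift, z = compare_conversion(conv_a=1_000, n_a=50_000,
                                       conv_b=3_050, n_b=50_000)
        print(f"test signs up {uplift:.1f}x as often as control (z = {z:.1f})")
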
  21. “This sign-up flow will kill it!” • Auto-play music • Great cover art • Clear download instructions • Screenshot in the background
  22. Result
  23. Some history…
  24–29. (no slide text beyond the date footer)
  30. And not only for ourselves…
  31. Jay-Z
  32–34. Artist analytics
  35. Some words of wisdom…
  36. It’s a numbers game: only 10% of tests will lead to a change (Google, after 12,000 tests)
  37. “80% of the time, we are wrong about what consumers want.” Leverage your data!
  38. You’re the expert, but prepare for the truth
  39. Thank you!
