
Deep Learning for Recommender Systems


Talk with Yves Raimond at the GPU Technology Conference on March 28, 2018 in San Jose, CA.

Abstract:
In this talk, we will survey how Deep Learning methods can be applied to personalization and recommendations. We will cover why standard Deep Learning approaches don't perform better than typical collaborative filtering techniques. Then we will go over recently published research at the intersection of Deep Learning and recommender systems, looking at how these works integrate new types of data, explore new models, or change the recommendation problem statement. We will also highlight some of the ways that neural networks are used at Netflix and how we can use GPUs to train recommender systems. Finally, we will highlight promising new directions in this space.


Deep Learning for Recommender Systems

  1. Deep Learning for Recommender Systems. Justin Basilico & Yves Raimond, GPU Technology Conference, March 28, 2018. @JustinBasilico @moustaki
  2. The value of recommendations
     ● A few seconds to find something great to watch…
     ● Can only show a few titles
     ● Enjoyment directly impacts customer satisfaction
     ● Generates over $1B per year of Netflix revenue
     ● How? Personalize everything
  3. Deep learning for recommendations: a first try
  4. Traditional Recommendation Setup (figure: sparse binary users × items interaction matrix)
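To make that setup concrete, here is a minimal sketch (not from the slides) of the users × items interactions held as a sparse binary matrix; the indices and sizes are made up for illustration.

```python
# Toy users x items interaction matrix: 1 = the user played/liked the item.
import numpy as np
from scipy.sparse import csr_matrix

num_users, num_items = 5, 5
rows = [0, 0, 1, 2, 2, 3, 4]      # user indices (hypothetical data)
cols = [1, 3, 2, 0, 4, 3, 1]      # item indices
data = np.ones(len(rows))

R = csr_matrix((data, (rows, cols)), shape=(num_users, num_items))
print(R.toarray())                 # dense view of the interaction matrix
```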
  5. A Matrix Factorization view (figure: R ≈ U · V)
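A hedged sketch of the matrix factorization view: approximate the interaction matrix R with the product of two low-rank factor matrices U and V, here fitted with plain full-batch gradient descent on squared error. The rank, learning rate, and data are illustrative, not from the talk.

```python
# Low-rank factorization R ~ U @ V.T fitted by gradient descent on squared error.
import numpy as np

rng = np.random.default_rng(0)
num_users, num_items, k = 100, 200, 16
R = (rng.random((num_users, num_items)) < 0.05).astype(float)  # toy binary interactions

U = 0.1 * rng.standard_normal((num_users, k))
V = 0.1 * rng.standard_normal((num_items, k))

lr = 0.05
for _ in range(200):
    E = R - U @ V.T          # reconstruction error
    dU = -E @ V              # gradient of 0.5 * ||E||^2 w.r.t. U
    dV = -E.T @ U            # gradient w.r.t. V
    U -= lr * dU
    V -= lr * dV

print("reconstruction MSE:", np.mean((R - U @ V.T) ** 2))
```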
  6. A Feed-Forward Network view (figure: user and item embeddings U and V feeding a network)
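The same factorization read as a feed-forward network: embedding lookups for the user and the item followed by a dot product. This is a minimal PyTorch sketch under assumed sizes, not the slides' code.

```python
# Matrix factorization expressed as embeddings + dot product.
import torch
import torch.nn as nn

class DotProductModel(nn.Module):
    def __init__(self, num_users, num_items, dim=32):
        super().__init__()
        self.user_emb = nn.Embedding(num_users, dim)
        self.item_emb = nn.Embedding(num_items, dim)

    def forward(self, user_ids, item_ids):
        u = self.user_emb(user_ids)        # (batch, dim)
        v = self.item_emb(item_ids)        # (batch, dim)
        return (u * v).sum(dim=-1)         # predicted affinity per pair

model = DotProductModel(num_users=1000, num_items=5000)
scores = model(torch.tensor([0, 1]), torch.tensor([10, 42]))
```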
  7. A (deeper) feed-forward view (figure: U and V feeding a deeper network; mean squared loss?)
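And the deeper variant the slide title points at: concatenate the two embeddings, pass them through an MLP, and train against a mean-squared-error loss. Again a sketch with placeholder layer sizes and toy targets.

```python
# Deeper feed-forward recommender: embeddings -> MLP -> scalar score, MSE loss.
import torch
import torch.nn as nn

class DeepRecommender(nn.Module):
    def __init__(self, num_users, num_items, dim=32):
        super().__init__()
        self.user_emb = nn.Embedding(num_users, dim)
        self.item_emb = nn.Embedding(num_items, dim)
        self.mlp = nn.Sequential(
            nn.Linear(2 * dim, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, 1),
        )

    def forward(self, user_ids, item_ids):
        x = torch.cat([self.user_emb(user_ids), self.item_emb(item_ids)], dim=-1)
        return self.mlp(x).squeeze(-1)

model = DeepRecommender(num_users=1000, num_items=5000)
users = torch.randint(0, 1000, (256,))
items = torch.randint(0, 5000, (256,))
labels = torch.randint(0, 2, (256,)).float()   # toy binary play/no-play targets

loss = nn.MSELoss()(model(users, items), labels)
loss.backward()                                 # one training step, optimizer omitted
```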
  8. A quick & dirty experiment
  9. GPU vs. CPU
  10. What’s going on?
  11. Conclusion?
  12. Breaking the ‘traditional’ recsys setup
  13. Alternative data
  14. Content-based side information
  15. Metadata-based side information
  16. YouTube Recommendations
  17. Alternative models
  18. Restricted Boltzmann Machines
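As a reminder of how the RBM approach works mechanically, here is a rough sketch of a binary RBM updated with one step of contrastive divergence (CD-1), with visible units standing for items. It is a simplification for illustration, not the Netflix implementation.

```python
# One CD-1 update for a binary RBM over item "seen / not seen" vectors.
import torch

num_visible, num_hidden = 500, 100
W = 0.01 * torch.randn(num_visible, num_hidden)
b_v = torch.zeros(num_visible)
b_h = torch.zeros(num_hidden)
lr = 0.1

v0 = torch.bernoulli(torch.full((64, num_visible), 0.05))   # toy batch of user vectors

# positive phase
ph0 = torch.sigmoid(v0 @ W + b_h)
h0 = torch.bernoulli(ph0)
# negative phase: one Gibbs step back to the visible units and up again
pv1 = torch.sigmoid(h0 @ W.T + b_v)
v1 = torch.bernoulli(pv1)
ph1 = torch.sigmoid(v1 @ W + b_h)

# CD-1 parameter updates
W += lr * (v0.T @ ph0 - v1.T @ ph1) / v0.shape[0]
b_v += lr * (v0 - v1).mean(dim=0)
b_h += lr * (ph0 - ph1).mean(dim=0)
```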
  19. Auto-encoders
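A minimal autoencoder-for-recommendation sketch in the spirit of AutoRec-style models: reconstruct a user's item vector through a low-dimensional bottleneck and train on reconstruction error. Sizes and data are illustrative.

```python
# Autoencoder over per-user item vectors: encode to a bottleneck, decode back.
import torch
import torch.nn as nn

num_items = 500

autoencoder = nn.Sequential(
    nn.Linear(num_items, 128), nn.ReLU(),   # encoder
    nn.Linear(128, num_items),              # decoder: reconstructed scores per item
)

user_vectors = torch.bernoulli(torch.full((32, num_items), 0.05))  # toy play vectors
reconstruction = autoencoder(user_vectors)
loss = nn.MSELoss()(reconstruction, user_vectors)
loss.backward()
```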
  20. (*)2Vec: prod2vec (Skip-gram), user2vec (Continuous Bag of Words)
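A hedged sketch of the prod2vec idea: treat each user's sequence of item ids as a "sentence" and learn item embeddings with skip-gram word2vec. gensim is used only for brevity; the slides do not prescribe a library, and the session data is invented.

```python
# Skip-gram item embeddings ("prod2vec") from toy interaction sessions.
from gensim.models import Word2Vec

sessions = [
    ["item_12", "item_7", "item_99", "item_7"],
    ["item_3", "item_12", "item_41"],
    ["item_99", "item_41", "item_3", "item_12"],
]

model = Word2Vec(sentences=sessions, vector_size=32, window=3,
                 min_count=1, sg=1, epochs=20)   # sg=1 selects skip-gram
print(model.wv.most_similar("item_12", topn=3))  # items co-consumed with item_12
```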
  21. Wide + Deep models [Cheng et al., 2016]
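A simplified Wide & Deep sketch (after Cheng et al., 2016): a linear "wide" part over sparse cross features for memorization plus a "deep" MLP over embeddings for generalization, summed into a single logit. Feature sizes are placeholders.

```python
# Wide (linear over cross features) + Deep (MLP over embeddings) -> one probability.
import torch
import torch.nn as nn

class WideAndDeep(nn.Module):
    def __init__(self, num_wide_features, num_items, dim=16):
        super().__init__()
        self.wide = nn.Linear(num_wide_features, 1)   # memorization path
        self.item_emb = nn.Embedding(num_items, dim)
        self.deep = nn.Sequential(                     # generalization path
            nn.Linear(dim, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, wide_x, item_ids):
        return torch.sigmoid(self.wide(wide_x) + self.deep(self.item_emb(item_ids)))

model = WideAndDeep(num_wide_features=1000, num_items=5000)
wide_x = torch.rand(8, 1000)                  # toy one-hot / cross features
item_ids = torch.randint(0, 5000, (8,))
probs = model(wide_x, item_ids)
```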
  22. Alternative framings
  23. Sequence prediction
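A sketch of the sequence-prediction framing: embed a user's recent item ids, run them through a GRU, and predict the next item with a softmax over the catalogue. The architecture details here are assumptions, not the talk's exact model.

```python
# Next-item prediction from a sequence of recent items with a GRU.
import torch
import torch.nn as nn

class NextItemGRU(nn.Module):
    def __init__(self, num_items, dim=32, hidden=64):
        super().__init__()
        self.emb = nn.Embedding(num_items, dim)
        self.gru = nn.GRU(dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, num_items)

    def forward(self, item_seq):
        h, _ = self.gru(self.emb(item_seq))   # (batch, seq_len, hidden)
        return self.out(h[:, -1])             # logits over the catalogue

model = NextItemGRU(num_items=5000)
seqs = torch.randint(0, 5000, (16, 10))       # 16 users, 10 recent items each
next_item = torch.randint(0, 5000, (16,))
loss = nn.CrossEntropyLoss()(model(seqs), next_item)
```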
  24. Contextual sequence prediction
  25. Contextual sequence data (figure: a time-ordered sequence of timestamped events per user, each with its context, used to predict the next action)
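Building on that data layout, a sketch of the contextual variant: concatenate per-step context features (e.g. discretized time of day or device) with each item embedding before the recurrent layer, then predict the next action. The feature choices and sizes are assumptions.

```python
# Contextual next-item prediction: item embedding + context features -> GRU -> logits.
import torch
import torch.nn as nn

class ContextualNextItemGRU(nn.Module):
    def __init__(self, num_items, context_dim, dim=32, hidden=64):
        super().__init__()
        self.emb = nn.Embedding(num_items, dim)
        self.gru = nn.GRU(dim + context_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, num_items)

    def forward(self, item_seq, context_seq):
        x = torch.cat([self.emb(item_seq), context_seq], dim=-1)
        h, _ = self.gru(x)
        return self.out(h[:, -1])             # logits for the next action

model = ContextualNextItemGRU(num_items=5000, context_dim=8)
items = torch.randint(0, 5000, (16, 10))
context = torch.rand(16, 10, 8)               # e.g. time-of-day / device features
logits = model(items, context)
```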
  26. Time-sensitive sequence prediction
  27. Other framings
  28. Conclusion
  29. Takeaways
  30. More Resources
  31. Thank you. Justin Basilico & Yves Raimond (@JustinBasilico, @moustaki). Yes, we’re hiring...
