
- 1. Deep Learning for Recommender Systems Alexandros Karatzoglou Senior Research Scientist @ Telefonica Research first.lastname@gmail.com @alexk_z
- 2. Telefonica Research Machine Learning HCI Network & Systems Mobile Computing http://www.tid.es
- 3. Why Deep? ImageNet challenge error rates (red line = human performance)
- 4. Why Deep?
- 5. Inspiration for Neural Learning Early aviation attempts aimed at imitating birds and bats
- 6. Neural Model
- 7. Neuron a.k.a. Unit
- 8. Feedforward Multilayered Network
- 9. Learning
- 10. Stochastic Gradient Descent Generalization of (Stochastic) Gradient Descent
- 11. Stochastic Gradient Descent
- 12. Stochastic Gradient Descent
- 13. Stochastic Gradient Descent
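The SGD slides above boil down to one update rule, w ← w − η·∇L, applied one example at a time. A minimal sketch on a noiseless least-squares problem (the learning rate, data, and epoch count are illustrative assumptions, not values from the slides):

```python
import numpy as np

# Sketch of stochastic gradient descent: one randomly chosen example
# per update, on a noiseless least-squares problem.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w

w = np.zeros(3)
lr = 0.05  # learning rate (assumed value)
for epoch in range(50):
    for i in rng.permutation(len(X)):
        grad = 2.0 * (X[i] @ w - y[i]) * X[i]  # gradient of (x.w - y)^2
        w -= lr * grad
```

Because each step uses a single example, the gradient is noisy but cheap; on this noiseless problem the iterates still settle on the true weights.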
- 14. Feedforward Multilayered Network
- 15. Backpropagation
- 16. Backpropagation Does not work well in a plain "normal" multilayer deep network: vanishing gradients, slow learning. SVMs were easier to train → 2nd Neural Winter
- 17. Modern Deep Networks Ingredients: Rectified Linear Activation function a.k.a. ReLu
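The ReLU named on the slide is a one-line function; its gradient is 1 for positive inputs, which is what counters the vanishing gradients mentioned on the backpropagation slide. A sketch (framework implementations are equivalent):

```python
import numpy as np

# ReLU: f(x) = max(0, x). Its gradient is 1 for positive inputs and 0
# otherwise, so gradients do not shrink through active units.
def relu(x):
    return np.maximum(0.0, x)

def relu_grad(x):
    return (x > 0).astype(float)
```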
- 18. Modern Deep Networks Ingredients: Dropout:
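The slide only names dropout; a common formulation is "inverted" dropout, sketched below: randomly zero units during training and rescale the survivors so the expected activation is unchanged, which lets test-time code skip the layer entirely.

```python
import numpy as np

# Inverted dropout (one common formulation; the slide only names the
# technique): zero each unit with probability p_drop during training
# and rescale so the expected activation matches test time.
def dropout(h, p_drop=0.5, training=True, rng=None):
    if not training or p_drop == 0.0:
        return h
    rng = rng or np.random.default_rng()
    mask = rng.random(h.shape) >= p_drop
    return h * mask / (1.0 - p_drop)
```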
- 19. Modern Deep Networks Ingredients: Mini-batches: Stochastic Gradient Descent. Compute the gradient over many (50–100) data points (a mini-batch) and update.
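Mini-batching changes the SGD loop in one place: the gradient is averaged over a small batch per update instead of computed from a single example. A sketch (batch size, learning rate, and data are illustrative assumptions):

```python
import numpy as np

# Mini-batch SGD: average the gradient over a batch of examples per
# update, trading gradient noise for throughput.
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 4))
w_true = np.array([0.5, -1.0, 2.0, 0.0])
y = X @ w_true

w = np.zeros(4)
lr, batch = 0.1, 64  # assumed values
for epoch in range(20):
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch):
        b = idx[start:start + batch]
        grad = 2.0 * X[b].T @ (X[b] @ w - y[b]) / len(b)  # batch-averaged gradient
        w -= lr * grad
```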
- 20. Modern Deep Networks Ingredients: Softmax output:
- 21. Modern Deep Networks Ingredients: Categorical cross entropy loss
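The softmax output and categorical cross-entropy loss on the two slides above form a standard pairing: softmax turns scores into a probability distribution, and the loss is the negative log-probability of the correct class. A sketch:

```python
import numpy as np

# Softmax output with categorical cross-entropy loss.
def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(probs, target):
    # target is the integer class index; loss = -log p(target)
    return -np.log(probs[target])
```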
- 22. Modern Feedforward Networks Ingredients: Batch Normalization
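Batch normalization, named above, standardizes each feature over the mini-batch and then applies a learned scale and shift. A training-time forward-pass sketch (inference uses running averages instead, omitted here):

```python
import numpy as np

# Batch normalization (training-time forward pass only): standardize
# each feature over the batch, then apply learned gamma (scale) and
# beta (shift).
def batch_norm(x, gamma, beta, eps=1e-5):
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta
```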
- 23. Modern Feedforward Networks Ingredients: Adagrad a.k.a. adaptive learning rates
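Adagrad's adaptive learning rate works by dividing each parameter's step by the square root of its accumulated squared gradients, so frequently updated parameters take smaller steps. A single-step sketch:

```python
import numpy as np

# Adagrad update: per-parameter learning rates that shrink with the
# accumulated squared gradient.
def adagrad_step(w, grad, cache, lr=0.1, eps=1e-8):
    cache += grad ** 2
    w -= lr * grad / (np.sqrt(cache) + eps)
    return w, cache
```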
- 24. Restricted Boltzmann Machines
- 25. Restricted Boltzmann Machines
- 26. Convolutional Networks
- 27. Convolutional Networks [Krizhevsky 2012]
- 28. Convolutional Networks [Faster R-CNN: Ren, He, Girshick, Sun 2015] [Farabet et al., 2012]
- 29. Convolutional Networks [Faster R-CNN: Ren, He, Girshick, Sun 2015] [Farabet et al., 2012]
- 30. Convolutional Networks Self Driving Cars Convolutional example slides from Fei-Fei Li & Andrej Karpathy & Justin Johnson, Lecture 6
- 31. Convolutional Networks Stanford CS231n: Convolutional Neural Networks for Visual Recognition
- 32. Convolutional Networks
- 33. Convolutional Networks
- 34. Convolutional Networks
- 35. Convolutional Networks
- 36. Convolutional Networks
- 37. Convolutional Networks
- 38. Convolutional Networks AlexNet [Krizhevsky et al. 2012]
- 39. D-tour → Matrix Factorization (d × d rating-matrix example: 2 5 4 5 / 4 4 1 5 / 5 4 1 2 / 5 2 4 1)
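The matrix-factorization detour approximates the rating matrix as R ≈ U Vᵀ with low-rank user and item factors, classically fit by SGD on the observed entries. A minimal sketch (the 4×4 layout of the slide's numbers, the rank, and all hyperparameters are assumptions):

```python
import numpy as np

# Matrix factorization by SGD: R ~= U @ V.T with regularized updates
# per observed rating. Toy ratings below; 4x4 layout assumed.
R = np.array([[2, 5, 4, 5],
              [4, 4, 1, 5],
              [5, 4, 1, 2],
              [5, 2, 4, 1]], dtype=float)

rng = np.random.default_rng(0)
k = 2                                   # latent dimensions (assumed)
U = rng.normal(scale=0.1, size=(4, k))  # user factors
V = rng.normal(scale=0.1, size=(4, k))  # item factors
lr, reg = 0.02, 0.01                    # assumed hyperparameters
for _ in range(2000):
    for u in range(4):
        for i in range(4):
            err = R[u, i] - U[u] @ V[i]
            u_old = U[u].copy()
            U[u] += lr * (err * V[i] - reg * U[u])
            V[i] += lr * (err * u_old - reg * V[i])
```

After training, the dot product U[u] @ V[i] predicts the rating of user u for item i, including the unobserved cells in a real, sparse R.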
- 40. Convolutional Networks for enhancing Collaborative Filtering VBPR: Visual Bayesian Personalized Ranking from Implicit Feedback, He et al., AAAI 2016
- 41. Convolutional Networks for Music feature extraction Deep learning can be used to learn item profiles, e.g. for music: map audio to a lower-dimensional space where it can be used directly for recommendation. Useful for recommending music from the long tail (unpopular items) and a solution to the item cold-start problem.
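Once a CNN maps each track's audio into a low-dimensional embedding, recommendation can operate directly in that space, e.g. by nearest neighbors under cosine similarity; this works even for long-tail tracks with no interaction data. A sketch (the embeddings below are random stand-ins, not CNN outputs):

```python
import numpy as np

# Recommend tracks by cosine similarity in an audio-embedding space.
# Embeddings here are random placeholders for illustration only.
def recommend(query, item_embs, top_n=3):
    norms = np.linalg.norm(item_embs, axis=1) * np.linalg.norm(query)
    sims = item_embs @ query / norms          # cosine similarity per item
    return np.argsort(-sims)[:top_n]          # indices of the most similar

rng = np.random.default_rng(0)
items = rng.normal(size=(100, 16))  # 100 tracks, 16-d embeddings (assumed)
ranked = recommend(items[0], items)
```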
- 42. Convolutional Networks for Music feature extraction A. van den Oord, S. Dieleman, B. Schrauwen, Deep content-based music recommendation, NIPS 2013
- 43. Convolutional Networks deepart.io
- 44. Recurrent Neural Networks
- 45. Recurrent Neural Networks Long Short Term Memory
- 46. Recurrent Neural Networks
- 47. Recurrent Neural Networks PANDARUS: Alas, I think he shall be come approached and the day When little srain would be attain'd into being never fed, And who is but a chain and subjects of his death, I should not sleep. Second Senator: They are away this miseries, produced upon my soul, Breaking and strongly should be buried, when I perish The earth and thoughts of many states. DUKE VINCENTIO: Well, your wit is in the care of side and that. Second Lord: They would be ruled after this chamber, and my fair nues begun out of the fact, to be conveyed, Whose noble souls I'll have the heart of the wars. Clown: Come, sir, I will make did behold your worship. VIOLA: I'll drink it.
- 48. Recurrent Neural Networks
- 49. Recurrent Neural Networks
- 50. Recurrent Neural Networks
- 51. Recurrent Neural Networks
- 52. Session-based recommendation with Recurrent Neural Networks RNN (GRU) with a ranking loss function, ICLR 2016 [B. Hidasi et al.] Treat each user session as a sequence of clicks
- 53. Session-based recommendation with Recurrent Neural Networks RNN (GRU) with a ranking loss function, ICLR 2016 [B. Hidasi et al.] Treat each user session as a sequence of clicks
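The ranking loss in this setup can be sketched as a BPR-style pairwise objective over item scores: the score of the next clicked item should exceed the scores of sampled negatives. The GRU that produces the session state is omitted here; `session_state`, the embeddings, and the item indices are illustrative stand-ins.

```python
import numpy as np

# BPR-style pairwise ranking loss for session-based recommendation:
# push the score of the next clicked item (pos) above the scores of
# sampled negative items (neg_ids). The GRU producing session_state
# is omitted from this sketch.
def bpr_loss(session_state, item_embs, pos, neg_ids):
    scores = item_embs @ session_state             # one score per item
    diff = scores[pos] - scores[neg_ids]           # positive minus negatives
    return -np.mean(np.log(1.0 / (1.0 + np.exp(-diff))))  # -log sigmoid
```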
- 54. Autoencoders
- 55. Autoencoders
- 56. Autoencoders
- 57. Personalized Autoencoders Collaborative Denoising Auto-Encoders for Top-N Recommender Systems, Wu et al., WSDM 2016
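The denoising-autoencoder idea for top-N recommendation: corrupt the user's interaction vector, encode it to a low-dimensional code, decode, and rank unseen items by their reconstruction scores. A forward-pass sketch (the weights below are random stand-ins; the paper learns them and also adds a per-user input node this sketch omits):

```python
import numpy as np

# Denoising autoencoder forward pass for top-N recommendation:
# corrupt -> encode -> decode; rank items by reconstruction score.
def dae_forward(x, W_enc, W_dec, p_drop=0.2, rng=None):
    rng = rng or np.random.default_rng()
    x_tilde = x * (rng.random(x.shape) >= p_drop)  # corrupt the input
    h = np.tanh(x_tilde @ W_enc)                   # encode
    return h @ W_dec                               # reconstruction scores
```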
- 58. (Some) Deep Learning Software Theano: Python library; TensorFlow: Python library; Keras: high-level Python library (on top of Theano & TensorFlow); MXNet: R, Python, Julia
- 59. Thanks ● Some slides or parts of slides are taken from other excellent talks and papers on Deep Learning (e.g. by Yann LeCun, Andrej Karpathy, and other great deep learning researchers)