
- 1. Introduction to Model-Based Machine Learning. Daniel Emaasit, Ph.D. Student, Department of Civil and Environmental Engineering, University of Nevada Las Vegas, Las Vegas, NV, USA. emaasit@unlv.nevada.edu. July 20, 2016. (Slide header throughout: Introduction, Case Study, Probabilistic Programming, Conclusion.)
- 2. Introduction
- 3. (figure slide, no text)
- 4. Current Challenges in Adopting Machine Learning. Generally, current challenges in adopting ML: an overwhelming number of traditional ML methods to learn; deciding which algorithm to use, and why; some custom problems may not fit any existing algorithm.
- 5. What is Model-Based Machine Learning? A different viewpoint for machine learning, proposed by Bishop (2013, Phil. Trans. R. Soc. A, 371, pp. 1-17) and Winn et al. (2015, Microsoft Research Cambridge, http://www.mbmlbook.com). Goal: provide a single development framework which supports the creation of a wide range of bespoke models. The core idea: all assumptions about the problem domain are made explicit in the form of a model.
- 6. What is a Model in MBML? A model is a set of assumptions, expressed in mathematical or graphical form; it expresses all parameters and variables as random variables, and shows the dependencies between variables.
- 7. Key Ideas of MBML. MBML is built upon three key ideas: the use of probabilistic graphical models (PGMs); the adoption of Bayesian ML; and the application of fast, approximate inference algorithms.
- 8. Key Idea 1: Probabilistic Graphical Models. Combine probability theory with graphs (e.g., factor graphs).
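The pairing of probability theory with factor graphs can be made concrete with the factorization a factor graph encodes. The variables s1, s2, y below are generic placeholders chosen to anticipate the TrueSkill case study, not notation taken from the slides:

```latex
% A factor graph encodes how a joint distribution breaks into local
% factors, e.g. two latent skills s_1, s_2 and a game outcome y:
\[
  p(s_1, s_2, y) \;=\; p(s_1)\, p(s_2)\, p(y \mid s_1, s_2)
\]
% Each factor on the right becomes a square node in the graph,
% connected to the (circular) variable nodes it mentions.
```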
- 9. Key Idea 2: Bayesian Machine Learning. Everything follows from two simple rules of probability theory: the sum rule and the product rule.
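As an illustration of these two rules in action, here is Bayes' rule derived from the product and sum rules on a toy discrete problem; the coin probabilities and data are invented for the example, not taken from the slides:

```python
# Toy illustration of the two rules of probability: infer which of two
# coins (fair vs. biased) produced an observed flip sequence.
# All numbers are illustrative.

prior = {"fair": 0.5, "biased": 0.5}      # p(h)
p_heads = {"fair": 0.5, "biased": 0.9}    # p(H | h)

data = ["H", "H", "T", "H"]               # observed flips

# Product rule: joint p(h, data) = p(h) * p(data | h)
joint = {}
for h in prior:
    lik = 1.0
    for flip in data:
        lik *= p_heads[h] if flip == "H" else 1.0 - p_heads[h]
    joint[h] = prior[h] * lik

# Sum rule: evidence p(data) = sum over h of p(h, data)
evidence = sum(joint.values())

# Combining them gives Bayes' rule: posterior p(h | data)
posterior = {h: joint[h] / evidence for h in joint}
print(posterior)  # the biased coin comes out slightly more probable
```

Everything in Bayesian ML, including the TrueSkill model later in the deck, is this same computation with richer variables and smarter bookkeeping.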
- 10. Key Idea 3: Inference Algorithms. The application of fast, deterministic inference algorithms by local message passing: Variational Bayes; Expectation Propagation.
- 11. Stages of MBML. The 3 stages of MBML: build the model, i.e., the joint probability distribution of all the relevant variables (e.g., as a graph); incorporate the observed data; perform inference to learn the parameters of the latent variables.
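A minimal sketch of the three stages on a hypothetical coin-bias problem, using a brute-force grid approximation in place of a real inference engine; the problem and all numbers are invented for illustration:

```python
# The three MBML stages on a toy problem: estimating a coin's bias
# theta from flips, with a crude grid approximation standing in for a
# proper inference algorithm. Everything here is illustrative.

# Stage 1: build the model. theta ~ Uniform(0, 1); each flip is
# Bernoulli(theta). The joint is p(theta) * p(flips | theta).
thetas = [i / 1000 for i in range(1, 1000)]       # grid over the latent
prior = [1.0 / len(thetas)] * len(thetas)

# Stage 2: incorporate the observed data.
heads, tails = 7, 3

# Stage 3: perform inference to learn the latent variable's posterior.
likelihood = [t**heads * (1 - t)**tails for t in thetas]
unnorm = [p * l for p, l in zip(prior, likelihood)]
evidence = sum(unnorm)
posterior = [u / evidence for u in unnorm]

posterior_mean = sum(t * p for t, p in zip(thetas, posterior))
print(round(posterior_mean, 3))  # near (7 + 1) / (10 + 2) = 0.667
```

Real MBML systems replace the grid in stage 3 with the fast message-passing algorithms of Key Idea 3, but the stage boundaries are the same.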
- 12. Special cases of MBML (figure slide)
- 13. Benefits of MBML. Potential benefits of this approach: provides a systematic process for creating ML solutions; allows incorporation of prior knowledge; handles uncertainty in a principled manner; does not suffer from overfitting; custom solutions are built for specific problems; allows quick building of several alternative models, and those alternatives are easy to compare; it is general purpose, so there is no need to learn the thousands of existing ML algorithms; separates the model from the inference/training code.
- 14. Case Study
- 15. TrueSkill™ by Microsoft. Objective: determine the true skill of millions of players on Xbox Live. Why: so that players of the same skill can be matched with each other.
- 16. Stage 1: Build the Model
- 17. Stage 2: Incorporate Observed Data
- 18. Stage 3: Learn the Parameters
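A sketch of the core TrueSkill-style update those three stages produce for a single two-player game. The closed-form moment-matching formulas below follow the standard two-player, no-draw case, and the prior N(25, (25/3)²) with beta = 25/6 are the conventional defaults; treat this as an illustrative approximation, not the production algorithm (which also handles draws, teams, and skill dynamics):

```python
import math

def norm_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def skill_update(mu_w, sig_w, mu_l, sig_l, beta=25.0 / 6):
    """Gaussian moment-matching update after the first player wins
    (two players, no draws): the heart of the message-passing step."""
    c = math.sqrt(2.0 * beta**2 + sig_w**2 + sig_l**2)
    t = (mu_w - mu_l) / c
    v = norm_pdf(t) / norm_cdf(t)      # mean-shift factor
    w = v * (v + t)                    # variance-shrink factor
    new_w = (mu_w + sig_w**2 / c * v,
             sig_w * math.sqrt(1.0 - sig_w**2 / c**2 * w))
    new_l = (mu_l - sig_l**2 / c * v,
             sig_l * math.sqrt(1.0 - sig_l**2 / c**2 * w))
    return new_w, new_l

# Two fresh players with the conventional prior N(25, (25/3)^2):
winner, loser = skill_update(25.0, 25.0 / 3, 25.0, 25.0 / 3)
print(winner, loser)  # winner's mean rises, both uncertainties shrink
```

Repeating this update game after game is what the "Convergence" slide shows: the standard deviations shrink until each player's estimated skill stabilizes.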
- 19. Convergence
- 20. Multiple players - Model
- 21. Team players - Model
- 22. Skill through time - Model
- 23. Probabilistic Programming
- 24. What is Probabilistic Programming? A software package that takes the model and then automatically generates inference routines (even source code!) to solve a wide variety of models. It takes a programming language and adds support for: random variables; constraints on variables; inference. Examples of PP software packages: Infer.NET (C#, C++); Stan (R, Python, C++); BUGS; Church; PyMC (Python).
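The separation the slide describes, where the user writes only the model and inference comes for free, can be sketched without any real PP library. The tiny engine below uses likelihood weighting (self-normalized importance sampling from the prior); all names and numbers are invented for illustration and bear no relation to how Infer.NET or Stan are implemented internally:

```python
import random

# "Model" code the user writes: a prior sampler plus a likelihood.
def sample_latent():
    return random.random()                 # coin bias ~ Uniform(0, 1)

def likelihood(bias, flips):
    p = 1.0
    for heads in flips:
        p *= bias if heads else 1.0 - bias
    return p

# Generic "inference engine" the PP system would supply: likelihood-
# weighted estimate of the posterior mean of the latent variable.
def infer_mean(sample_latent, likelihood, data, n=100_000):
    num = den = 0.0
    for _ in range(n):
        z = sample_latent()
        w = likelihood(z, data)
        num += z * w
        den += w
    return num / den

random.seed(0)
data = [True, True, True, False]           # 3 heads, 1 tail
print(round(infer_mean(sample_latent, likelihood, data), 2))
```

The point mirrors the slide: swapping in a different model means rewriting only `sample_latent` and `likelihood`, while `infer_mean` stays untouched.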
- 25. How Probabilistic Programming Works: how Infer.NET works
- 26. Conclusion
- 27. Closing Remarks. Objective of the webinar: introduce the basics of model-based machine learning, and introduce probabilistic programming. Moving forward, you can now look into specialized topics such as fast Bayesian inference techniques and model building.
- 28. References:
  1. J. Winn, C. Bishop, and T. Diethe, Model-Based Machine Learning, Microsoft Research, 2015.
  2. C. M. Bishop, "Model-based machine learning," Phil. Trans. R. Soc. A, 371: 20120222, Jan. 2013.
  3. T. Minka, J. Winn, J. Guiver, and D. Knowles, Infer.NET, Microsoft Research Cambridge, 2010.
  4. Stan Development Team, "Stan Modeling Language User's Guide and Reference Manual," Version 2.9.0, 2016.
  5. Stan Development Team, "RStan: the R interface to Stan," Version 2.9.0.
  6. D. Emaasit, A. Paz, and J. Salzwedel, "A Model-Based Machine Learning Approach for Capturing Activity-Based Mobility Patterns using Cellular Data," IEEE ITSC 2016, under review.
- 29. Contact Me: daniel.emaasit@gmail.com, github.com/Emaasit, www.danielemaasit.com, [@Emaasit](https://twitter.com/Emaasit)
- 30. Two players - Model
- 31. Two players - Code
- 32. Multiple players - Model
- 33. Multiple players - Code
- 34. Team players - Model
- 35. Team players - Code
- 36. Skill through time - Model
- 37. Skill through time - Code
- 38. Figure 1: PCA model as a Bayesian network and a directed factor graph. (figure slide)
- 39. (figure slide: plate diagram with Dirichlet priors αθ, αφ on θ and φ, Multinomial variables X and T, plates over 1 ≤ i ≤ nd and t ∈ T)
