
- 1. Introduction to Machine Learning. Shao-Chuan Wang, Research Center for IT Innovation, Multimedia and Machine Learning Lab, Academia Sinica. NTNU.
- 2. Outline
  - What is involved in intelligence?
  - Why is machine learning important?
  - What can machine learning do?
  - Overview of machine learning applications
  - Challenges of machine learning
  - Future of machine learning
- 3. What Is Intelligence?
- 4. What Is Involved in Intelligence?
  - From Merriam-Webster, "intelligence" is (1) the ability to learn or understand or to deal with new or trying situations; (2) the ability to apply knowledge to manipulate one's environment or to think abstractly as measured by objective criteria.
  - Abstraction (finding the common patterns) vs. adaptation.
  - Learning is dynamic, e.g. a computer playing chess.
- 5. Why Is Machine Learning Important? (1/4)
  - The explosion of data.
- 6. Why Is Machine Learning Important? (2/4)
  - Some places are NOT for humans.
- 7. Why Is Machine Learning Important? (3/4)
  - Machine learning can help us understand human learning.
- 8. Why Is Machine Learning Important? (4/4)
  - Intelligent machines can help!
- 9. What Can Machine Learning Do?
- 10. Application One: Handwriting Recognition (video)
- 11. Application Two: Face Detection and Tracking (video)
- 12. Application Three: Autonomous Driving (video)
- 13. Overview of Machine Learning Applications
  - Speech recognition
  - Computer vision
  - Bio-surveillance
  - Robotics
  - Data mining
- 14. What Is Learning?
- 15. A Tree Recognition Example (1/2)
  - Suppose that you have never seen trees before, and I give you some EXAMPLES: tree examples and non-tree examples.
- 16. A Tree Recognition Example (2/2)
  - I will then ask you whether unseen query images are trees or not: yes or no, and with how much confidence?
- 17. What Is Learning?
  - (Mitchell 2002) Learning is to improve the performance measure P on a task T based on past experience E.
    - T: recognizing a tree
    - P: recognition accuracy
    - E: the examples that I gave you
  - Two key elements of learning: memorization of past experiences, and the ability to generalize (to infer the general case from a few examples).
- 18. A Simple Algorithm: Nearest Neighbor
  - For a given query image, find the nearest image to the query in the database, and assign the label of that nearest image to the query.
  - In the tree example, the database image with the smallest difference (1.5, against candidates scoring 5.5 to 13) determines the answer: Tree!
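The nearest-neighbor rule above can be sketched in a few lines of Python. The tiny 2x2 "images" and their labels below are made-up stand-ins for the photo database, and Euclidean distance on raw pixels stands in for the slide's unspecified "difference" measure:

```python
import numpy as np

def nearest_neighbor_label(query, database_images, database_labels):
    """Return the label of the database image closest to the query.

    Images are flattened pixel arrays; the "difference" here is
    Euclidean distance, one of many possible choices.
    """
    distances = [np.linalg.norm(query - img) for img in database_images]
    return database_labels[int(np.argmin(distances))]

# Toy 2x2 "images": two bright (tree) and two dark (not-tree) examples.
database = [np.array([9.0, 9, 8, 9]), np.array([8.0, 9, 9, 8]),
            np.array([1.0, 0, 1, 1]), np.array([0.0, 1, 0, 0])]
labels = ["tree", "tree", "not tree", "not tree"]

print(nearest_neighbor_label(np.array([9.0, 8, 9, 9]), database, labels))  # tree
```

Note that nothing is "trained" here: the algorithm simply memorizes every past example, which is one of the two key elements of learning named on slide 17.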
- 19. What Were We Modeling?
  - The human concept "tree" exists but is unknown. During training, a machine (a learning algorithm) infers that concept from the labeled examples (YES/NO) and then predicts the label of a query (here: NO).
- 20. What if we do not have label ground truth? (Or labels are very expensive.)
- 21. Unsupervised Learning
  - Clustering: each segment forms a "cluster".
  - Pattern discovery.
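The slides do not name a specific clustering method, so as one illustrative choice, here is a minimal sketch of k-means: alternately assign each point to its nearest centroid, then move each centroid to the mean of its assigned points. The sample points are invented:

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means clustering with a fixed number of iterations."""
    rng = np.random.default_rng(seed)
    # Initialize centroids at k distinct random data points.
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        dists = np.linalg.norm(points[:, None] - centroids[None], axis=2)
        assign = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned points.
        for j in range(k):
            if (assign == j).any():
                centroids[j] = points[assign == j].mean(axis=0)
    return assign, centroids

# Two well-separated blobs of points should end up in two clusters.
pts = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.1, 4.9]])
assign, centers = kmeans(pts, k=2)
```

No labels are provided anywhere; the structure (two groups) is discovered from the data alone, which is exactly what distinguishes this from the supervised tree example.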
- 22. Example: Amazon.com
  - Marketing: recommendations of similar goods.
- 23. Challenges of Machine Learning
  - Data representation: how do we model the "difference" between two images?
  - Difference metric: what is the "score" or "difference" function? How did we calculate the distance values in the tree example?
  - Learning: does it model well (can it accurately predict the seen data)? Does it generalize well (can that be proved)?
- 24. Example: Sea Bass or Salmon?
  - Suppose that we have only two kinds of fish, and we want a computer system that helps us distinguish sea bass from salmon.
  - Process: take a picture, pass it to the computer, get a decision.
- 25. Example: Sea Bass or Salmon?
  - How do we describe a fish (data representation)? What kinds of information can help us distinguish one from the other: length, width, size of fins, tail shape, color?
  - How do we measure distinctness under the chosen data representation (difference metric)? E.g. if we choose length, then distinctness can be measured by the absolute difference of the length values.
- 26. Example: Sea Bass or Salmon?
  - Assume that a fisherman (prior domain knowledge) tells us that salmon are generally longer than sea bass. We may then use length as a feature to discriminate between them. But how?
- 27. Example: Sea Bass or Salmon?
  - Using "past experiences", we compute a histogram of lengths for the two types of fish, then apply the nearest-neighbor rule to their average lengths.
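Applying nearest neighbor to the class-average lengths amounts to a simple rule: assign the query to whichever class's average length is closer, which is the same as thresholding at the midpoint between the two averages. A sketch with invented training lengths (the slides give no numbers):

```python
# Hypothetical training lengths in cm; per the fisherman's prior
# knowledge, salmon run longer than sea bass.
salmon_lengths = [70, 75, 80, 85]
seabass_lengths = [50, 55, 60, 65]

avg_salmon = sum(salmon_lengths) / len(salmon_lengths)     # 77.5
avg_seabass = sum(seabass_lengths) / len(seabass_lengths)  # 57.5

def classify_by_length(length):
    """Nearest-average rule: equivalent to a threshold at the midpoint."""
    if abs(length - avg_salmon) < abs(length - avg_seabass):
        return "salmon"
    return "sea bass"

print(classify_by_length(72))  # salmon
```

A fish of length 67.6, just above the midpoint of 67.5, would also be called salmon, even though real sea bass of that length surely exist; that ambiguity near the threshold is exactly the difficulty raised on the next slide.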
- 28. Example: Sea Bass or Salmon?
  - The difficulty comes from the ambiguity around the threshold value: length alone is insufficient to "describe" the fish.
  - Remedies: use more features such as width and color, or other manipulations, e.g. apply nearest neighbor to the median length instead; would that be better?
  - Let's try one more feature: width.
- 29. Example: Sea Bass or Salmon?
  - Using two features, each fish image is represented as a 2-D feature vector: length x1 and width x2.
- 30. Example: Sea Bass or Salmon?
  - There are still misclassified training examples.
- 31. Example: Sea Bass or Salmon?
  - Why use a line? We could use a more complex boundary, but then we would radically change the boundary just to accommodate a few outliers, and the model may not generalize well.
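With two features, one simple way to obtain a linear decision boundary (consistent with the slides' straight-line classifier, though the slides do not specify which method produced it) is the nearest-class-mean rule: the implied boundary is the line equidistant from the two class means. All (length, width) numbers below are hypothetical:

```python
import numpy as np

# Hypothetical (length, width) feature vectors for each class.
salmon = np.array([[78.0, 10], [82, 11], [75, 9]])
seabass = np.array([[58.0, 12], [62, 13], [55, 11]])

mu_salmon = salmon.mean(axis=0)
mu_seabass = seabass.mean(axis=0)

def predict(x):
    """Nearest-class-mean rule; its decision boundary is the
    perpendicular bisector of the segment joining the two means,
    i.e. a straight line in the 2-D feature space."""
    x = np.asarray(x, dtype=float)
    if np.linalg.norm(x - mu_salmon) < np.linalg.norm(x - mu_seabass):
        return "salmon"
    return "sea bass"

print(predict([80, 10]))  # salmon
```

Keeping the boundary this simple means some training fish on the wrong side stay misclassified; bending the boundary to capture each of them would trade training error for generalization, which is the trade-off the slide warns about.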
- 32. Challenges of Machine Learning
  - Conclusions from this example:
    - We have to incorporate prior knowledge to decide which features to use; at present, there is no universal learning machine.
    - We want features that are invariant within a species but distinct between species.
    - There is a trade-off between the complexity of a decision model and its training error.
- 33. The Future of Machine Learning
  - Theoretical foundations of learning
  - Scalability (parallelism)
  - Robustness to dynamic environments
- 34. Questions?
- 35. Thank you for your attention!
- 36. Learning Schemes
  - Supervised learning: label ground truth is provided; the tree example is a supervised learning problem.
  - Unsupervised learning: label ground truth is NOT provided.
  - Reinforcement learning: the way you train your pets.
