# Introduction To Applied Machine Learning


This is the first lecture on Applied Machine Learning. The course focuses on the emerging and modern aspects of the subject, such as deep learning, recurrent and recursive neural networks (RNNs), long short-term memory (LSTM), convolutional neural networks (CNNs), and hidden Markov models (HMMs). It covers several application areas, such as natural language processing and image understanding. This presentation provides the landscape.

Published in: Technology
### Introduction To Applied Machine Learning

1. Introduction to Machine Learning
Applied Machine Learning: Unit 1, Lecture 1
Anantharaman Narayana Iyer
narayana dot Anantharaman at gmail dot com
8 Jan 2016
2. References
• Pattern Recognition and Machine Learning, Christopher Bishop
• Machine Learning, Tom Mitchell
• MOOC courses offered by Prof Andrew Ng, Prof Yaser Abu-Mostafa, and Prof Pedro Domingos (see image)
• CMU lecture videos by Prof Tom Mitchell
• Introduction to Machine Learning, Alpaydin
3. "A breakthrough in machine learning would be worth 10 Microsofts." (Bill Gates)
4. Let's start with a puzzle: predict what is next?

| Sample | Input | Output |
|---|---|---|
| 1 | (10, 1, -6, -1, 200) | (-6, -1, 1, 10, 200) |
| 2 | (27, 0, 3000, 7, -3) | (-3, 0, 7, 27, 3000) |
| 3 | (111, 222, 333, 444, 555) | (111, 222, 333, 444, 555) |
| 4 | (76, 69, 80, 55, 98) | (55, 69, 76, 80, 98) |
| 5 | (7, 6, 5, 4, 3) | (3, 4, 5, 6, 7) |
| 6 | (0, -1, -2, -3, 100) | ? |
| 7 | (1000, 900, 2000, 1, 9999) | ? |
5. Let's start with a puzzle: predict what is next? (table repeated from slide 4)
ML uses a set of observations to uncover an underlying process.
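The hidden process behind the puzzle is ordinary ascending sort, which a later slide alludes to ("can we use ML to sort real numbers?"). A short check, my own illustration rather than code from the lecture, confirms that every observed input/output pair is consistent with sorting and then applies the recovered rule to the two unknown samples:

```python
# Every known input/output pair from the puzzle is consistent with sorted().
samples = {
    (10, 1, -6, -1, 200): (-6, -1, 1, 10, 200),
    (27, 0, 3000, 7, -3): (-3, 0, 7, 27, 3000),
    (111, 222, 333, 444, 555): (111, 222, 333, 444, 555),
    (76, 69, 80, 55, 98): (55, 69, 76, 80, 98),
    (7, 6, 5, 4, 3): (3, 4, 5, 6, 7),
}
assert all(tuple(sorted(x)) == y for x, y in samples.items())

# Applying the recovered rule to samples 6 and 7:
print(sorted((0, -1, -2, -3, 100)))        # [-3, -2, -1, 0, 100]
print(sorted((1000, 900, 2000, 1, 9999)))  # [1, 900, 1000, 2000, 9999]
```

This is exactly the sense in which ML "uses a set of observations to uncover an underlying process": the learner is never told the rule, only examples of it.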
6. When to apply ML?
• Distinguishing characteristics of machine learning applications:
  • Problems that cannot be solved adequately using analytical approaches
  • There is an underlying pattern that produces the observable data
  • An adequate amount of data is available to learn and generalize
• Points to ponder:
  • Can we use machine learning techniques to perform sorting of real numbers?
  • What makes applications like handwriting recognition more suitable as candidates for machine learning?
7. Exercise: which problems below are best suited for applying machine learning?
1. Recognizing handwritten characters, e.g. the MNIST dataset
2. Speaker identification from a video
3. Searching for the telephone number of a person, given their name, in the BSNL telephone directory
4. Searching for documents on the web that match a query
5. Removing line-frequency noise (50 Hz) from a feeble DC signal (say, a few microvolts)
6. Performing Fourier analysis on a complex signal
7. Classifying or clustering a voice signal into a male voice, a female voice, and a child's voice
8. Identifying a specific conversation out of the several parallel conversations that happen during a party
8. What is ML?
• Arthur Samuel (1959). Machine learning: field of study that gives computers the ability to learn without being explicitly programmed.
• Tom Mitchell (1998): ML is the study of algorithms that:
  • improve their performance P
  • at some task T
  • with experience E
A well-defined learning task is specified by the triple (P, T, E).
• Examples: spam detection, robot navigation, image recognition
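Mitchell's (P, T, E) framing can be made concrete with a toy learner. The sketch below is my own illustration (the task, data, and function names are not from the slides): the task T is classifying a number into one of two clusters, the performance P is test-set accuracy, and the experience E is a set of labelled training examples; a nearest-centroid model stands in for "learning".

```python
# Toy (P, T, E) illustration:
#   T: classify a number as cluster A (around 0) or cluster B (around 4)
#   P: accuracy on a held-out test set
#   E: labelled training examples used to estimate the class centroids
import random

random.seed(0)

def draw(label, n):
    """Draw n labelled points from cluster A (mean 0) or cluster B (mean 4)."""
    mean = 0.0 if label == 0 else 4.0
    return [(random.gauss(mean, 1.0), label) for _ in range(n)]

def train(examples):
    """'Learning' here is just estimating the mean of each class."""
    means = {}
    for label in (0, 1):
        vals = [x for x, y in examples if y == label]
        means[label] = sum(vals) / len(vals)
    return means

def accuracy(means, test):
    """Performance P: fraction of test points assigned to the nearer centroid's class."""
    correct = sum(1 for x, y in test
                  if min(means, key=lambda c: abs(x - means[c])) == y)
    return correct / len(test)

test_set = draw(0, 100) + draw(1, 100)
acc_small = accuracy(train(draw(0, 3) + draw(1, 3)), test_set)      # little experience E
acc_large = accuracy(train(draw(0, 200) + draw(1, 200)), test_set)  # much more experience E
print(acc_small, acc_large)
```

With well-separated clusters, the larger experience set yields reliably high accuracy, which is precisely what "improves performance P with experience E" means.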
9. Performance metrics
• The metric should be selected based on the application we are solving
• Examples: precision, recall, F1 score, mean squared error
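The four metrics named on the slide are simple to compute from scratch. The following is a minimal sketch (the function names are mine): precision, recall, and F1 for binary classification, and mean squared error for regression.

```python
# From-scratch versions of the metrics on the slide.

def precision_recall_f1(y_true, y_pred):
    """Binary-classification metrics (labels are 0/1, with 1 the positive class)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0   # of predicted positives, how many were right
    recall = tp / (tp + fn) if tp + fn else 0.0      # of actual positives, how many were found
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

def mean_squared_error(y_true, y_pred):
    """Regression metric: average squared deviation."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

p, r, f1 = precision_recall_f1([1, 1, 0, 0, 1], [1, 0, 1, 0, 1])
# tp=2, fp=1, fn=1, so precision = recall = f1 = 2/3
mse = mean_squared_error([1.0, 2.0, 3.0], [1.0, 2.5, 2.0])
# squared errors 0, 0.25, 1.0, so mse = 1.25/3
```

Which metric matters depends on the application: for spam detection, precision (not flagging legitimate mail) is often weighted differently from recall.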
10. ML applications
• Well-researched applications
  • Traditional models have yielded sufficiently accurate results (90% and above)
  • Such applications have become mainstream
  • This is not to say that research on these topics has ceased, but most of the common use cases are adequately addressed
  • Examples: OCR, image classification, isolated-word speech recognition, text classification, basic recommender systems, spam detection, part-of-speech tagging
• Emerging, breakthrough applications
  • Advances in ML, such as deep learning, have enabled exciting, novel applications
  • Examples: image description, information extraction from video, personal assistant systems like Siri, sarcasm detection from multimedia
11. More examples (ref: T Mitchell)
12. Application domains for ML (ref: T Mitchell)
13. Applications of ML
• Web search
• Information extraction
• Spam detection
• Social networks
• E-commerce
• Finance
• Speech recognition
• Robotics
• Computer vision
• and so on…
14. Application of ML techniques in NLP
• Core NLP
  • POS tagging
  • NER
  • Text classification
  • Language models
  • Information retrieval
  • Word representation
  • Information extraction
  • Speech/voice recognition
• Applications of NLP
  • Search engines
  • Topic modelling
  • Sentiment analysis
  • Sentic computing
  • Intent analysis
  • Subject identification
  • Real-word spelling correction
  • Content synthesis
  • Speech-based applications
  • Handwritten text recognition
15. What are we going to cover in the course? Key topics
• The focus of our course is contemporary deep learning techniques, including:
  • Multi-layer deep learning networks (spatially deep)
  • Recurrent and recursive networks (temporally deep)
  • LSTM and its variants, such as GRU
  • Convolutional networks
  • Hybrid models that combine deep networks with other systems, such as HMMs
• We will also cover the basics, such as:
  • Probability theory and the Naïve Bayes classifier
  • Linear models
  • Log-linear models
  • SVM
  • Feed-forward artificial neural networks
• Applications (examples)
  • Text processing: bias/sarcasm/animosity detection
  • Image: describing images
  • Video: detecting interesting events in a given video
  • Audio: measuring the quality of news-hour debates on mainstream Indian TV media
16. Course plan
• Unit 1: Machine learning basics
• Unit 2: Machine learning advanced topics, part 1
• Unit 3: Machine learning advanced topics, part 2
• Unit 4: Machine learning for text processing
• Unit 5: Applying ML/DL to multimedia (audio, video, images)

Important note: due to the fast pace of change in this domain, the working syllabus covered this semester might deviate from the published syllabus. Students will be assessed as per the working syllabus.
17. Evaluation plan
• T1: 2-hour theory exam (open book) and 4-hour lab: 10% + 10% = 20%
• T2: 2-hour theory exam (open book) and 4-hour lab: 10% + 10% = 20%
• Final exam: 3-hour theory exam (open book) and 5-hour lab: 25% + 30% = 55%
• Class participation: 5%

Notes:
• All written exams will be open book but not open internet. Mobile phones are disallowed. You may use a scientific calculator.
• There will be hands-on assignments. Submission of these assignments is a prerequisite for attempting the final exam.
• Based on the class strength and aptitude, each project group may optionally be assigned a mentor, who will act as your advisor. This is at the discretion of the faculty.
18. Classes of applications (examples)

| Application | Example | Input Type | Output Type | Typical Classifier |
|---|---|---|---|---|
| OCR | MNIST | Image pixels (static) | Class label | ANN with softmax |
| Speech recognition | | Sequence of phonemes | Sequence of words | HMM |
| Multiple object recognition within an image | | Image pixels (static) | Labels of each object | CNN with logistic |
| High-fidelity face recognition | | Image pixels (static) | Class label | DNN with softmax |
| Spam detection | | Text | Class label | MaxEnt classifier |
| Image captioning | | Image pixels (static) | Text | CNN + recursive NN |
| C-section risk detection | | Vector of real numbers, Boolean | Class label | Decision trees |
| Cricket player selection | | Vector of real numbers | Class label | Logistic regression |
| IPL player auction price | | Vector of real numbers | Real number | Linear regression, or NN with a linear output layer |
| Named entity recognition | | Sequence of words | Sequence of labels | RNN, MEMM, CRF |
| Basic sentiment analysis | | Sequence of words | Class label | Naïve Bayes classifier |
| Parsing unstructured text | | Sequence of words | Parse tree | Recursive NN |
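Several classifiers in the table end in a softmax output layer (the "ANN with softmax" and "DNN with softmax" entries). As a minimal sketch of that final layer alone, not code from the course, here is a numerically stable softmax that maps raw class scores to a probability distribution over class labels:

```python
# Numerically stable softmax: the output layer used by the
# "ANN/DNN with softmax" classifiers in the table above.
import math

def softmax(scores):
    """Map raw class scores (logits) to probabilities that sum to 1."""
    m = max(scores)                       # subtract the max so exp() never overflows
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
# the highest score gets the highest probability, and the probabilities sum to 1
```

The predicted class label is then simply the index of the largest probability, which is why softmax pairs naturally with the "class label" output type in the table.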