
An introduction to several techniques of machine learning.

License: CC Attribution-NonCommercial License


- 1. Machine Learning Andrea Iacono https://github.com/andreaiacono/MachineLearning
- 2. Machine Learning: Intro What is Machine Learning? [Wikipedia]: a branch of artificial intelligence that allows the construction and the study of systems that can learn from data
- 3. Machine Learning: Intro Some approaches: - Regression analysis - Similarity and metric learning - Decision tree learning - Association rule learning - Artificial neural networks - Genetic programming - Support vector machines (classification and regression analysis) - Clustering - Bayesian networks
- 4. Machine Learning: Intro Supervised learning vs Unsupervised learning Machine learning vs Data mining
- 5. Machine Learning: Regression analysis Regression Analysis A statistical technique for estimating the relationships between a dependent variable and one or more independent variables
- 6. Machine Learning: Regression analysis Prediction of house prices. Data (size x → price y): 0.80 → 70, 0.90 → 83, 1.00 → 74, 1.10 → 93, 1.40 → 89, 1.40 → 58, 1.50 → 85, 1.60 → 114, 1.80 → 95, 2.00 → 100, 2.40 → 138, 2.50 → 111, 2.70 → 124, 3.20 → 172, 3.50 → 172
- 7. Machine Learning: Regression analysis Prediction of house prices. Hypothesis: h_θ(x) = θ0 + θ1·x
- 8. Machine Learning: Regression analysis Prediction of house prices. Hypothesis: h_θ(x) = θ0 + θ1·x. Cost function for linear regression: J(θ0, θ1) = (1/2m) · Σ_{i=1..m} (h_θ(x^(i)) − y^(i))²
- 9. Machine Learning: Regression analysis Prediction of house prices. Hypothesis: h_θ(x) = θ0 + θ1·x. Cost function for linear regression: J(θ0, θ1) = (1/2m) · Σ_{i=1..m} (h_θ(x^(i)) − y^(i))². Gradient descent, repeat until convergence (updating θ0 and θ1 simultaneously): θ0 := θ0 − α · (1/m) · Σ_{i=1..m} (h_θ(x^(i)) − y^(i)); θ1 := θ1 − α · (1/m) · Σ_{i=1..m} (h_θ(x^(i)) − y^(i)) · x^(i)
- 10. Machine Learning: Regression analysis Prediction of house prices Iterative minimization of cost function with gradient descent
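The update rule from slide 9 can be run directly on the house-price data from slide 6. A minimal sketch; the learning rate α = 0.1 and the iteration count are illustrative choices, not values from the talk:

```python
# Gradient descent for h(x) = theta0 + theta1 * x on the house-price data.
xs = [0.80, 0.90, 1.00, 1.10, 1.40, 1.40, 1.50, 1.60, 1.80, 2.00, 2.40, 2.50, 2.70, 3.20, 3.50]
ys = [70, 83, 74, 93, 89, 58, 85, 114, 95, 100, 138, 111, 124, 172, 172]

def cost(theta0, theta1):
    """J(theta0, theta1): mean squared error halved, as on slide 8."""
    m = len(xs)
    return sum((theta0 + theta1 * x - y) ** 2 for x, y in zip(xs, ys)) / (2 * m)

def gradient_descent(alpha=0.1, iterations=5000):
    theta0, theta1 = 0.0, 0.0
    m = len(xs)
    for _ in range(iterations):
        errors = [theta0 + theta1 * x - y for x, y in zip(xs, ys)]
        grad0 = sum(errors) / m
        grad1 = sum(e * x for e, x in zip(errors, xs)) / m
        theta0 -= alpha * grad0  # simultaneous update of both parameters
        theta1 -= alpha * grad1
    return theta0, theta1

theta0, theta1 = gradient_descent()
print(theta0, theta1)  # converges near the least-squares fit (theta0 ≈ 36.8, theta1 ≈ 36.9)
```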
- 11. Machine Learning: Regression analysis Hands on
- 12. Machine Learning: Regression analysis Regression analysis: one / multiple variables; linear / higher-order curves; several optimization algorithms (linear regression, logistic regression, simulated annealing, ...)
- 13. Machine Learning: Regression analysis Overfitting vs underfitting
- 14. Machine Learning: Similarity and metric learning Similarity and metric learning - concept of distance
- 15. Machine Learning: Similarity and metric learning Euclidean distance: d(p, q) = √(Σ_{i=1..n} (p_i − q_i)²)
- 16. Machine Learning: Similarity and metric learning Manhattan distance: d(p, q) = Σ_{i=1..n} |p_i − q_i|
- 17. Machine Learning: Similarity and metric learning Pearson's correlation: r(p, q) = (Σ_{i=1..n} p_i·q_i − (Σ_{i=1..n} p_i · Σ_{i=1..n} q_i) / n) / √((Σ_{i=1..n} p_i² − (Σ_{i=1..n} p_i)² / n) · (Σ_{i=1..n} q_i² − (Σ_{i=1..n} q_i)² / n))
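The three measures above translate almost verbatim into code; a minimal sketch (the Pearson formula in the computational form shown on the slide):

```python
from math import sqrt

def euclidean(p, q):
    """Straight-line distance between two points in n-dimensional space."""
    return sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

def manhattan(p, q):
    """Sum of the absolute differences along each axis."""
    return sum(abs(pi - qi) for pi, qi in zip(p, q))

def pearson(p, q):
    """Linear correlation between two rating vectors, in [-1, 1]."""
    n = len(p)
    sum_p, sum_q = sum(p), sum(q)
    sum_pq = sum(pi * qi for pi, qi in zip(p, q))
    sum_p2 = sum(pi ** 2 for pi in p)
    sum_q2 = sum(qi ** 2 for qi in q)
    num = sum_pq - sum_p * sum_q / n
    den = sqrt((sum_p2 - sum_p ** 2 / n) * (sum_q2 - sum_q ** 2 / n))
    return num / den

print(euclidean((0, 0), (3, 4)), manhattan((0, 0), (3, 4)), pearson([1, 2, 3], [2, 4, 6]))
```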
- 18. Machine Learning: Similarity and metric learning Collaborative filtering Searches a large group of users to find a small subset whose tastes are similar to yours; based on what this subset likes or dislikes, the system can recommend other items to you. Two main approaches: - User-based filtering - Item-based filtering
- 19. Machine Learning: Similarity and metric learning User based filtering - based on ratings given to the items, we can measure the distance among users - we can recommend to the user the items that have the highest ratings among the closest users
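These two steps can be sketched in a few lines. The ratings data set is made up for illustration, and the 1/(1 + distance) similarity weighting is an assumption, not a formula from the talk:

```python
# User-based collaborative filtering sketch: ratings is {user: {item: rating}}.
from math import sqrt

ratings = {
    "alice": {"matrix": 5, "inception": 4, "titanic": 1},
    "bob":   {"matrix": 5, "inception": 5, "titanic": 1, "avatar": 4},
    "carol": {"matrix": 1, "inception": 2, "titanic": 5, "avatar": 2},
}

def distance(a, b):
    """Euclidean distance over the items both users rated."""
    common = set(ratings[a]) & set(ratings[b])
    return sqrt(sum((ratings[a][i] - ratings[b][i]) ** 2 for i in common))

def recommend(user):
    """Items the user has not rated, scored by similarity-weighted ratings."""
    scores, weights = {}, {}
    for other in ratings:
        if other == user:
            continue
        sim = 1.0 / (1.0 + distance(user, other))  # closer users weigh more
        for item, rating in ratings[other].items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * rating
                weights[item] = weights.get(item, 0.0) + sim
    return sorted(((scores[i] / weights[i], i) for i in scores), reverse=True)

print(recommend("alice"))  # alice has not rated "avatar" yet
```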
- 20. Machine Learning: Similarity and metric learning Hands on
- 21. Machine Learning: Similarity and metric learning Is user based filtering good for - scalability? - sparse data? - quickly changing data?
- 22. Machine Learning: Similarity and metric learning Is user based filtering good for - scalability? - sparse data? - quickly changing data? No, it's better to use item based filtering
- 23. Machine Learning: Similarity and metric learning Euclidean distance for item-based filtering: nothing has changed! - based on the ratings received from the users, we can measure the distance among items - we can recommend an item to a user by picking the items closest to the ones the user rated highest
- 24. Machine Learning: Similarity and metric learning Hands on
- 25. Machine Learning: Bayes' classifier Bayes' theorem: P(A|B) = P(B|A) · P(A) / P(B). Example: given a company where 70% of developers use Java and 30% use C++, and knowing that half of the Java developers always use the enhanced for loop, if you look at the snippet: for (int j=0; j<100; j++) { t = tests[j]; } what is the probability that the developer who wrote it uses Java?
- 26. Machine Learning: Bayes' classifier Bayes' theorem: P(A|B) = P(B|A) · P(A) / P(B). Example: given a company where 70% of developers use Java and 30% use C++, and knowing that half of the Java developers always use the enhanced for loop, if you look at the snippet: for (int j=0; j<100; j++) { t = tests[j]; } what is the probability that the developer who wrote it uses Java? Hint: A = developer uses Java, B = developer writes old-style for loops
- 27. Machine Learning: Bayes' classifier Bayes' theorem: P(A|B) = P(B|A) · P(A) / P(B). Example: given a company where 70% of developers use Java and 30% use C++, and knowing that half of the Java developers always use the enhanced for loop, if you look at the snippet: for (int j=0; j<100; j++) { t = tests[j]; } what is the probability that the developer who wrote it uses Java? Solution: A = developer uses Java, B = developer writes old-style for loops. P(A) = prob. that a developer uses Java = 0.7. P(B) = prob. that any developer uses old-style for loops = 0.3 + 0.7 · 0.5 = 0.65 (all the C++ developers, plus half of the Java developers). P(B|A) = prob. that a Java developer uses old-style for loops = 0.5. P(A|B) = P(B|A) · P(A) / P(B) = 0.5 · 0.7 / 0.65 ≈ 0.54
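The slide's arithmetic, checked in a couple of lines:

```python
# A = "developer uses Java", B = "developer writes old-style for loops".
p_a = 0.7                     # P(A): developer uses Java
p_b_given_a = 0.5             # P(B|A): a Java developer writes old-style loops
p_b = 0.3 + 0.7 * 0.5         # P(B): all the C++ devs, plus half of the Java devs
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 2))  # 0.54
```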
- 28. Machine Learning: Bayes' classifier Naive Bayes' classifier - supervised learning - trained on a set of known classes - computes probabilities of elements to be in a class - smoothing required. Combined probability: P_c(w_1, ..., w_n) = Π_{i=1..n} P(c|w_i) / (Π_{i=1..n} P(c|w_i) + Π_{i=1..n} (1 − P(c|w_i)))
- 29. Machine Learning: Bayes' classifier Naive Bayes' classifier Example - we want a classifier for Twitter messages - define a set of classes: {art, tech, home, events, ...} - train the classifier with a set of already classified tweets - when a new tweet arrives, the classifier will (hopefully) tell us which class it belongs to
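A toy version of such a classifier, combining per-word probabilities with the formula from slide 28. The add-one smoothing used to estimate P(c|w) is an assumption for illustration, not the exact scheme from the talk:

```python
from collections import defaultdict

class NaiveBayes:
    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))  # class -> word -> count

    def train(self, text, cls):
        for word in text.lower().split():
            self.counts[cls][word] += 1

    def p_class_given_word(self, cls, word):
        # Smoothed estimate of P(c|w), so unseen words do not zero the product.
        num = self.counts[cls][word] + 1
        den = sum(self.counts[c][word] for c in self.counts) + len(self.counts)
        return num / den

    def classify(self, text):
        best_cls, best_score = None, -1.0
        for cls in self.counts:
            prod = prod_inv = 1.0
            for word in text.lower().split():
                p = self.p_class_given_word(cls, word)
                prod *= p
                prod_inv *= 1 - p
            score = prod / (prod + prod_inv)  # the slide-28 combination formula
            if score > best_score:
                best_cls, best_score = cls, score
        return best_cls

nb = NaiveBayes()
nb.train("new java framework released", "tech")
nb.train("amazing gallery opening tonight", "art")
print(nb.classify("java framework update"))  # tech
```

With a real training set the same structure applies; only the volume of classified tweets changes.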
- 30. Machine Learning: Bayes' classifier Hands on
- 31. Machine Learning: Bayes' classifier Sentiment analysis - define two classes: { +, − } - define a set of words: { like, enjoy, hate, bore, fun, ... } - train a NBC with a set of known +/− comments - let the NBC classify any new comment as + or −. Performance depends on the quality of the training set.
- 32. Machine Learning: Clustering Clustering - Unsupervised learning - Different algorithms: - Hierarchical clustering - K-Means clustering - ... Common use cases: - navigation habits - online commerce - social/political attitudes - ...
- 33. Machine Learning: Clustering K-Means clustering K-Means aims at identifying cluster centroids such that an item belonging to a cluster X is closer to the centroid of cluster X than to the centroid of any other cluster.
- 34. Machine Learning: Clustering K-Means clustering The algorithm requires a number of clusters to start, in this case 3. The centroids are placed in the item space, typically in random locations.
- 35. Machine Learning: Clustering K-Means clustering The algorithm will then assign to each centroid all items that are closer to it than to any other centroid.
- 36. Machine Learning: Clustering K-Means clustering The centroids are then moved to the center of mass of the items in the clusters.
- 37. Machine Learning: Clustering K-Means clustering A new iteration occurs, taking into account the new centroid positions.
- 38. Machine Learning: Clustering K-Means clustering The centroids are again moved to the center of mass of the items in the clusters.
- 39. Machine Learning: Clustering K-Means clustering Another iteration occurs, taking into account the new centroid positions.
- 40. Machine Learning: Clustering K-Means clustering The centroids are again moved to the center of mass of the items in the clusters.
- 41. Machine Learning: Clustering K-Means clustering Another iteration occurs, taking into account the new centroid positions. Note that this time the cluster membership did not change. The cluster centers will not move anymore.
- 42. Machine Learning: Clustering K-Means clustering The solution is found.
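The assignment/update loop described in slides 33-42 in a minimal sketch. Here k = 3 and the starting centroids are fixed by hand (the slides note that real implementations typically place them randomly):

```python
# K-means on 2-D points.
import math

points = [(1, 1), (1.5, 2), (1, 0.5),    # group near (1.2, 1.2)
          (8, 8), (9, 8.5), (8.5, 9),    # group near (8.5, 8.5)
          (5, 0), (5.5, 0.5), (4.5, 1)]  # group near (5, 0.5)

def kmeans(points, centroids, iterations=10):
    clusters = []
    for _ in range(iterations):
        # Assignment step: each item goes to its closest centroid.
        clusters = [[] for _ in centroids]
        for p in points:
            nearest = min(range(len(centroids)),
                          key=lambda i: math.dist(p, centroids[i]))
            clusters[nearest].append(p)
        # Update step: move each centroid to the center of mass of its cluster.
        centroids = [(sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
                     if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

centroids, clusters = kmeans(points, [(0, 0), (10, 10), (5, 1)])
print(centroids)
```

A fixed iteration count stands in for the "centroids stopped moving" convergence test of slide 41.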
- 43. Machine Learning: Clustering Hands on
- 44. Machine Learning: Neural networks Neural networks "A logical calculus of the ideas immanent in nervous activity", McCulloch and Pitts, 1943
- 45. Machine Learning: Neural networks Neural networks Feedforward Perceptron
- 46. Machine Learning: Neural networks Neural networks Logic operators with neural networks (threshold = 0): X0 = −10, −10, −10, −10; X1 = 0, 0, 20, 20; X2 = 0, 20, 0, 20; Σ = −10, 10, 10, 30; Result = 0, 1, 1, 1 → the OR operator
- 47. Machine Learning: Neural networks Neural networks Logic operators with neural networks (threshold = 0): X0 = −30, −30, −30, −30; X1 = 0, 0, 20, 20; X2 = 0, 20, 0, 20; Σ = ?; Result = ? → which operator?
- 48. Machine Learning: Neural networks Neural networks Logic operators with neural networks (threshold = 0): X0 = −30, −30, −30, −30; X1 = 0, 0, 20, 20; X2 = 0, 20, 0, 20; Σ = −30, −10, −10, 10; Result = 0, 0, 0, 1 → the AND operator
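The two weight tables above as code: a single neuron with a step activation at threshold 0, where X0 is the always-on bias input:

```python
def neuron(w0, w1, w2, threshold=0):
    """Single neuron: w0 weighs the always-on bias input X0."""
    def fire(x1, x2):
        s = w0 + w1 * x1 + w2 * x2
        return 1 if s > threshold else 0
    return fire

or_gate = neuron(-10, 20, 20)   # the weights from slide 46
and_gate = neuron(-30, 20, 20)  # the weights from slide 48

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, or_gate(x1, x2), and_gate(x1, x2))
```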
- 49. Machine Learning: Neural networks Hands on
- 50. Machine Learning: Neural networks Neural networks Backpropagation Phase 1: Propagation - forward propagation of a training pattern's input through the neural network, in order to generate the propagation's output activations - backward propagation of the propagation's output activations through the neural network, using the training pattern's target, in order to generate the deltas of all output and hidden neurons Phase 2: Weight update - for each weight, multiply its output delta and input activation to get the gradient of the weight - bring the weight in the opposite direction of the gradient by subtracting a ratio of it (the learning rate) from the weight
- 51. Machine Learning: Neural networks Neural networks Multilayer perceptrons
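The two backpropagation phases of slide 50 can be sketched on a small multilayer perceptron: a 2-2-1 sigmoid network trained on XOR in pure Python. The layer sizes, learning rate, and epoch count are illustrative choices, not values from the talk:

```python
import math
import random

random.seed(42)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Weights: two hidden neurons (bias + 2 inputs each), one output neuron (bias + 2 inputs).
w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_o = [random.uniform(-1, 1) for _ in range(3)]

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # XOR

def forward(x):
    h = [sigmoid(w[0] + w[1] * x[0] + w[2] * x[1]) for w in w_h]
    o = sigmoid(w_o[0] + w_o[1] * h[0] + w_o[2] * h[1])
    return h, o

def train(epochs=5000, alpha=0.5):
    for _ in range(epochs):
        for x, target in data:
            h, o = forward(x)                    # phase 1: forward propagation
            delta_o = (o - target) * o * (1 - o) # phase 1: output delta
            deltas_h = [delta_o * w_o[i + 1] * h[i] * (1 - h[i]) for i in range(2)]
            # Phase 2: move each weight against its gradient (delta * input activation).
            w_o[0] -= alpha * delta_o
            w_o[1] -= alpha * delta_o * h[0]
            w_o[2] -= alpha * delta_o * h[1]
            for i in range(2):
                w_h[i][0] -= alpha * deltas_h[i]
                w_h[i][1] -= alpha * deltas_h[i] * x[0]
                w_h[i][2] -= alpha * deltas_h[i] * x[1]

def total_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

before = total_error()
train()
after = total_error()
print(before > after)  # training should reduce the squared error
```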
- 52. Machine Learning: Neural networks Hands on
- 53. Machine Learning: Genetic algorithms Genetic algorithms A GA is a programming technique that mimics biological evolution as a problem-solving strategy. Steps - map the variables of the problem into a sequence of bits, a chromosome - create a random population of chromosomes - let the population evolve according to evolution laws: the higher the fitness, the higher the chance of breeding; crossover of chromosomes; mutation in chromosomes - stop the process when an optimal solution is found, or after n steps
- 54. Machine Learning: Genetic algorithms Genetic algorithms Mutation Crossover
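The steps and operators above on a toy problem: evolving 16-bit chromosomes whose fitness is simply the number of 1 bits (a stand-in for a real problem encoding). The population size, mutation rate, and roulette-wheel selection are illustrative choices:

```python
import random

random.seed(7)
BITS, POP, GENERATIONS = 16, 30, 60

def fitness(chromosome):
    return sum(chromosome)  # toy fitness: count the 1 bits

def select(population):
    # Higher fitness -> higher chance of breeding (roulette-wheel selection).
    return random.choices(population, weights=[fitness(c) + 1 for c in population])[0]

def crossover(a, b):
    point = random.randrange(1, BITS)  # single-point crossover
    return a[:point] + b[point:]

def mutate(chromosome, rate=0.02):
    return [bit ^ 1 if random.random() < rate else bit for bit in chromosome]

# Random starting population, then evolve until the optimum or n generations.
population = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP)]
for _ in range(GENERATIONS):
    population = [mutate(crossover(select(population), select(population)))
                  for _ in range(POP)]
    if any(fitness(c) == BITS for c in population):  # optimal solution found
        break

best = max(population, key=fitness)
print(fitness(best))
```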
- 55. Machine Learning: Genetic algorithms Hands on
- 56. Machine Learning Thanks! The code is available on: https://github.com/andreaiacono/MachineLearning
