1. Introduction to machine learning
https://www.springboard.com/blog/machine-learning-engineering/
[Figure: a GAN-generated image, captioned "This is not a real cup of coffee" — labels: RNN, GAN]
Luc Lesoil
Sessions: 12/03/2020, 30/04/2020, 14/05/2020, 20/05/2020
Applications:
● DeepMind
● Self-driving cars
● Cloud optimization
● Healthcare & medical image analysis
4. I- Supervised learning
X : explanatory variables (features)
y : variable to predict (labels are known)
Example: X = images of handwritten digits, y = the digits they represent.
Supervised learning = use X to predict y.
Train the model: learn the mapping X → y on one part of the data.
Test the model: predict ŷ, the estimates of y, on the held-out part, then compare ŷ with the real y values.
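The train/test workflow above can be sketched in a few lines. This is a minimal illustration, not from the slides; it assumes scikit-learn is available and uses its bundled handwritten-digits dataset, with logistic regression as an arbitrary choice of model.

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# X: 8x8 digit images flattened to 64 features; y: the digit labels
X, y = load_digits(return_X_y=True)

# Train the model on one part of the data, test it on the rest
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)           # learn X -> y

y_hat = model.predict(X_test)         # predict y-hat, estimates of y
print(accuracy_score(y_test, y_hat))  # compare y-hat with the real y values
```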
5. Loss function
Minimize the loss function = better predictions.
Example: today's Tux happiness is 3.2. Prediction 1 says 3; prediction 2 says 0.5. How do we compare the quality of these predictions? With a loss function, which measures how far each prediction is from the true value.
https://towardsdatascience.com/common-loss-functions-in-machine-learning-46af0ffc4d23
https://algorithmia.com/blog/introduction-to-loss-functions
Examples: MAE, MSE, MAPE, Minkowski, cross-entropy, hinge
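The Tux-happiness example can be made concrete with the first two losses above. MAE and MSE are written out by hand here; the numbers (truth 3.2, predictions 3 and 0.5) come from the slide.

```python
import numpy as np

def mae(y, y_hat):
    """Mean absolute error: average |y - y_hat|."""
    return float(np.mean(np.abs(np.asarray(y) - np.asarray(y_hat))))

def mse(y, y_hat):
    """Mean squared error: average (y - y_hat)^2."""
    return float(np.mean((np.asarray(y) - np.asarray(y_hat)) ** 2))

y_true = [3.2]    # today's Tux happiness
pred_1 = [3.0]
pred_2 = [0.5]

# The smaller the loss, the better the prediction: prediction 1 wins
print(mae(y_true, pred_1), mae(y_true, pred_2))
print(mse(y_true, pred_1), mse(y_true, pred_2))
```

Note how MSE punishes the large error of prediction 2 much more harshly than MAE does, which is the practical difference between the two.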
6. Supervised learning: the type of y sets the task
Quantitative y → Regression (predict a value)
● Discrete (int): number of occurrences
● Continuous (float): time, temperature
Qualitative y → Classification (predict a group or category)
● Nominal (str): colour, species
● Ordinal (str): ordered categories
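The split above maps directly onto two model families. This hedged sketch (toy data invented for illustration, not from the slides) contrasts a regressor predicting a continuous temperature with a classifier predicting a species label:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

X = np.array([[0.0], [1.0], [2.0], [3.0]])

# Quantitative y (floats, e.g. temperature) -> regression: predict a value
y_temp = np.array([10.0, 12.1, 13.9, 16.0])
reg = LinearRegression().fit(X, y_temp)
print(reg.predict([[4.0]]))      # a continuous value, roughly 18

# Qualitative y (categories, e.g. species) -> classification: predict a group
y_species = np.array(["cat", "cat", "dog", "dog"])
clf = LogisticRegression().fit(X, y_species)
print(clf.predict([[4.0]]))      # a category label
```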
10. Boosting tree
● Complex datasets
● Many hyperparameters
● XGBoost: the algorithm that wins every competition
Update the trees based on previous results: each new tree is fitted to correct the errors of the trees built so far (AdaBoost, XGBoost).
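The "update the trees based on previous results" idea can be tried out without the XGBoost library. As an assumption not made on the slide, this sketch uses scikit-learn's AdaBoostClassifier (AdaBoost is named above) on scikit-learn's bundled breast-cancer dataset:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Each new weak tree is weighted toward the samples the previous
# trees misclassified; the ensemble then combines their votes.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

boosted = AdaBoostClassifier(n_estimators=100, random_state=0)
boosted.fit(X_train, y_train)
print(boosted.score(X_test, y_test))
```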
16. K-Nearest neighbors
● Used in recommendation systems
● Supervised
● "You are the average of the five people you spend the most time with"
[Figure: digit images with distances 0.6, 2.2 and 4.7 — 1 is the nearest neighbor of 3, and 0 is the second nearest]
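The nearest-neighbor idea on digits can be run directly. This is a minimal sketch (not from the slides) using scikit-learn's KNeighborsClassifier on its bundled digits dataset, with k = 5 echoing the quote above:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Each test image is labeled by a vote among its 5 nearest
# training images ("the five people you spend the most time with")
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
print(knn.score(X_test, y_test))
```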