@SrcMinistry @MariuszGil
Machine Learning for Developers
Data processing
My story
Data classification
Bot detection
Minimize
risk of error
Predictions
Click probability
Maximize
CTR or eCPM
A lot of data
data + algo = result
Real problem
+ value estimator
+ chance of sale
+ $ optimization
Tens of thousands of
historical transactions
Tens of data
components
Hundreds of
data components
HOW?
Machine Learning
Theory
A computer program is said to learn from experience E
with respect to some class of tasks T and performance
measure P if its performance at tasks in T, as measured
by P, improves with experience E
Tom M. Mitchell
Task
Typical ML techniques
Classification
Regression
Clustering
Dimensionality reduction
Association learning
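To make the classification task concrete, here is a minimal sketch in plain Python: a 1-nearest-neighbor classifier over made-up two-feature points (the data and labels are illustrative only, not from the talk).

```python
import math

# Toy training set: ((feature 1, feature 2), label) pairs - invented values.
train = [((1.0, 1.0), "a"), ((1.2, 0.9), "a"),
         ((5.0, 5.1), "b"), ((4.8, 5.3), "b")]

def classify_1nn(train, query):
    """Assign `query` the label of its closest training point (1-NN)."""
    nearest = min(train, key=lambda point: math.dist(point[0], query))
    return nearest[1]

print(classify_1nn(train, (1.1, 1.0)))
print(classify_1nn(train, (4.9, 5.2)))
```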
[Scatter plot of sample points, feature 1 vs. feature 2, repeated across three slides]
Experience
Typical ML paradigms
Supervised learning
Unsupervised learning
Reinforcement learning
Accuracy
[Venn diagram of data science skills: Hacking Skills, Math & Statistics Knowledge, Substantive Expertise; overlaps labeled Machine Learning, Traditional Research, and Danger Zone!; Data Science at the center]
[Parody Venn diagram adding a fourth circle, Evil; overlaps labeled Outside Committee Member, Not That Dangerous (in Retrospect), Machine Learning, James Bond Villain, NSA, That Guy Who Stole Your Online Identity, Thesis Advisor, Grad School Mate; Data Science at the center]
Machine Learning
Practice
data + algo = result
+-------+--------+------+---------+---------+-------+
| brand | model  | year | mileage | service | price |
+-------+--------+------+---------+---------+-------+
| ford  | mondeo | 2005 |  123000 |    9900 | 67000 |
+-------+--------+------+---------+---------+-------+
| ford  | mondeo | 2005 |  175000 |    9900 | 30000 |
+-------+--------+------+---------+---------+-------+
| ford  | focus  | 2010 |   45000 |    6700 | 30000 |
+-------+--------+------+---------+---------+-------+
…
Learning Data → Learning Algorithm → Classifier Model
Real Data → Classifier Model → Classification
Failure recipe
+-------+--------+------+---------+---------+-------+
| brand | model  | year | mileage | service | price |
+-------+--------+------+---------+---------+-------+
| ford  | mondeo | 2005 |  123000 |    9900 | 67000 |
+-------+--------+------+---------+---------+-------+
| ford  | mondeo | 2005 |  175000 |    9900 | 30000 |
+-------+--------+------+---------+---------+-------+
| ford  | focus  | 2010 |   45000 |    6700 | 30000 |
+-------+--------+------+---------+---------+-------+
…
+-------+--------+------+---------+---------+--------+-------+
| brand | model  | year | mileage | service | repair | price |
+-------+--------+------+---------+---------+--------+-------+
| ford  | mondeo | 2005 |  123000 |    9000 |    900 | 67000 |
+-------+--------+------+---------+---------+--------+-------+
| ford  | mondeo | 2005 |  175000 |     900 |   9000 | 30000 |
+-------+--------+------+---------+---------+--------+-------+
| ford  | focus  | 2010 |   45000 |    3700 |   3000 | 30000 |
+-------+--------+------+---------+---------+--------+-------+
…
+-------+--------+------+---------+---------+--------+-------+
| brand | model  | year | mileage | service | repair | price |
+-------+--------+------+---------+---------+--------+-------+
| ford  | mondeo | 2005 |  123000 |    9000 |    900 | 67000 |
+-------+--------+------+---------+---------+--------+-------+
| ford  | mondeo | 2005 |  175000 |     900 |   9000 | 30000 |
+-------+--------+------+---------+---------+--------+-------+
| ford  | mondeo | 2005 |  175000 |     900 |   9000 | 45000 |
+-------+--------+------+---------+---------+--------+-------+
| ford  | focus  | 2010 |   45000 |    3700 |   3000 | 30000 |
+-------+--------+------+---------+---------+--------+-------+
…
+-------+--------+-----+------+---------+---------+--------+-------+
| brand | model  | gen | year | mileage | service | repair | price |
+-------+--------+-----+------+---------+---------+--------+-------+
| ford  | mondeo |   4 | 2005 |  123000 |    9000 |    900 | 67000 |
+-------+--------+-----+------+---------+---------+--------+-------+
| ford  | mondeo |   3 | 2005 |  175000 |     900 |   9000 | 30000 |
+-------+--------+-----+------+---------+---------+--------+-------+
| ford  | mondeo |   4 | 2005 |  175000 |     900 |   9000 | 45000 |
+-------+--------+-----+------+---------+---------+--------+-------+
| ford  | focus  |   4 | 2010 |   45000 |    3700 |   3000 | 30000 |
+-------+--------+-----+------+---------+---------+--------+-------+
…
+-------+--------+-----+------+---------+---------+--------+------+---------------+-------+
| brand | model  | gen | year | mileage | service | repair | igla | crying German | price |
+-------+--------+-----+------+---------+---------+--------+------+---------------+-------+
| ford  | mondeo |   4 | 2005 |  123000 |    9000 |    900 |    0 |             0 | 67000 |
+-------+--------+-----+------+---------+---------+--------+------+---------------+-------+
| ford  | mondeo |   3 | 2005 |  175000 |     900 |   9000 |    1 |             1 | 30000 |
+-------+--------+-----+------+---------+---------+--------+------+---------------+-------+
| ford  | mondeo |   4 | 2005 |  175000 |     900 |   9000 |    0 |             0 | 45000 |
+-------+--------+-----+------+---------+---------+--------+------+---------------+-------+
| ford  | focus  |   4 | 2010 |   45000 |    3700 |   3000 |    1 |             0 | 30000 |
+-------+--------+-----+------+---------+---------+--------+------+---------------+-------+
…
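The tables above set up a regression task: predict price from the other columns. As a hedged sketch (plain Python, a single feature, numbers loosely echoing the mondeo rows), ordinary least squares on mileage alone:

```python
def fit_line(xs, ys):
    """Ordinary least squares fit of y = a*x + b for a single feature."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Hypothetical (mileage, price) pairs in the spirit of the table above.
mileage = [123000, 150000, 175000]
price = [67000, 45000, 30000]

slope, intercept = fit_line(mileage, price)
# Higher mileage should predict a lower price, so the slope comes out negative.
```

A real model would of course use all the columns, which is exactly why the later slides keep adding features to the table.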
Understand your
data first
Exploratory
analysis
ML pipeline
Raw Data Collection → Pre-processing (Missing Data, Feature Extraction) → Sampling → Data Split
Training Dataset → Pre-processing (Feature Selection, Feature Scaling, Dimensionality Reduction) → Algorithm Training (Optimization: Performance Metrics, Model Selection, Cross-Validation) → Post-processing → Final Model
Test Dataset → Final Model Evaluation
New Data → Pre-processing → Final Model → Classification
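The pipeline stages compose naturally as functions applied in order. A minimal plain-Python sketch (the stage functions and the rows are hypothetical, stand-ins for the real pre-processing steps):

```python
def drop_missing(rows):
    """Pre-processing: remove rows with missing values."""
    return [r for r in rows if None not in r]

def scale(rows):
    """Feature scaling: min-max scale each column to [0, 1]."""
    cols = list(zip(*rows))
    spans = [(min(c), (max(c) - min(c)) or 1) for c in cols]
    return [tuple((v - lo) / span for v, (lo, span) in zip(r, spans))
            for r in rows]

def run_pipeline(stages, rows):
    """Thread the data through each stage in order."""
    for stage in stages:
        rows = stage(rows)
    return rows

raw = [(2005, 123000), (2005, None), (2010, 45000)]
clean = run_pipeline([drop_missing, scale], raw)
```

The point of the diagram, mirrored here, is that the model at the end only ever sees the output of the whole chain.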
Classification algorithms
Linear Classification
Logistic Regression
Linear Discriminant Analysis
PLS Discriminant Analysis
Non-Linear Classification
Mixture Discriminant Analysis
Quadratic Discriminant Analysis
Regularized Discriminant Analysis
Neural Networks
Flexible Discriminant Analysis
Support Vector Machines
k-Nearest Neighbor
Naive Bayes
Decision Trees for Classification
Classification and Regression Trees
C4.5
PART
Bagging CART
Random Forest
Gradient Boosted Machines
Boosted C5.0
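To make the tree-based entries in the list concrete: a depth-one decision tree (a "stump") is the simplest case. A hedged sketch in plain Python, with invented, perfectly separable data:

```python
def best_stump(points):
    """Find the threshold on one feature that best splits classes "a" and "b".

    points: list of (feature_value, label) pairs; returns (threshold, errors).
    """
    values = sorted(v for v, _ in points)
    # Candidate thresholds: midpoints between consecutive feature values.
    candidates = [(a + b) / 2 for a, b in zip(values, values[1:])]
    best = None
    for t in candidates:
        # Predict "b" above the threshold, "a" below it; count mistakes.
        errors = sum((v > t) != (label == "b") for v, label in points)
        if best is None or errors < best[1]:
            best = (t, errors)
    return best

data = [(1.4, "a"), (1.3, "a"), (5.7, "b"), (5.2, "b"), (1.5, "a")]
threshold, errors = best_stump(data)
```

CART, Random Forests, and boosted trees all build on this split-search idea, recursively or over ensembles.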
Regression algorithms
Linear Regression
Ordinary Least Squares Regression
Stepwise Linear Regression
Principal Component Regression
Partial Least Squares Regression
Non-Linear Regression /
Penalized Regression
Ridge Regression
Least Absolute Shrinkage and Selection Operator (LASSO)
ElasticNet
Multivariate Adaptive Regression
Support Vector Machines
k-Nearest Neighbor
Neural Network
Decision Trees for Regression
Classification and Regression Trees
Conditional Decision Tree
Rule System
Bagging CART
Random Forest
Gradient Boosted Machine
Cubist
The algorithm is only one
element in the ML chain
Demo #1
> data(iris)
> head(iris)
Sepal.Length Sepal.Width Petal.Length Petal.Width Species
1 5.1 3.5 1.4 0.2 setosa
2 4.9 3.0 1.4 0.2 setosa
3 4.7 3.2 1.3 0.2 setosa
4 4.6 3.1 1.5 0.2 setosa
5 5.0 3.6 1.4 0.2 setosa
6 5.4 3.9 1.7 0.4 setosa
> tail(iris)
Sepal.Length Sepal.Width Petal.Length Petal.Width Species
145 6.7 3.3 5.7 2.5 virginica
146 6.7 3.0 5.2 2.3 virginica
147 6.3 2.5 5.0 1.9 virginica
148 6.5 3.0 5.2 2.0 virginica
149 6.2 3.4 5.4 2.3 virginica
150 5.9 3.0 5.1 1.8 virginica
> plot(iris[,1:4])
> library(mclust)
> class = iris$Species
> mod2 = MclustDA(iris[,1:4], class, modelType = "EDDA")
> table(class)
class
setosa versicolor virginica
50 50 50
> summary(mod2)
------------------------------------------------
Gaussian finite mixture model for classification
------------------------------------------------
EDDA model summary:
log.likelihood n df BIC
-187.7097 150 36 -555.8024
Classes n Model G
setosa 50 VEV 1
versicolor 50 VEV 1
virginica 50 VEV 1
Training classification summary:
Predicted
Class setosa versicolor virginica
setosa 50 0 0
versicolor 0 47 3
virginica 0 0 50
Training error = 0.02
> plot(mod2, what = "scatterplot")
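The reported training error follows directly from the confusion matrix above: 3 of the 150 points (versicolor predicted as virginica) are misclassified. A quick check in Python:

```python
# Confusion matrix from summary(mod2): rows = true class, cols = predicted.
confusion = [
    [50, 0, 0],   # setosa
    [0, 47, 3],   # versicolor
    [0, 0, 50],   # virginica
]

correct = sum(confusion[i][i] for i in range(3))
total = sum(sum(row) for row in confusion)
training_error = 1 - correct / total  # 3 misclassified out of 150 = 0.02
```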
Demo #2
> head(titanic.raw)
Class Sex Age Survived
1 3rd Male Child No
2 3rd Male Child No
3 3rd Male Child No
4 3rd Male Child No
5 3rd Male Child No
6 3rd Male Child No
> tail(titanic.raw)
Class Sex Age Survived
2196 Crew Female Adult Yes
2197 Crew Female Adult Yes
2198 Crew Female Adult Yes
2199 Crew Female Adult Yes
2200 Crew Female Adult Yes
2201 Crew Female Adult Yes
> summary(titanic.raw)
Class Sex Age Survived
1st :325 Female: 470 Adult:2092 No :1490
2nd :285 Male :1731 Child: 109 Yes: 711
3rd :706
Crew:885
> library(arules)
Loading required package: Matrix
Attaching package: ‘arules’
> rules <- apriori(titanic.raw)
Apriori
Parameter specification:
confidence minval smax arem aval originalSupport support minlen maxlen target ext
0.8 0.1 1 none FALSE TRUE 0.1 1 10 rules FALSE
Algorithmic control:
filter tree heap memopt load sort verbose
0.1 TRUE TRUE FALSE TRUE 2 TRUE
Absolute minimum support count: 220
set item appearances ...[0 item(s)] done [0.00s].
set transactions ...[10 item(s), 2201 transaction(s)] done [0.00s].
sorting and recoding items ... [9 item(s)] done [0.00s].
creating transaction tree ... done [0.00s].
checking subsets of size 1 2 3 4 done [0.00s].
writing ... [27 rule(s)] done [0.00s].
creating S4 object ... done [0.00s].
> rules <- apriori(titanic.raw,
+ parameter = list(minlen=2, supp=0.005, conf=0.8),
+ appearance = list(rhs=c("Survived=No", "Survived=Yes"),
+ default="lhs"),
+ control = list(verbose=F))
> rules.sorted <- sort(rules, by="lift")
> subset.matrix <- is.subset(rules.sorted, rules.sorted)
> subset.matrix[lower.tri(subset.matrix, diag=T)] <- NA
> redundant <- colSums(subset.matrix, na.rm=T) >= 1
> which(redundant)
{Class=2nd,Sex=Female,Age=Child,Survived=Yes} 2
{Class=1st,Sex=Female,Age=Adult,Survived=Yes} 4
{Class=Crew,Sex=Female,Age=Adult,Survived=Yes} 7
{Class=2nd,Sex=Female,Age=Adult,Survived=Yes} 8
> rules.pruned <- rules.sorted[!redundant]
> inspect(rules.pruned)
lhs rhs support confidence lift
1 {Class=2nd,Age=Child} => {Survived=Yes} 0.010904134 1.0000000 3.095640
4 {Class=1st,Sex=Female} => {Survived=Yes} 0.064061790 0.9724138 3.010243
2 {Class=2nd,Sex=Female} => {Survived=Yes} 0.042253521 0.8773585 2.715986
5 {Class=Crew,Sex=Female} => {Survived=Yes} 0.009086779 0.8695652 2.691861
9 {Class=2nd,Sex=Male,Age=Adult} => {Survived=No} 0.069968196 0.9166667 1.354083
3 {Class=2nd,Sex=Male} => {Survived=No} 0.069968196 0.8603352 1.270871
12 {Class=3rd,Sex=Male,Age=Adult} => {Survived=No} 0.175829169 0.8376623 1.237379
6 {Class=3rd,Sex=Male} => {Survived=No} 0.191731031 0.8274510 1.222295
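The lift column is just confidence divided by the overall frequency of the rule's right-hand side. A quick check of rule 1 in Python, using the counts from summary(titanic.raw):

```python
# {Class=2nd,Age=Child} => {Survived=Yes}: lift = confidence / P(rhs).
survivors, total = 711, 2201   # Survived=Yes count and total passengers
confidence = 1.0               # from inspect(rules.pruned), rule 1

lift = confidence / (survivors / total)  # matches the reported 3.095640
```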
Applications
Tools
Benefits
& Problems
[Scatter plot of sample points, feature 1 vs. feature 2, with two additional points]
Does it do well on the training data?
  No → Better features / better parameters
  Yes → Does it do well on the test data?
    No → More data
    Yes → Done!
(by Andrew Ng)
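The decision loop above can be written down directly (a sketch only; the two booleans stand in for comparing training and test performance against whatever target you set):

```python
def next_action(fits_training, fits_test):
    """Andrew Ng's debugging loop: check training fit before generalization."""
    if not fits_training:
        # Underfitting: the model cannot even explain the data it has seen.
        return "better features / better parameters"
    if not fits_test:
        # Overfitting: good on training data, poor on unseen data.
        return "more data"
    return "done"
```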
Understand your
needs first
Tools will change
Ideas are immortal
@SrcMinistry
Thanks!
@MariuszGil
