Support Vector Machines
Legal Notices and Disclaimers
This presentation is for informational purposes only. INTEL MAKES NO WARRANTIES,
EXPRESS OR IMPLIED, IN THIS SUMMARY.
Intel technologies’ features and benefits depend on system configuration and may require
enabled hardware, software or service activation. Performance varies depending on system
configuration. Check with your system manufacturer or retailer or learn more at intel.com.
This sample source code is released under the Intel Sample Source Code License Agreement.
Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.
*Other names and brands may be claimed as the property of others.
Copyright © 2017, Intel Corporation. All rights reserved.
Relationship to Logistic Regression
[Figure: patient status after five years (Survived: 0.0, Lost: 1.0) vs. number of positive nodes, with a logistic curve crossing the 0.5 threshold]

$y_\beta(x) = \dfrac{1}{1 + e^{-(\beta_0 + \beta_1 x + \varepsilon)}}$
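As a quick illustration of the logistic model above, here is a minimal NumPy sketch; the coefficient values are made up for the example, not taken from the slides.

import numpy as np

def logistic(x, beta0=-4.0, beta1=1.0):
    # y(x) = 1 / (1 + exp(-(beta0 + beta1 * x))); coefficients are illustrative only
    return 1.0 / (1.0 + np.exp(-(beta0 + beta1 * x)))

# The 0.5 threshold falls where beta0 + beta1 * x = 0 (here at x = 4)
print(logistic(np.array([0.0, 2.0, 4.0, 6.0, 8.0])))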
Support Vector Machines (SVM)
[Figure: the same patient data with a series of candidate decision thresholds]
Three misclassifications
Two misclassifications
No misclassifications
No misclassifications—but is this the best position?
Maximize the region between classes
Similarity Between Logistic Regression and SVM
[Figure: both models place a decision threshold at 0.5 on the same patient data]
Classification with SVMs
[Figure: age vs. number of malignant nodes for the two classes]
Two features (nodes, age)
Two labels (survived, lost)
Find the line that best separates the classes, and include the largest boundary possible.
Logistic Regression vs SVM Cost Functions
[Figure: cost vs. model output for the lost class, over outputs from -3 to 3; the logistic regression cost function decays smoothly, while the SVM cost function is exactly zero beyond the margin and grows linearly inside it]
The SVM Cost Function
[Figure: the hinge-shaped SVM cost function for the lost class, applied point by point to the patient data]
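A minimal NumPy sketch of the two costs compared above, for the positive ("lost") class; the range of model outputs mirrors the -3 to 3 axis on the slides.

import numpy as np

f = np.linspace(-3, 3, 61)              # signed model output for the lost class
logistic_cost = np.log(1 + np.exp(-f))  # logistic regression (log) loss: smooth, never zero
hinge_cost = np.maximum(0, 1 - f)       # SVM hinge loss: exactly zero once f >= 1
print(logistic_cost[::10])
print(hinge_cost[::10])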
Outlier Sensitivity in SVMs
[Figure: adding a single outlier drags the maximum-margin boundary away from its original position]
This is probably still the correct boundary.
Regularization in SVMs
[Figure: two candidate boundaries on the data with the outlier]

$J(\beta_i) = \mathrm{SVMCost}(\beta_i) + \frac{1}{C}\sum_i \beta_i^2$

The best-fit boundary that bends to accommodate the outlier makes the regularization term large. Accepting a slightly higher SVM cost buys a much smaller regularization term.
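A minimal sketch of how C controls that trade-off, on synthetic data; the dataset and C values are illustrative, not from the slides.

import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import LinearSVC

X, y = make_blobs(n_samples=100, centers=2, cluster_std=2.0, random_state=0)
for C in (0.01, 1.0, 100.0):
    model = LinearSVC(C=C, max_iter=10000).fit(X, y)
    # Smaller C -> heavier 1/C penalty -> smaller coefficients -> wider margin
    print(C, np.linalg.norm(model.coef_))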
Interpretation of SVM Coefficients

$J(\beta_i) = \mathrm{SVMCost}(\beta_i) + \frac{1}{C}\sum_i \beta_i^2$

[Figure: the coefficient vector (β₁, β₂, β₃) drawn orthogonal to the separating hyperplane]
The coefficients form a vector orthogonal to the hyperplane.
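A small sketch verifying that interpretation, on an illustrative dataset: any direction lying within the fitted hyperplane is perpendicular to coef_.

import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import LinearSVC

X, y = make_blobs(n_samples=100, centers=2, random_state=0)
model = LinearSVC(max_iter=10000).fit(X, y)
w, b = model.coef_[0], model.intercept_[0]

# Two points on the hyperplane w . x + b = 0, found by solving for x2 given x1
p1 = np.array([0.0, -b / w[1]])
p2 = np.array([1.0, -(b + w[0]) / w[1]])
print(np.dot(w, p2 - p1))  # ~0: the coefficient vector is orthogonal to the hyperplane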
Linear SVM: The Syntax

Import the class containing the classification method
from sklearn.svm import LinearSVC

Create an instance of the class
LinSVC = LinearSVC(penalty='l2', C=10.0)  # penalty and C are the regularization parameters

Fit the instance on the data and then predict the expected value
LinSVC = LinSVC.fit(X_train, y_train)
y_predict = LinSVC.predict(X_test)

Tune regularization parameters with cross-validation.
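As a concrete, runnable version of the recipe above, here is a minimal sketch on scikit-learn's built-in breast cancer dataset; the dataset choice, scaling step, and C value are illustrative, not from the slides.

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# SVMs are sensitive to feature scale, so standardize first
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

LinSVC = LinearSVC(penalty='l2', C=10.0, max_iter=10000).fit(X_train, y_train)
y_predict = LinSVC.predict(X_test)
print(accuracy_score(y_test, y_predict))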
Kernels
Classification with SVMs
[Figure: age vs. number of malignant nodes]
Non-Linear Decision Boundaries with SVM
Non-linear data can be made linearly separable by mapping it into a higher-dimensional space with a function φ.

The Kernel Trick
Transform the data with φ so it is linearly separable.
SVM Gaussian Kernel
[Figure: IMDB user rating vs. budget; Palme d'Or winners at Cannes form one class]

Approach 1: Create higher-order features to transform the data: Budget², Rating², Budget × Rating, …

Approach 2: Transform the space to a different coordinate system.

Define Feature 1: similarity to "Pulp Fiction"
Define Feature 2: similarity to "Black Swan"
Define Feature 3: similarity to "Transformers"
Create a Gaussian function at each landmark feature:

$a_1(x^{obs}) = \exp\left(-\frac{(x_i^{obs} - x_i^{\text{Pulp Fiction}})^2}{2\sigma^2}\right)$

$a_2(x^{obs}) = \exp\left(-\frac{(x_i^{obs} - x_i^{\text{Black Swan}})^2}{2\sigma^2}\right)$

$a_3(x^{obs}) = \exp\left(-\frac{(x_i^{obs} - x_i^{\text{Transformers}})^2}{2\sigma^2}\right)$
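A minimal NumPy sketch of these similarity features; the (budget, rating) coordinates for the three landmark films are hypothetical, chosen only to make the example run.

import numpy as np

def rbf_similarity(x_obs, landmark, sigma=1.0):
    # Gaussian similarity between an observation and a landmark film
    return np.exp(-np.sum((x_obs - landmark) ** 2) / (2 * sigma ** 2))

# Hypothetical (budget, rating) coordinates for the landmark films
pulp_fiction = np.array([0.1, 0.9])
black_swan = np.array([0.2, 0.8])
transformers = np.array([0.9, 0.4])

x = np.array([0.15, 0.85])  # a new film to score
a = [rbf_similarity(x, lm) for lm in (pulp_fiction, black_swan, transformers)]
print(a)  # [a1, a2, a3]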
SVM Gaussian Kernel
Transformation: [x1, x2] → [0.7a1, 0.9a2, -0.6a3]
[Figure: example films mapped into the (a1, a2, a3) similarity space; one point lands at a1=0.90, a2=0.92, a3=0.30, another at a1=0.50, a2=0.60, a3=0.70]
Classification in the New Space
[Figure: axes a1 (Pulp Fiction), a2 (Black Swan), a3 (Transformers); the classes can be separated linearly in this space]
Transformation: [x1, x2] → [0.7a1, 0.9a2, -0.6a3]
SVM Gaussian Kernel
[Figure: back in the budget/rating space, the resulting decision boundary around the Palme d'Or winners is non-linear]
Radial Basis Function (RBF) Kernel
SVMs with Kernels: The Syntax

Import the class containing the classification method
from sklearn.svm import SVC

Create an instance of the class
rbfSVC = SVC(kernel='rbf', gamma=1.0, C=10.0)
# kernel and gamma set the kernel and its associated coefficient;
# C is the penalty associated with the error term

Fit the instance on the data and then predict the expected value
rbfSVC = rbfSVC.fit(X_train, y_train)
y_predict = rbfSVC.predict(X_test)

Tune kernel and associated parameters with cross-validation.
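A minimal sketch of that cross-validated tuning, reusing the X_train/y_train from the earlier LinearSVC sketch; the grid values are illustrative.

from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

param_grid = {'gamma': [0.01, 0.1, 1.0], 'C': [0.1, 1.0, 10.0]}
search = GridSearchCV(SVC(kernel='rbf'), param_grid, cv=5)
search = search.fit(X_train, y_train)
print(search.best_params_)
y_predict = search.predict(X_test)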
Feature Overload

Problem: SVMs with RBF kernels are very slow to train with lots of features or data.

Solution: Construct an approximate kernel map with Nystroem or the RBF sampler, then fit a linear classifier with SGD.
Faster Kernel Transformations: The Syntax

Import the class containing the transformation method
from sklearn.kernel_approximation import Nystroem

Create an instance of the class
nystroemSVC = Nystroem(kernel='rbf', gamma=1.0, n_components=100)
# multiple non-linear kernels can be used; kernel and gamma are identical to SVC;
# n_components is the number of samples used to construct the mapping

Fit the instance on the data and transform
X_train = nystroemSVC.fit_transform(X_train)
X_test = nystroemSVC.transform(X_test)

Tune kernel parameters and components with cross-validation.
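A minimal sketch of the full "approximate kernel map + linear classifier" recipe, assuming X_train/y_train as before; SGDClassifier with hinge loss is a linear SVM trained by stochastic gradient descent.

from sklearn.pipeline import make_pipeline
from sklearn.kernel_approximation import Nystroem
from sklearn.linear_model import SGDClassifier

model = make_pipeline(
    Nystroem(kernel='rbf', gamma=1.0, n_components=100),  # approximate kernel map
    SGDClassifier(loss='hinge'),                          # linear SVM via SGD
)
model = model.fit(X_train, y_train)
y_predict = model.predict(X_test)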
Faster Kernel Transformations: The Syntax

Import the class containing the transformation method
from sklearn.kernel_approximation import RBFSampler

Create an instance of the class
rbfSample = RBFSampler(gamma=1.0, n_components=100)
# RBF is the only kernel that can be used; parameter names are identical to Nystroem's

Fit the instance on the data and transform
X_train = rbfSample.fit_transform(X_train)
X_test = rbfSample.transform(X_test)

Tune kernel parameters and components with cross-validation.
When to Use Logistic Regression vs SVC

Features               Data                  Model Choice
Many (~10K features)   Small (~1K rows)      Simple: Logistic or LinearSVC
Few (<100 features)    Medium (~10K rows)    SVC with RBF
Few (<100 features)    Many (>100K points)   Add features; Logistic, LinearSVC, or kernel approximation