MCS 7103: Machine Learning
Simon Alex and Nambaale
Support Vector Machines
Overview
1 Support Vector Machines
Applications of SVM
What is Machine Learning?
What Is SVM?
Features of SVM
How Does SVM Work?
Non-Linear SVM
SVM Use Case
Applications of Support Vector Machines
Text Categorization
Bioinformatics
Face Recognition
Image Classification
What is Machine Learning?
Supervised Learning
Unsupervised Learning
Type of Problems in Machine Learning
Where are Support Vector Machines?
What is a Support Vector Machine?
Support Vector Machine (SVM) is a supervised learning method used for classification and regression.
SVM separates data using a hyperplane that acts as a decision boundary between the classes.
SVM works well for classifying high-dimensional data (data with many features).
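As a minimal sketch of this idea (assuming scikit-learn; the tiny two-class dataset below is made up purely for illustration), a linear SVM can be fitted and then used to classify new points:

```python
# Minimal linear SVM classification sketch (assumes scikit-learn is installed).
import numpy as np
from sklearn.svm import SVC

# Made-up 2-D dataset: two linearly separable classes.
X = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 3.0],   # class 0
              [6.0, 5.0], [7.0, 7.0], [8.0, 6.0]])  # class 1
y = np.array([0, 0, 0, 1, 1, 1])

clf = SVC(kernel="linear")  # learn a separating hyperplane
clf.fit(X, y)

print(clf.predict([[2.5, 2.5], [7.5, 6.5]]))  # e.g. [0 1]
```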
Features of Support Vector Machine
SVM is a supervised learning algorithm.
It can be used for both classification and regression problems.
SVM can classify data that is not linearly separable by using the kernel trick.
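A brief sketch of the kernel trick in practice (assuming scikit-learn; make_circles is just a convenient non-linearly separable toy dataset): an RBF kernel separates the two classes where a linear kernel cannot.

```python
# Kernel trick sketch: RBF vs. linear kernel on non-linearly separable data
# (assumes scikit-learn).
from sklearn.datasets import make_circles
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_circles(n_samples=400, factor=0.3, noise=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for kernel in ("linear", "rbf"):
    clf = SVC(kernel=kernel).fit(X_train, y_train)
    print(kernel, clf.score(X_test, y_test))
# The RBF kernel typically scores close to 1.0 here, while the linear
# kernel performs little better than chance.
```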
How Does SVM Work?
What is a Support Vector in SVM?
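The support vectors are the training points that lie closest to the decision boundary and define the margin. As an illustrative sketch (assuming scikit-learn; make_blobs is an arbitrary toy dataset), a fitted classifier exposes them directly:

```python
# Inspecting the support vectors of a fitted SVM (assumes scikit-learn).
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

X, y = make_blobs(n_samples=60, centers=2, random_state=0)
clf = SVC(kernel="linear", C=1.0).fit(X, y)

print(clf.support_vectors_)  # the training points that define the margin
print(clf.n_support_)        # number of support vectors per class
```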
Non-Linear Support Vector Machine
Defining the Separating Hyperplane
The decision surface separating the classes is a hyperplane of the form

w^T x + b = 0

where:
w is the weight vector
x is an input vector
b is the bias
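A hedged sketch (assuming scikit-learn's SVC with a linear kernel; the dataset is illustrative) of how w and b can be read off a fitted model and used to evaluate w^T x + b:

```python
# Recovering the hyperplane parameters w and b from a fitted linear SVM
# (assumes scikit-learn).
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

X, y = make_blobs(n_samples=60, centers=2, random_state=0)
clf = SVC(kernel="linear").fit(X, y)

w = clf.coef_[0]       # weight vector
b = clf.intercept_[0]  # bias
x_new = X[:5]

# w^T x + b: the sign of this score gives the predicted side of the hyperplane.
scores = x_new @ w + b
print(np.allclose(scores, clf.decision_function(x_new)))  # True
print(np.sign(scores))
```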
Support Vector Classification
In practice, SVM uses SVC to classify data.
Different kernels can be used with SVC.
These kernels include: linear, RBF and polynomial.
Some kernels work better than others for a given dataset.
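A sketch of trying several kernels on the same dataset to see which fits best (assuming scikit-learn; the breast cancer dataset and the train/test split are illustrative choices):

```python
# Comparing SVC kernels on one dataset (assumes scikit-learn).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for kernel in ("linear", "rbf", "poly"):
    model = make_pipeline(StandardScaler(), SVC(kernel=kernel))
    model.fit(X_train, y_train)
    print(kernel, round(model.score(X_test, y_test), 3))
```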
Gamma and C
Gamma controls the shape of the 'peaks' where the points are raised.
The gamma parameter defines the degree of non-linearity of the decision boundary: low values lead to nearly linear (smooth) boundaries, while high values lead to highly non-linear ones.
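A rough sketch of this effect (assuming scikit-learn's RBF-kernel SVC; the dataset and gamma values are illustrative): a very small gamma behaves almost linearly, while a very large gamma fits the training set almost perfectly but generalises worse.

```python
# Effect of gamma on an RBF-kernel SVM (assumes scikit-learn).
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=400, noise=0.3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for gamma in (0.01, 1.0, 100.0):
    clf = SVC(kernel="rbf", gamma=gamma).fit(X_train, y_train)
    print(f"gamma={gamma}: "
          f"train={clf.score(X_train, y_train):.2f}, "
          f"test={clf.score(X_test, y_test):.2f}")
# Very large gamma: near-perfect training score but a weaker test score,
# i.e. an overly wiggly decision boundary.
```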
Influence of Gamma on Training Data
Gamma and C
C controls the cost of misclassification on the training data.
A high C tries to minimize misclassification of the training data, which can lead to overfitting*; a low C tolerates some misclassification in order to keep the decision boundary smooth.
* Overfitting is where the model performs well on the training data but poorly on unseen data; underfitting, by contrast, is where the model performs well on neither the training nor the test data.
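A comparable sketch for C (assuming scikit-learn; dataset and C values are illustrative): a very large C penalises training errors heavily, while a small C accepts some errors in exchange for a wider margin.

```python
# Effect of the C parameter on an SVM (assumes scikit-learn).
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=400, noise=0.3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for C in (0.01, 1.0, 1000.0):
    clf = SVC(kernel="rbf", C=C).fit(X_train, y_train)
    print(f"C={C}: "
          f"train={clf.score(X_train, y_train):.2f}, "
          f"test={clf.score(X_test, y_test):.2f}, "
          f"support vectors={clf.support_vectors_.shape[0]}")
# Larger C: fewer support vectors and higher training accuracy,
# at the risk of overfitting; smaller C: a smoother, wider-margin boundary.
```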
Influence of C on Training Data
Cross-Validation
Cross-validation is a statistical method for evaluating and comparing learning algorithms by dividing the data into two segments: one used to train the model and the other used to validate it.
The basic form of cross-validation is k-fold cross-validation.
K-Fold Cross Validation
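A short sketch of k-fold cross-validation for an SVM, and of using it to choose C and gamma (assuming scikit-learn; the dataset and parameter grid are illustrative):

```python
# k-fold cross-validation for an SVM, plus a small grid search over C and gamma
# (assumes scikit-learn).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

# 5-fold cross-validation: each fold in turn is held out for validation
# while the other four folds train the model.
scores = cross_val_score(model, X, y, cv=5)
print(scores.mean())

# Using cross-validation to pick C and gamma (illustrative grid).
grid = GridSearchCV(model,
                    param_grid={"svc__C": [0.1, 1, 10],
                                "svc__gamma": ["scale", 0.01, 0.1]},
                    cv=5)
grid.fit(X, y)
print(grid.best_params_, grid.best_score_)
```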
Questions?
SVM Use Case
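One typical use case drawn from the application list above is image classification. A minimal end-to-end sketch (assuming scikit-learn; the handwritten digits dataset and the parameter values are illustrative assumptions):

```python
# Illustrative SVM use case: classifying handwritten digit images
# (assumes scikit-learn; the dataset choice is an example).
from sklearn.datasets import load_digits
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)  # 8x8 grayscale digit images, flattened
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
model.fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test)))
```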
