Support Vector Machine
Group Members:
Name             Roll   GR. No.
Satvik Bhor       39    11910532
Yash Balaskar     29    11910275
Somnath More      08    11910994
Mohammed Patel    04    11910190
Guide: Prof. S. T. Patil
Contents
❖ What is a Support Vector Machine?
❖ Types of SVM
❖ How does SVM work?
❖ Applications of SVM in the real world
❖ What is a Support Vector Machine?
Support Vector Machine (SVM) is used for classification as well as regression problems.
The goal of the SVM algorithm is to create the best line or decision boundary that can segregate n-dimensional space into classes.
SVM chooses the extreme points/vectors (the support vectors) that help in creating this hyperplane.
Example:
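As a minimal, hedged illustration of this idea (not taken from the original slides; the toy data points and settings below are made up), a linear SVM can be fit to a small 2-D dataset and the learned separating line inspected:

```python
import numpy as np
from sklearn.svm import SVC

# Two small, linearly separable clusters in 2-D (toy data for illustration).
X = np.array([[1, 2], [2, 3], [2, 1], [6, 5], [7, 7], [8, 6]], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1])

# A linear SVM finds the maximum-margin line w.x + b = 0 between the classes.
clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)

w = clf.coef_[0]          # normal vector of the separating line
b = clf.intercept_[0]     # offset of the line
print(f"Decision boundary: {w[0]:.2f}*x1 + {w[1]:.2f}*x2 + {b:.2f} = 0")
print("Support vectors:\n", clf.support_vectors_)
```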
Types of SVM
Linear
Linear SVM is used for linearly separable data: if a dataset can be classified into two classes by a single straight line, the data is termed linearly separable, and the classifier used is called a Linear SVM classifier.
Non-Linear
Non-Linear SVM is used for non-linearly separable data: if a dataset cannot be classified by a straight line, the data is termed non-linear, and the classifier used is called a Non-linear SVM classifier.
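A minimal sketch of the difference, assuming scikit-learn and the synthetic make_circles dataset (both are illustrative choices, not from the slides): a linear kernel fails on concentric circles, while a non-linear (RBF) kernel separates them.

```python
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Toy non-linearly separable data: one class inside, one class on an outer ring.
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

# A straight line cannot separate concentric circles...
linear_clf = SVC(kernel="linear").fit(X, y)
print("Linear kernel accuracy:", linear_clf.score(X, y))   # poor, near chance

# ...but a non-linear (RBF) kernel can.
rbf_clf = SVC(kernel="rbf", gamma=1.0).fit(X, y)
print("RBF kernel accuracy:", rbf_clf.score(X, y))          # near 1.0
```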
How Does SVM Work?
Important Concepts:
❖ Hyperplane: the decision boundary that separates the classes (a line in 2-D, a plane in 3-D, a hyperplane in higher dimensions).
❖ Margin: the distance between the hyperplane and the nearest data points of either class; SVM maximizes this margin.
❖ Support vectors: the data points closest to the hyperplane; they define the margin and hence the position of the hyperplane.
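To make these terms concrete, here is a hedged sketch (the toy data and all settings are assumptions) that fits a linear SVM and reports its support vectors and margin width 2/||w||:

```python
import numpy as np
from sklearn.svm import SVC

# Toy 2-D data (illustrative only).
X = np.array([[1, 1], [2, 2], [1, 3], [6, 5], [7, 8], [8, 6]], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1])

clf = SVC(kernel="linear", C=1e3).fit(X, y)   # large C approximates a hard margin

w = clf.coef_[0]
margin_width = 2.0 / np.linalg.norm(w)        # distance between the two margin lines
print("Support vectors:\n", clf.support_vectors_)
print("Margin width: %.3f" % margin_width)
```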
How Does SVM Work?: Linearly Separable Data
[Figures] Fig. 1: input data points; Fig. 2: generating candidate hyperplanes; Fig. 3: deciding the best (maximum-margin) hyperplane
How Does SVM Work?: Non-Linearly Separable Data
[Figure] 2-D data that cannot be separated by a straight line
❏ Adding a third (3-D) dimension: z = x² + y²
[Figures] Fig. 1: data lifted onto the 3-D plane; Fig. 2: deciding the hyperplane
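A hedged sketch of the idea on this slide, using synthetic data as a stand-in: lifting 2-D points with z = x² + y² makes circle-shaped classes separable by a flat plane, so a linear SVM succeeds in the lifted space.

```python
import numpy as np
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# 2-D data that no straight line can separate.
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

# Lift to 3-D with the feature map from the slide: z = x^2 + y^2.
z = (X[:, 0] ** 2 + X[:, 1] ** 2).reshape(-1, 1)
X3d = np.hstack([X, z])

# In the lifted space a *linear* SVM (a flat plane) separates the classes.
clf = SVC(kernel="linear").fit(X3d, y)
print("Accuracy with z = x^2 + y^2 feature:", clf.score(X3d, y))  # ~1.0
```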
Implementation of SVM
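The code from the original implementation slide is not preserved in this extract; the following is a minimal, illustrative scikit-learn workflow (the dataset, split ratio, and kernel settings are assumptions), not the authors' original implementation.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Stand-in dataset; the dataset used in the original slide is unknown.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y
)

# Feature scaling matters for SVMs because they are margin/distance based.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
model.fit(X_train, y_train)

y_pred = model.predict(X_test)
print("Test accuracy:", accuracy_score(y_test, y_pred))
```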
Support Vector Machines vs. Other Classification Methods
● Naive Bayes Classifier
  ○ SVM is generally more consistent than the Naive Bayes classifier (NBC).
  ○ NBC is more prone to failures when its feature-independence assumption is violated.
  ○ NBC is more effective when the dataset is small.
● Decision Tree
  ○ A decision tree gives a more interpretable model.
  ○ SVM usually gives better accuracy than a decision tree.
Support Vector Machines vs. Other Classification Methods
● Random Forest
  ○ Random forests are less prone to overfitting.
  ○ Random forests are easier to train (fewer parameters to tune).
● K Nearest Neighbour
  ○ SVM is easier to implement than KNN.
  ○ Interpretation of the data is easier with SVM.
  ○ More complex patterns can be found using KNN.
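These comparisons are qualitative; one illustrative (not authoritative) way to check them on a concrete dataset is to cross-validate the classifiers side by side, as in the hedged sketch below. The dataset and default settings are assumptions.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)  # stand-in dataset

models = {
    "SVM (RBF)":     make_pipeline(StandardScaler(), SVC()),
    "Naive Bayes":   GaussianNB(),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Random Forest": RandomForestClassifier(random_state=0),
    "KNN":           make_pipeline(StandardScaler(), KNeighborsClassifier()),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name:15s} mean accuracy = {scores.mean():.3f}")
```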
Tuning Parameters
❏ Gamma: controls how far the influence of a single training example reaches (used by RBF and polynomial kernels); a high gamma fits nearby points closely, a low gamma also considers far-away points.
❏ Regularization (C): trades off a wider margin against misclassifying training points; a small C gives a wider, softer margin, a large C a narrower, stricter one.
❏ Kernel: the function (linear, polynomial, RBF, sigmoid, ...) that implicitly maps the data into a higher-dimensional space.
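A hedged sketch of how these three parameters are commonly tuned together with a grid search in scikit-learn; the grid values and the dataset are illustrative assumptions, not recommendations from the slides.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)  # stand-in dataset

# Illustrative grid over the three tuning parameters named above.
param_grid = {
    "kernel": ["linear", "rbf"],
    "C": [0.1, 1, 10, 100],            # regularization strength
    "gamma": ["scale", 0.01, 0.1, 1],  # kernel coefficient (ignored for linear)
}

search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)
print("Best parameters:", search.best_params_)
print("Best CV accuracy: %.3f" % search.best_score_)
```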
Advantages
● Effective in high-dimensional spaces.
● Relatively memory efficient (only the support vectors are used in the decision function).
● Overfitting can be controlled through the regularization parameter.
Disadvantages
● doesn't perform well,when we have large data set because the required training
time is higher
● also doesn't perform very well, when the data set has more noise
● As the support vector classifier works by putting data points,above and below the
classifying the hyperplane
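Two of these limitations have common scikit-learn workarounds; the sketch below is an illustrative suggestion, not part of the original slides: LinearSVC scales better to large datasets, and SVC(probability=True) adds (slower, cross-validation based) probability estimates.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC, LinearSVC

X, y = load_breast_cancer(return_X_y=True)  # stand-in dataset
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# For large datasets, the linear-only LinearSVC trains much faster than SVC.
fast_linear = make_pipeline(StandardScaler(), LinearSVC())
fast_linear.fit(X_train, y_train)

# SVC has no native probabilities; probability=True adds them via internal
# cross-validation, at extra training cost.
prob_svm = make_pipeline(StandardScaler(), SVC(probability=True))
prob_svm.fit(X_train, y_train)
print("P(class) for first test sample:", prob_svm.predict_proba(X_test[:1]))
```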
Applications of SVM in the Real World
● Face detection
● Text and hypertext categorization
● Classification of images
● Bioinformatics
● Protein fold and remote homology detection
● Handwriting recognition
● Generalized predictive control (GPC)
Thanks!