Classification
Support Vector Machine (SVM)
Using SVM for Semi-Supervised Classification
Results
Conclusion
SVM-based Semi-Supervised Classification
Topics in Pattern Recognition
Bigyan Bhar
M.E. CSA, IISc
4710-410-091-07064
Oct 11th, 2010
Bigyan Bhar Seminar, Topics in PR
Outline
1 Classification
2 Support Vector Machine (SVM)
3 Using SVM for Semi-Supervised Classification
Transductive SVM: Modifications
Augmented Lagrangian
Other Methods
All Methods
4 Results
5 Conclusion
New Facts
Further Directions
Acknowledgments
References
What is Classification?
Classification refers to an algorithmic procedure for assigning a given piece of input data to one of a given number of categories, e.g. mapping students' marks to grades:

Class test | Final Exam | Project | Seminar | Grade
13         | 35         | 16      | 18      | A
10         | 31         | 5       | 19      | B
11         | 21         | 9       | 11      | C
12         | 29         | 10      | 15      | B
Traditional Classifier
[Diagram: a Classifier Builder takes Labelled Data and produces a Classifier; the Classifier then takes Unlabelled Data and produces a Label for the Data]
Classifier
A classifier is supposed to classify unlabeled data
We have a lot of unlabeled data, typically much more than the amount of labeled data
So far we have seen classifiers built using only labeled data
What if we could also use the large set of unlabeled data to build a better classifier?
Semi-supervised Classifier
[Diagram: a traditional Classifier Builder uses only Labelled Data to produce a Classifier, while a Semi-supervised Classifier Builder uses both Labelled and Unlabelled Data]
How to use the unlabeled data?
The separating plane should pass through a low-density region of the data
Labeling Constraint
The low-density-region principle observed above can be realized using a fractional constraint:
(# of positive class examples) / (total # of examples) = r
r is a user-supplied input
We enforce this constraint on the unlabeled examples, since they are large in number
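As a concrete illustration (my own sketch, not code from this work), the constraint value is simply the fraction of examples the current classifier labels positive; the function name and the sample scores below are assumptions:

```python
# Illustrative sketch of the fractional labeling constraint; the function
# name and the sample scores are assumptions, not from this work.
def positive_fraction(scores):
    """Fraction of examples with a positive classifier score w'x."""
    return sum(1 for s in scores if s > 0) / len(scores)

# Eight hypothetical classifier scores on unlabeled examples.
scores = [1.2, -0.5, 0.3, -2.0, 0.8, -0.1, 0.4, -0.9]
r_hat = positive_fraction(scores)
```

The constraint requires r_hat to equal the user-supplied r; here 4 of the 8 scores are positive, so r_hat is 0.5.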
What is SVM?
SVM = Support Vector Machine
Maximal Margin Classifier
SVM Continued
[Figure: separating hyperplane w^T x + b = 0 with the margin planes w^T x + b = 1 and w^T x + b = -1; the margin lies between them]
Total margin = \frac{1}{\|w\|} + \frac{1}{\|w\|} = \frac{2}{\|w\|}
Optimization problem:
\min_w \; \frac{1}{2} w^T w
subject to:
y_i (w^T x_i + b) \ge 1, \quad \forall\, 1 \le i \le l
SVM Formulation
Using the KKT conditions, we get the final SVM problem as:
w^* = \arg\min_w \; \frac{1}{2} \sum_{i=1}^{l} \mathrm{loss}(y_i w^T x_i) + \frac{\lambda}{2} w^T w
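The regularized hinge-loss problem above can be minimized by plain subgradient descent. The sketch below is my own minimal illustration, not the solver used in this work; `train_svm`, the step size, the epoch count, and the toy data are all assumptions:

```python
import numpy as np

# Minimal sketch (assumption, not the authors' solver) of the regularized
# hinge-loss SVM above, trained by subgradient descent on
#   (1/2) * sum_i max(0, 1 - y_i w'x_i) + (lam/2) * w'w.
def train_svm(X, y, lam=0.1, lr=0.01, epochs=500):
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        margins = y * (X @ w)
        active = margins < 1                   # points with nonzero hinge loss
        grad = -0.5 * (y[active][:, None] * X[active]).sum(axis=0) + lam * w
        w -= lr * grad
    return w

# Toy linearly separable data; a constant column plays the role of the bias b.
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
X = np.hstack([X, np.ones((4, 1))])
y = np.array([1.0, 1.0, -1.0, -1.0])
w = train_svm(X, y)
preds = np.sign(X @ w)
```

On this separable toy set the learned w classifies all four training points correctly.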
Transductive SVM (TSVM)
\min_{w,\, \{y_j\}_{j=1}^{u}} \; \frac{\lambda}{2}\|w\|^2 + \frac{1}{2l} \sum_{i=1}^{l} \mathrm{loss}(y_i w^T x_i) + \frac{\lambda}{2u} \sum_{j=1}^{u} \mathrm{loss}(y_j w^T x_j)
subject to:
\frac{1}{u} \sum_{j=1}^{u} \max\bigl(0, \mathrm{sign}(w^T x_j)\bigr) = r
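To make the pieces concrete, here is a small sketch (my own illustration; the names `tsvm_objective` and `constraint_value`, the use of hinge loss, and the toy data are all assumptions) that evaluates the three-term TSVM objective and the fractional constraint for a fixed w:

```python
import numpy as np

# Illustrative evaluation of the TSVM objective for a fixed w, taking
# loss(t) = max(0, 1 - t) (hinge). All names and data here are assumptions.
def hinge(t):
    return np.maximum(0.0, 1.0 - t)

def tsvm_objective(w, X_l, y_l, X_u, y_u, lam=0.1):
    l, u = len(y_l), len(y_u)
    reg = 0.5 * lam * (w @ w)                                  # (lambda/2)||w||^2
    labeled = hinge(y_l * (X_l @ w)).sum() / (2 * l)           # labeled loss
    unlabeled = lam * hinge(y_u * (X_u @ w)).sum() / (2 * u)   # unlabeled loss
    return reg + labeled + unlabeled

def constraint_value(w, X_u):
    """Fraction of unlabeled points the classifier labels positive."""
    return float((X_u @ w > 0).mean())

X_l = np.array([[0.5, 0.0]]); y_l = np.array([1.0])
X_u = np.array([[1.0, 0.0], [2.0, 1.0], [-1.0, 0.0], [-2.0, -1.0]])
y_u = np.array([1.0, 1.0, -1.0, -1.0])   # tentative labels for unlabeled points
w = np.array([1.0, 0.0])
obj = tsvm_objective(w, X_l, y_l, X_u, y_u)
frac = constraint_value(w, X_u)          # should match the supplied r
```

Here half of the unlabeled points fall on the positive side of the plane, so the constraint is satisfied for r = 0.5.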
Modifying TSVM
What is the cost-to-importance ratio of the terms in the TSVM formulation?
\min_{w,\, \{y_j\}_{j=1}^{u}} \; \frac{\lambda}{2}\|w\|^2 + \frac{1}{2l} \sum_{i=1}^{l} \mathrm{loss}(y_i w^T x_i) + \frac{\lambda}{2u} \sum_{j=1}^{u} \mathrm{loss}(y_j w^T x_j)
Clearly the third term, the unlabeled loss, is the costliest: it requires computing y_j for the large set of unlabeled examples
What if we could avoid it altogether?
Modified TSVM
TSVM formulation:
\min_{w,\, \{y_j\}_{j=1}^{u}} \; \frac{\lambda}{2}\|w\|^2 + \frac{1}{2l} \sum_{i=1}^{l} \mathrm{loss}(y_i w^T x_i) + \frac{\lambda}{2u} \sum_{j=1}^{u} \mathrm{loss}(y_j w^T x_j)
Our formulation:
\min_{w} \; \frac{\lambda}{2}\|w\|^2 + \frac{1}{2l} \sum_{i=1}^{l} \mathrm{loss}(y_i w^T x_i)
subject to:
\frac{1}{u} \sum_{j=1}^{u} \max\bigl(0, \mathrm{sign}(w^T x_j)\bigr) = r
Augmented Lagrangian Technique
Augmented Lagrangian is a technique for solving minimization problems with equality constraints
It typically converges faster than plain penalty methods
Original problem: min f(x), subject to g(x) = 0
It can be rewritten as an unconstrained minimization of:
L(x, \lambda, \mu) = f(x) - \lambda g(x) + \frac{1}{2\mu} \|g(x)\|^2
Since f and the Lagrangian (for any λ) agree on the feasible set g(x) = 0, the basic idea remains the same as that of the Lagrangian:
a small value of µ forces the minimizer(s) of L to lie close to the feasible set
values of x that reduce f are preferred
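A minimal numerical sketch of the technique on a toy problem of my own (not from this work): minimize f(x) = x1² + x2² subject to g(x) = x1 + x2 − 1 = 0, whose analytic solution is (0.5, 0.5). The step sizes and iteration counts are illustrative assumptions:

```python
import numpy as np

# Toy augmented Lagrangian sketch (my own example):
#   min f(x) = x1^2 + x2^2   subject to   g(x) = x1 + x2 - 1 = 0.
def aug_lagrangian(mu=0.1, outer=20, inner=200, lr=0.05):
    x = np.zeros(2)
    lam = 0.0
    for _ in range(outer):
        for _ in range(inner):             # inner unconstrained minimization of L
            g = x.sum() - 1.0
            # gradient of f(x) - lam*g(x) + (1/(2*mu)) * g(x)^2
            grad = 2 * x - lam + (g / mu)
            x -= lr * grad
        lam -= (x.sum() - 1.0) / mu        # multiplier update from the violation
    return x

x = aug_lagrangian()
```

Each outer iteration re-estimates the multiplier from the remaining constraint violation, so the iterates approach the feasible set even though µ stays fixed.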
Modified TSVM using Augmented Lagrangian
Our formulation:
\min_w f(w) \;\Longrightarrow\; \min_w \; \frac{\lambda}{2}\|w\|^2 + \frac{1}{2l} \sum_{i=1}^{l} \mathrm{loss}(y_i w^T x_i)
subject to:
g(w) = 0 \;\Longrightarrow\; \frac{1}{u} \sum_{j=1}^{u} \max\bigl(0, \mathrm{sign}(w^T x_j)\bigr) - r = 0
Augmented Lagrangian:
\min_x L(x,\lambda,\mu) = \min_x \left[ f(x) - \lambda g(x) + \frac{1}{2\mu}\|g(x)\|^2 \right]
Penalty Method
Augmented Lagrangian:
\min_x L(x,\lambda,\mu) = \min_x \left[ f(x) - \lambda g(x) + \frac{1}{2\mu}\|g(x)\|^2 \right]
Penalty Method:
\min_x \left[ f(x) + \frac{1}{2\mu}\|g(x)\|^2 \right]
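For contrast, the same toy problem as before (min x1² + x2² s.t. x1 + x2 = 1, my own example, not from this work) solved with the plain penalty method: the multiplier term is dropped and µ must be driven toward zero for the iterate to approach the feasible set. The µ schedule and step sizes are illustrative assumptions:

```python
import numpy as np

# Toy penalty-method sketch (my own example): minimize
#   f(x) + (1/(2*mu)) * g(x)^2  with f(x) = x1^2 + x2^2, g(x) = x1 + x2 - 1,
# for a decreasing schedule of mu values.
def penalty_method(mus=(1.0, 0.1, 0.01, 0.001), inner=500):
    x = np.zeros(2)
    for mu in mus:                       # shrink mu; each stage warm-starts the next
        lr = 1.0 / (2.0 + 2.0 / mu)     # stable step size for this subproblem
        for _ in range(inner):
            g = x.sum() - 1.0
            grad = 2 * x + (g / mu)     # gradient of f(x) + (1/(2*mu)) * g(x)^2
            x -= lr * grad
    return x

x = penalty_method()
```

For each µ the subproblem minimizer sits at 1/(2µ + 2) per coordinate, so the iterate only approaches the feasible point (0.5, 0.5) as µ → 0; the augmented Lagrangian avoids this residual bias.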
SVM based
Supervised SVM (SSVM):
w^* = \arg\min_{w \in \mathbb{R}^d} \; \frac{\lambda}{2}\|w\|^2 + \frac{1}{2} \sum_{i=1}^{l} \mathrm{loss}(y_i w^T x_i)
SSVM with Threshold Adjustment:
Obtain w* from SSVM
Adjust the threshold to satisfy the labelling constraint:
\frac{1}{u} \sum_{j=1}^{u} \max\bigl(0, \mathrm{sign}(w^T x_j)\bigr) = r
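The threshold-adjustment step can be sketched as picking the bias that makes exactly a fraction r of the unlabeled scores positive (my own implementation sketch; the function name and data are assumptions):

```python
import numpy as np

# Sketch (assumed implementation, not the authors') of threshold adjustment:
# shift the bias b so that a fraction r of the scores w'x_j come out positive.
def adjust_threshold(scores, r):
    """Return b such that mean(scores + b > 0) == r (for distinct scores)."""
    scores = np.sort(scores)[::-1]            # descending
    k = int(round(r * len(scores)))           # number of points labeled positive
    if k == 0:
        return -scores[0] - 1.0
    if k == len(scores):
        return -scores[-1] + 1.0
    # place the threshold midway between the k-th and (k+1)-th largest score
    return -(scores[k - 1] + scores[k]) / 2.0

scores = np.array([2.0, 1.0, 0.5, -0.2, -1.0, -3.0])
b = adjust_threshold(scores, r=0.5)           # want 3 of 6 scores positive
```

After shifting by b, exactly half of the six toy scores are positive, matching r = 0.5.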
All Methods at a Glance
SVM based:
- SSVM on labeled data
- SSVM on labeled data with threshold adjustment
Methods proposed in this work:
- Augmented Lagrangian
- Penalty Method
TSVM based:
- TSVM
- Deterministic Annealing
- Switching
[Figure: Accuracy vs. # of Labeled Examples (gcat)]
[Figure: Accuracy vs. # of Labeled Examples (aut-avn)]
[Figure: Accuracy vs. Noise in r (gcat)]
[Figure: Accuracy vs. Noise in r (aut-avn)]
Some Results
The simple penalty method is the most robust with respect to the estimation of r
TSVM still leads in terms of accuracy
Augmented Lagrangian is a direction worth investigating due to its faster computation time
Beating the SSVM is possible only with a reasonably accurate estimate of r
If the labeled dataset does not follow r, then the alternate methods perform better
Future Directions
Establish theoretical bounds on the accuracy of our methods relative to that of TSVM
Look at non-SVM semi-supervised classifiers (e.g. decision trees) and find a way to express the fractional constraint for them
Can we use something other than the fractional constraint to enforce the low-density criterion?
Acknowledgments
I thank the following persons for their able guidance and help in
this work:
S S Keerthi (Yahoo! Labs)
M N Murthy (IISc)
S Sundararajan (Yahoo! Labs)
S Shevade (IISc)
References
M. S. Gockenbach. The Augmented Lagrangian Method for Equality-Constrained Optimization
V. Sindhwani, S. S. Keerthi. Newton Methods for Fast Solution of Semi-supervised Linear SVMs
S. S. Keerthi, D. DeCoste. A Modified Finite Newton Method for Fast Solution of Large Scale Linear SVMs
Bigyan Bhar Seminar, Topics in PR

More Related Content

What's hot

딥러닝 개요 (2015-05-09 KISTEP)
딥러닝 개요 (2015-05-09 KISTEP)딥러닝 개요 (2015-05-09 KISTEP)
딥러닝 개요 (2015-05-09 KISTEP)
Keunwoo Choi
 
2020 11 4_bag_of_tricks
2020 11 4_bag_of_tricks2020 11 4_bag_of_tricks
2020 11 4_bag_of_tricks
JAEMINJEONG5
 
Support Vector Machine and Implementation using Weka
Support Vector Machine and Implementation using WekaSupport Vector Machine and Implementation using Weka
Support Vector Machine and Implementation using Weka
Macha Pujitha
 
GBM theory code and parameters
GBM theory code and parametersGBM theory code and parameters
GBM theory code and parameters
Venkata Reddy Konasani
 
Neural Network Part-2
Neural Network Part-2Neural Network Part-2
Neural Network Part-2
Venkata Reddy Konasani
 
Automatic Tagging using Deep Convolutional Neural Networks - ISMIR 2016
Automatic Tagging using Deep Convolutional Neural Networks - ISMIR 2016Automatic Tagging using Deep Convolutional Neural Networks - ISMIR 2016
Automatic Tagging using Deep Convolutional Neural Networks - ISMIR 2016
Keunwoo Choi
 
ECCV2010: distance function and metric learning part 2
ECCV2010: distance function and metric learning part 2ECCV2010: distance function and metric learning part 2
ECCV2010: distance function and metric learning part 2
zukun
 
Chap 8. Optimization for training deep models
Chap 8. Optimization for training deep modelsChap 8. Optimization for training deep models
Chap 8. Optimization for training deep models
Young-Geun Choi
 
SVM Tutorial
SVM TutorialSVM Tutorial
SVM Tutorial
butest
 
Data mining
Data miningData mining
Data mining
Behnaz Motavali
 
Deep learning-practical
Deep learning-practicalDeep learning-practical
Deep learning-practical
Hitesh Mohapatra
 
Hands-on Tutorial of Machine Learning in Python
Hands-on Tutorial of Machine Learning in PythonHands-on Tutorial of Machine Learning in Python
Hands-on Tutorial of Machine Learning in Python
Chun-Ming Chang
 
2020 12-1-adam w
2020 12-1-adam w2020 12-1-adam w
2020 12-1-adam w
JAEMINJEONG5
 
Vc dimension in Machine Learning
Vc dimension in Machine LearningVc dimension in Machine Learning
Vc dimension in Machine Learning
VARUN KUMAR
 
Toward wave net speech synthesis
Toward wave net speech synthesisToward wave net speech synthesis
Toward wave net speech synthesis
NAVER Engineering
 
K Means Clustering Algorithm | K Means Clustering Example | Machine Learning ...
K Means Clustering Algorithm | K Means Clustering Example | Machine Learning ...K Means Clustering Algorithm | K Means Clustering Example | Machine Learning ...
K Means Clustering Algorithm | K Means Clustering Example | Machine Learning ...
Simplilearn
 
Time Series Forecasting Using Recurrent Neural Network and Vector Autoregress...
Time Series Forecasting Using Recurrent Neural Network and Vector Autoregress...Time Series Forecasting Using Recurrent Neural Network and Vector Autoregress...
Time Series Forecasting Using Recurrent Neural Network and Vector Autoregress...
Databricks
 
Unit 7 dynamic programming
Unit 7   dynamic programmingUnit 7   dynamic programming
Unit 7 dynamic programming
Nageswara Rao Thots
 
Two strategies for large-scale multi-label classification on the YouTube-8M d...
Two strategies for large-scale multi-label classification on the YouTube-8M d...Two strategies for large-scale multi-label classification on the YouTube-8M d...
Two strategies for large-scale multi-label classification on the YouTube-8M d...
Dalei Li
 
EE660_Report_YaxinLiu_8448347171
EE660_Report_YaxinLiu_8448347171EE660_Report_YaxinLiu_8448347171
EE660_Report_YaxinLiu_8448347171
Yaxin Liu
 

What's hot (20)

딥러닝 개요 (2015-05-09 KISTEP)
딥러닝 개요 (2015-05-09 KISTEP)딥러닝 개요 (2015-05-09 KISTEP)
딥러닝 개요 (2015-05-09 KISTEP)
 
2020 11 4_bag_of_tricks
2020 11 4_bag_of_tricks2020 11 4_bag_of_tricks
2020 11 4_bag_of_tricks
 
Support Vector Machine and Implementation using Weka
Support Vector Machine and Implementation using WekaSupport Vector Machine and Implementation using Weka
Support Vector Machine and Implementation using Weka
 
GBM theory code and parameters
GBM theory code and parametersGBM theory code and parameters
GBM theory code and parameters
 
Neural Network Part-2
Neural Network Part-2Neural Network Part-2
Neural Network Part-2
 
Automatic Tagging using Deep Convolutional Neural Networks - ISMIR 2016
Automatic Tagging using Deep Convolutional Neural Networks - ISMIR 2016Automatic Tagging using Deep Convolutional Neural Networks - ISMIR 2016
Automatic Tagging using Deep Convolutional Neural Networks - ISMIR 2016
 
ECCV2010: distance function and metric learning part 2
ECCV2010: distance function and metric learning part 2ECCV2010: distance function and metric learning part 2
ECCV2010: distance function and metric learning part 2
 
Chap 8. Optimization for training deep models
Chap 8. Optimization for training deep modelsChap 8. Optimization for training deep models
Chap 8. Optimization for training deep models
 
SVM Tutorial
SVM TutorialSVM Tutorial
SVM Tutorial
 
Data mining
Data miningData mining
Data mining
 
Deep learning-practical
Deep learning-practicalDeep learning-practical
Deep learning-practical
 
Hands-on Tutorial of Machine Learning in Python
Hands-on Tutorial of Machine Learning in PythonHands-on Tutorial of Machine Learning in Python
Hands-on Tutorial of Machine Learning in Python
 
2020 12-1-adam w
2020 12-1-adam w2020 12-1-adam w
2020 12-1-adam w
 
Vc dimension in Machine Learning
Vc dimension in Machine LearningVc dimension in Machine Learning
Vc dimension in Machine Learning
 
Toward wave net speech synthesis
Toward wave net speech synthesisToward wave net speech synthesis
Toward wave net speech synthesis
 
K Means Clustering Algorithm | K Means Clustering Example | Machine Learning ...
K Means Clustering Algorithm | K Means Clustering Example | Machine Learning ...K Means Clustering Algorithm | K Means Clustering Example | Machine Learning ...
K Means Clustering Algorithm | K Means Clustering Example | Machine Learning ...
 
Time Series Forecasting Using Recurrent Neural Network and Vector Autoregress...
Time Series Forecasting Using Recurrent Neural Network and Vector Autoregress...Time Series Forecasting Using Recurrent Neural Network and Vector Autoregress...
Time Series Forecasting Using Recurrent Neural Network and Vector Autoregress...
 
Unit 7 dynamic programming
Unit 7   dynamic programmingUnit 7   dynamic programming
Unit 7 dynamic programming
 
Two strategies for large-scale multi-label classification on the YouTube-8M d...
Two strategies for large-scale multi-label classification on the YouTube-8M d...Two strategies for large-scale multi-label classification on the YouTube-8M d...
Two strategies for large-scale multi-label classification on the YouTube-8M d...
 
EE660_Report_YaxinLiu_8448347171
EE660_Report_YaxinLiu_8448347171EE660_Report_YaxinLiu_8448347171
EE660_Report_YaxinLiu_8448347171
 

Similar to Presentation 01

Dp
DpDp
Support Vector Machines USING MACHINE LEARNING HOW IT WORKS
Support Vector Machines USING MACHINE LEARNING HOW IT WORKSSupport Vector Machines USING MACHINE LEARNING HOW IT WORKS
Support Vector Machines USING MACHINE LEARNING HOW IT WORKS
rajalakshmi5921
 
Paper review: Learned Optimizers that Scale and Generalize.
Paper review: Learned Optimizers that Scale and Generalize.Paper review: Learned Optimizers that Scale and Generalize.
Paper review: Learned Optimizers that Scale and Generalize.
Wuhyun Rico Shin
 
A Multi-Objective Genetic Algorithm for Pruning Support Vector Machines
A Multi-Objective Genetic Algorithm for Pruning Support Vector MachinesA Multi-Objective Genetic Algorithm for Pruning Support Vector Machines
A Multi-Objective Genetic Algorithm for Pruning Support Vector Machines
Mohamed Farouk
 
The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)
theijes
 
OM-DS-Fall2022-Session10-Support vector machine.pdf
OM-DS-Fall2022-Session10-Support vector machine.pdfOM-DS-Fall2022-Session10-Support vector machine.pdf
OM-DS-Fall2022-Session10-Support vector machine.pdf
ssuserb016ab
 
A BA-based algorithm for parameter optimization of support vector machine
A BA-based algorithm for parameter optimization of support vector machineA BA-based algorithm for parameter optimization of support vector machine
A BA-based algorithm for parameter optimization of support vector machine
Aboul Ella Hassanien
 
Guide
GuideGuide
RMA-LSCh-CMA, presentation for WCCI'2014 (IEEE CEC'2014)
RMA-LSCh-CMA, presentation for WCCI'2014 (IEEE CEC'2014)RMA-LSCh-CMA, presentation for WCCI'2014 (IEEE CEC'2014)
RMA-LSCh-CMA, presentation for WCCI'2014 (IEEE CEC'2014)
Daniel Molina Cabrera
 
New Surrogate-Assisted Search Control and Restart Strategies for CMA-ES
New Surrogate-Assisted Search Control and Restart Strategies for CMA-ESNew Surrogate-Assisted Search Control and Restart Strategies for CMA-ES
New Surrogate-Assisted Search Control and Restart Strategies for CMA-ES
Ilya Loshchilov
 
FIDUCIAL POINTS DETECTION USING SVM LINEAR CLASSIFIERS
FIDUCIAL POINTS DETECTION USING SVM LINEAR CLASSIFIERSFIDUCIAL POINTS DETECTION USING SVM LINEAR CLASSIFIERS
FIDUCIAL POINTS DETECTION USING SVM LINEAR CLASSIFIERS
csandit
 
Multilayer Perceptron (DLAI D1L2 2017 UPC Deep Learning for Artificial Intell...
Multilayer Perceptron (DLAI D1L2 2017 UPC Deep Learning for Artificial Intell...Multilayer Perceptron (DLAI D1L2 2017 UPC Deep Learning for Artificial Intell...
Multilayer Perceptron (DLAI D1L2 2017 UPC Deep Learning for Artificial Intell...
Universitat Politècnica de Catalunya
 
Optimization_methods.pdf
Optimization_methods.pdfOptimization_methods.pdf
Optimization_methods.pdf
VaibhavSharma563532
 
Support Vector Machines
Support Vector MachinesSupport Vector Machines
Support Vector Machines
CloudxLab
 
MLHEP Lectures - day 2, basic track
MLHEP Lectures - day 2, basic trackMLHEP Lectures - day 2, basic track
MLHEP Lectures - day 2, basic track
arogozhnikov
 
4.Support Vector Machines.ppt machine learning and development
4.Support Vector Machines.ppt machine learning and development4.Support Vector Machines.ppt machine learning and development
4.Support Vector Machines.ppt machine learning and development
PriyankaRamavath3
 
COCOA: Communication-Efficient Coordinate Ascent
COCOA: Communication-Efficient Coordinate AscentCOCOA: Communication-Efficient Coordinate Ascent
COCOA: Communication-Efficient Coordinate Ascent
jeykottalam
 
GPU Parallel Computing of Support Vector Machines as applied to Intrusion Det...
GPU Parallel Computing of Support Vector Machines as applied to Intrusion Det...GPU Parallel Computing of Support Vector Machines as applied to Intrusion Det...
GPU Parallel Computing of Support Vector Machines as applied to Intrusion Det...
IJCSIS Research Publications
 
support vector machine 1.pptx
support vector machine 1.pptxsupport vector machine 1.pptx
support vector machine 1.pptx
surbhidutta4
 
Event classification & prediction using support vector machine
Event classification & prediction using support vector machineEvent classification & prediction using support vector machine
Event classification & prediction using support vector machine
Ruta Kambli
 

Similar to Presentation 01 (20)

Dp
DpDp
Dp
 
Support Vector Machines USING MACHINE LEARNING HOW IT WORKS
Support Vector Machines USING MACHINE LEARNING HOW IT WORKSSupport Vector Machines USING MACHINE LEARNING HOW IT WORKS
Support Vector Machines USING MACHINE LEARNING HOW IT WORKS
 
Paper review: Learned Optimizers that Scale and Generalize.
Paper review: Learned Optimizers that Scale and Generalize.Paper review: Learned Optimizers that Scale and Generalize.
Paper review: Learned Optimizers that Scale and Generalize.
 
A Multi-Objective Genetic Algorithm for Pruning Support Vector Machines
A Multi-Objective Genetic Algorithm for Pruning Support Vector MachinesA Multi-Objective Genetic Algorithm for Pruning Support Vector Machines
A Multi-Objective Genetic Algorithm for Pruning Support Vector Machines
 
The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)
 
OM-DS-Fall2022-Session10-Support vector machine.pdf
OM-DS-Fall2022-Session10-Support vector machine.pdfOM-DS-Fall2022-Session10-Support vector machine.pdf
OM-DS-Fall2022-Session10-Support vector machine.pdf
 
A BA-based algorithm for parameter optimization of support vector machine
A BA-based algorithm for parameter optimization of support vector machineA BA-based algorithm for parameter optimization of support vector machine
A BA-based algorithm for parameter optimization of support vector machine
 
Guide
GuideGuide
Guide
 
RMA-LSCh-CMA, presentation for WCCI'2014 (IEEE CEC'2014)
RMA-LSCh-CMA, presentation for WCCI'2014 (IEEE CEC'2014)RMA-LSCh-CMA, presentation for WCCI'2014 (IEEE CEC'2014)
RMA-LSCh-CMA, presentation for WCCI'2014 (IEEE CEC'2014)
 
New Surrogate-Assisted Search Control and Restart Strategies for CMA-ES
New Surrogate-Assisted Search Control and Restart Strategies for CMA-ESNew Surrogate-Assisted Search Control and Restart Strategies for CMA-ES
New Surrogate-Assisted Search Control and Restart Strategies for CMA-ES
 
FIDUCIAL POINTS DETECTION USING SVM LINEAR CLASSIFIERS
FIDUCIAL POINTS DETECTION USING SVM LINEAR CLASSIFIERSFIDUCIAL POINTS DETECTION USING SVM LINEAR CLASSIFIERS
FIDUCIAL POINTS DETECTION USING SVM LINEAR CLASSIFIERS
 
Multilayer Perceptron (DLAI D1L2 2017 UPC Deep Learning for Artificial Intell...
Multilayer Perceptron (DLAI D1L2 2017 UPC Deep Learning for Artificial Intell...Multilayer Perceptron (DLAI D1L2 2017 UPC Deep Learning for Artificial Intell...
Multilayer Perceptron (DLAI D1L2 2017 UPC Deep Learning for Artificial Intell...
 
Optimization_methods.pdf
Optimization_methods.pdfOptimization_methods.pdf
Optimization_methods.pdf
 
Support Vector Machines
Support Vector MachinesSupport Vector Machines
Support Vector Machines
 
MLHEP Lectures - day 2, basic track
MLHEP Lectures - day 2, basic trackMLHEP Lectures - day 2, basic track
MLHEP Lectures - day 2, basic track
 
4.Support Vector Machines.ppt machine learning and development
4.Support Vector Machines.ppt machine learning and development4.Support Vector Machines.ppt machine learning and development
4.Support Vector Machines.ppt machine learning and development
 
COCOA: Communication-Efficient Coordinate Ascent
COCOA: Communication-Efficient Coordinate AscentCOCOA: Communication-Efficient Coordinate Ascent
COCOA: Communication-Efficient Coordinate Ascent
 
GPU Parallel Computing of Support Vector Machines as applied to Intrusion Det...
GPU Parallel Computing of Support Vector Machines as applied to Intrusion Det...GPU Parallel Computing of Support Vector Machines as applied to Intrusion Det...
GPU Parallel Computing of Support Vector Machines as applied to Intrusion Det...
 
support vector machine 1.pptx
support vector machine 1.pptxsupport vector machine 1.pptx
support vector machine 1.pptx
 
Event classification & prediction using support vector machine
Event classification & prediction using support vector machineEvent classification & prediction using support vector machine
Event classification & prediction using support vector machine
 

Presentation 01

  • 1. Classication Support Vector Machine (SVM) Using SVM for Semi-Supervised Classication Results Conclusion SVM based Semi-Supervised Classication Topics in Pattern Recognition Bigyan Bhar M.E. CSA, IISc 4710-410-091-07064 Oct 11th, 2010 Bigyan Bhar Seminar, Topics in PR
  • 2. Classication Support Vector Machine (SVM) Using SVM for Semi-Supervised Classication Results Conclusion Outline 1 Classication 2 Support Vector Machine (SVM) 3 Using SVM for Semi-Supervised Classication Transductive SVM Modications Augmented Lagrangian Other Methods All Methods 4 Results 5 Conclusion New Facts Further Directions Acknowledgments References Bigyan Bhar Seminar, Topics in PR
  • 3. Classication Support Vector Machine (SVM) Using SVM for Semi-Supervised Classication Results Conclusion Outline 1 Classication 2 Support Vector Machine (SVM) 3 Using SVM for Semi-Supervised Classication Transductive SVM Modications Augmented Lagrangian Other Methods All Methods 4 Results 5 Conclusion New Facts Further Directions Acknowledgments References Bigyan Bhar Seminar, Topics in PR
  • 4. Classication Support Vector Machine (SVM) Using SVM for Semi-Supervised Classication Results Conclusion What is a Classication? Classication refers to an algorithmic procedure for assigning a given piece of input data into one of a given number of categories Class test Final Exam Project Seminar 13 35 16 18 10 31 5 19 11 21 9 11 12 29 10 15 Grade A B C B Bigyan Bhar Seminar, Topics in PR
  • 5. Classication Support Vector Machine (SVM) Using SVM for Semi-Supervised Classication Results Conclusion Traditional Classier Classifier Builder Labelled Data Classifier Unlabelled Data Classifier Label for Data Bigyan Bhar Seminar, Topics in PR
  • 6. Classication Support Vector Machine (SVM) Using SVM for Semi-Supervised Classication Results Conclusion Classier Classier is supposed to classify unlabeled data We have a lot of unlabeled data; typically much more than number of labeled data So far we have seen classiers being built using only labeled data What if we could also use the large set of unclassied data to build a better classier? Bigyan Bhar Seminar, Topics in PR
  • 7. Classication Support Vector Machine (SVM) Using SVM for Semi-Supervised Classication Results Conclusion Semi-supervised Classier Classifier Builder Unlabelled Classifier Builder Labelled Data Classifier Labelled Data Classifier Semi−supervised Bigyan Bhar Seminar, Topics in PR
  • 8. Classication Support Vector Machine (SVM) Using SVM for Semi-Supervised Classication Results Conclusion How to use the unlabeled data? The separating plane has to pass through a low density region Bigyan Bhar Seminar, Topics in PR
  • 9. Classication Support Vector Machine (SVM) Using SVM for Semi-Supervised Classication Results Conclusion How to use the unlabeled data? The separating plane has to pass through a low density region Bigyan Bhar Seminar, Topics in PR
  • 10. Classication Support Vector Machine (SVM) Using SVM for Semi-Supervised Classication Results Conclusion Labeling Constraint The low density region principle that we observed can be realized using a fractional constraint # of positive class examples total # of of examples = r r is an user supplied input We enforce the above constraint on unlabeled examples as they are large in number Bigyan Bhar Seminar, Topics in PR
  • 11. Classication Support Vector Machine (SVM) Using SVM for Semi-Supervised Classication Results Conclusion Outline 1 Classication 2 Support Vector Machine (SVM) 3 Using SVM for Semi-Supervised Classication Transductive SVM Modications Augmented Lagrangian Other Methods All Methods 4 Results 5 Conclusion New Facts Further Directions Acknowledgments References Bigyan Bhar Seminar, Topics in PR
  • 12. Classication Support Vector Machine (SVM) Using SVM for Semi-Supervised Classication Results Conclusion What is SVM? SVM = Support Vector Machine Maximal Margin Classier Bigyan Bhar Seminar, Topics in PR
  • 13. Classication Support Vector Machine (SVM) Using SVM for Semi-Supervised Classication Results Conclusion SVM Continued w w ’x+b=1 w ’x+b=0 w ’x+b=−1 m argin Total margin= 1 w + 1 w = 2 w Optimization problem min w 1 2 wTw subject to, yi wTxi +b ≥ 1 ∀1 ≤ i ≤ l Bigyan Bhar Seminar, Topics in PR
  • 14. Classication Support Vector Machine (SVM) Using SVM for Semi-Supervised Classication Results Conclusion SVM Formulation Using KKT conditions, we get the nal SVM problem as: w∗ = argmin w 1 2 l ∑ i=1 loss yiwTxi + λ 2 wTw Bigyan Bhar Seminar, Topics in PR
  • 16. Transductive SVM (TSVM). min over w, {y_j}_{j=1}^{u} of (λ/2) ||w||² + (1/2l) Σ_{i=1}^{l} loss(y_i w^T x_i) + (λ/2u) Σ_{j=1}^{u} loss(y_j w^T x_j), subject to: (1/u) Σ_{j=1}^{u} max(0, sign(w^T x_j)) = r.
  • 17. Modifying TSVM. What is the cost-to-importance ratio of the terms in the TSVM formulation? min over w, {y_j}_{j=1}^{u} of (λ/2) ||w||² + (1/2l) Σ_{i=1}^{l} loss(y_i w^T x_i) + (λ/2u) Σ_{j=1}^{u} loss(y_j w^T x_j). Clearly the third term, the unlabeled loss, is the costliest: it requires computing y_j for the large set of unlabeled examples. What if we can avoid it altogether?
  • 18. Modified TSVM. TSVM formulation: min over w, {y_j}_{j=1}^{u} of (λ/2) ||w||² + (1/2l) Σ_{i=1}^{l} loss(y_i w^T x_i) + (λ/2u) Σ_{j=1}^{u} loss(y_j w^T x_j). Our formulation: min_w (λ/2) ||w||² + (1/2l) Σ_{i=1}^{l} loss(y_i w^T x_i), subject to: (1/u) Σ_{j=1}^{u} max(0, sign(w^T x_j)) = r.
  • 19. Augmented Lagrangian Technique. The augmented Lagrangian is a technique for solving minimization problems with equality constraints; it converges faster than the plain penalty method. Original problem: min f(x), subject to g(x) = 0. It can be written as an unconstrained minimization of: L(x, λ, µ) = f(x) − λ g(x) + (1/2µ) g(x)². Since f and the Lagrangian (for any λ) agree on the feasible set g(x) = 0, the basic idea remains the same as that of the Lagrangian: a small value of µ forces the minimizer(s) of L to lie close to the feasible set, and values of x that reduce f are preferred.
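The technique can be sketched in a few lines: alternately minimize L(x, λ, µ) in x, then update the multiplier λ from the remaining constraint violation. A minimal sketch on a hypothetical toy problem (min x² subject to x − 1 = 0, whose constrained minimizer is x = 1); setting λ = 0 throughout and shrinking µ recovers the plain penalty method:

```python
def augmented_lagrangian(f, g, df, dg, x0, mu=0.5, outer=30, inner=500, lr=0.05):
    """Minimize f(x) subject to g(x) = 0 (scalar x, for illustration).

    Uses L(x, lam, mu) = f(x) - lam*g(x) + g(x)**2 / (2*mu), the form
    on the slide, with multiplier update lam <- lam - g(x)/mu.
    """
    x, lam = x0, 0.0
    for _ in range(outer):
        for _ in range(inner):  # inner loop: gradient descent on L in x
            grad = df(x) - lam * dg(x) + g(x) * dg(x) / mu
            x -= lr * grad
        lam -= g(x) / mu        # outer loop: multiplier update
    return x, lam

x, lam = augmented_lagrangian(
    f=lambda x: x * x, g=lambda x: x - 1.0,
    df=lambda x: 2.0 * x, dg=lambda x: 1.0, x0=0.0)
print(round(x, 3))  # 1.0 -> converges to the constrained minimizer
```

Unlike the pure penalty method, convergence here does not require driving µ to zero, which is the source of the faster convergence claimed above.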
  • 20. Modified TSVM using Augmented Lagrangian. Our formulation: min_w f(w), i.e. min_w (λ/2) ||w||² + (1/2l) Σ_{i=1}^{l} loss(y_i w^T x_i), subject to g(w) = 0, i.e. (1/u) Σ_{j=1}^{u} max(0, sign(w^T x_j)) − r = 0. Augmented Lagrangian: min_x L(x, λ, µ) = min_x [f(x) − λ g(x) + (1/2µ) g(x)²].
  • 21. Penalty Method. Augmented Lagrangian: min_x L(x, λ, µ) = min_x [f(x) − λ g(x) + (1/2µ) g(x)²]. Penalty Method: min_x [f(x) + (1/2µ) g(x)²].
  • 22. Supervised SVM (SSVM). w* = argmin_{w ∈ R^d} (λ/2) ||w||² + (1/2) Σ_{i=1}^{l} loss(y_i w^T x_i). SSVM with Threshold Adjustment: obtain w* from SSVM, then adjust the threshold to satisfy the labeling constraint (1/u) Σ_{j=1}^{u} max(0, sign(w^T x_j)) = r.
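The threshold-adjustment step amounts to shifting the bias so that the desired fraction of unlabeled points lands on the positive side, i.e. thresholding the scores w^T x_j at their (1 − r) quantile. A minimal sketch with hypothetical scores:

```python
def adjust_threshold(scores, r):
    """Pick a bias b so that a fraction r of scores satisfy s + b > 0.

    Equivalent to thresholding at the (1 - r) quantile of the scores.
    """
    k = round(len(scores) * r)           # number of points to label positive
    ranked = sorted(scores, reverse=True)
    if k == 0:
        return -ranked[0] - 1e-9         # push everything to the negative side
    # place the threshold halfway between the k-th and (k+1)-th largest scores
    lo = ranked[k] if k < len(ranked) else ranked[-1] - 1.0
    return -(ranked[k - 1] + lo) / 2.0

scores = [2.0, 0.4, -0.1, -0.8, -1.5]    # hypothetical w.x values on unlabeled data
b = adjust_threshold(scores, r=0.4)      # want 2 of the 5 points positive
n_pos = sum(1 for s in scores if s + b > 0)
print(n_pos)  # 2
```

This is cheap (one sort of the unlabeled scores) but only moves the separating plane parallel to itself, which is why it serves as a baseline against the constrained formulations above.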
  • 23. All Methods at a Glance. SVM based: SSVM on labeled data; SSVM on labeled data with threshold adjustment. Methods proposed in this work: Augmented Lagrangian; Penalty Method. TSVM: Deterministic Annealing; Switching.
  • 25. Accuracy vs. # of Labeled Examples (gcat). [figure]
  • 26. Accuracy vs. # of Labeled Examples (aut-avn). [figure]
  • 27. Accuracy vs. Noise in r (gcat). [figure]
  • 28. Accuracy vs. Noise in r (aut-avn). [figure]
  • 30. Some Results. The simple penalty method is the most robust method w.r.t. the estimation of r. TSVM still leads in terms of accuracy. The Augmented Lagrangian is a direction worth investigating due to its faster computation time. Defeating the SSVM is possible only with a reasonably accurate estimate of r. If the labeled dataset does not follow r, then the alternate methods perform better.
  • 31. Future Directions. Establish theoretical bounds for the accuracy of our methods w.r.t. that of TSVM. Look at non-SVM based semi-supervised classifiers (e.g. decision trees) and find a way to express the fractional constraint for them. Can we use something other than the fractional constraint to enforce the low-density criterion?
  • 32. Acknowledgments. I thank the following people for their able guidance and help in this work: S S Keerthi (Yahoo! Labs), M N Murthy (IISc), S Sundararajan (Yahoo! Labs), S Shevade (IISc).
  • 33. References. M S Gockenbach, The Augmented Lagrangian Method for Equality-Constrained Optimization. V Sindhwani, S S Keerthi, Newton Methods for Fast Solution of Semi-supervised Linear SVMs. S S Keerthi, D DeCoste, A Modified Finite Newton Method for Fast Solution of Large Scale Linear SVMs.