Support Vector Machines and
Associative Classifiers
SVM—Support Vector Machines
 A new classification method for both linear and nonlinear data
 It uses a nonlinear mapping to transform the original training data
into a higher dimension
 With the new dimension, it searches for the linear optimal separating
hyperplane (i.e., “decision boundary”)
 With an appropriate nonlinear mapping to a sufficiently high
dimension, data from two classes can always be separated by a
hyperplane
 SVM finds this hyperplane using support vectors (“essential” training
tuples) and margins (defined by the support vectors)
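A minimal sketch of this workflow, assuming scikit-learn is available (SVC, fit, support_vectors_, and predict are its real API; the six toy tuples are invented for illustration):

```python
# Fit a linear SVM on a toy two-class dataset and inspect the support
# vectors ("essential" training tuples) that define the margin.
import numpy as np
from sklearn import svm

X = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 3.0],   # class +1 tuples
              [2.0, 0.5], [3.0, 1.0], [4.0, 1.5]])  # class -1 tuples
y = np.array([1, 1, 1, -1, -1, -1])

clf = svm.SVC(kernel="linear", C=1e3)  # large C approximates a hard margin
clf.fit(X, y)

print("support vectors:\n", clf.support_vectors_)
print("prediction for (2.5, 2.0):", clf.predict([[2.5, 2.0]]))
```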
SVM—History and Applications
 Vapnik and colleagues (1992)—groundwork from Vapnik &
Chervonenkis’ statistical learning theory in 1960s
 Features: training can be slow but accuracy is high owing to
their ability to model complex nonlinear decision boundaries
(margin maximization)
 Used for both classification and prediction
 Applications:
 handwritten digit recognition, object recognition, speaker
identification, benchmarking time-series prediction tests
SVM—General Philosophy
 Data set D = {(X1, y1), (X2, y2), …, (Xn, yn)}
 Xi – Training tuples
 yi – Class labels
 Linearly Separable
 Best separating line – minimum
classification error
 Plane / Hyperplane
 Maximal Margin Hyperplane (MMH)
 Larger margin – more accurate at classifying
future samples
 Gives largest separation among classes
 Sides of margin are parallel to hyperplane
 Margin: distance from the MMH to the closest training tuple
of either class
SVM—General Philosophy
[Figure: two separating hyperplanes and their support vectors; one gives a small margin, the other a large margin]
SVM—Linearly Separable
 A separating hyperplane can be written as
W · X + b = 0
where W={w1, w2, …, wn} is a weight vector and b a scalar (bias)
 For 2-D it can be written as
w0 + w1 x1 + w2 x2 = 0
 The hyperplane defining the sides of the margin:
H1: w0 + w1 x1 + w2 x2 ≥ 1 for yi = +1, and
H2: w0 + w1 x1 + w2 x2 ≤ – 1 for yi = –1
 Any training tuples that fall on hyperplanes H1 or H2 (i.e., the sides defining the
margin) are Support vectors
 Size of the maximal margin is 2 / ||W|| (illustrated in the sketch below)
 ||W|| – Euclidean norm of W = (w1² + w2² + … + wn²)^(1/2)
 This becomes a constrained (convex) quadratic optimization problem
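As a concrete illustration of the optimization result (a sketch using the same toy data as above; for a linear kernel, scikit-learn's coef_ and intercept_ expose W and b):

```python
# Recover W, b, and the maximal margin 2/||W|| from a fitted linear SVC,
# and verify that the support vectors lie on the margin sides H1 and H2.
import numpy as np
from sklearn import svm

X = np.array([[1, 2], [2, 3], [3, 3], [2, 0.5], [3, 1], [4, 1.5]], dtype=float)
y = np.array([1, 1, 1, -1, -1, -1])

clf = svm.SVC(kernel="linear", C=1e3).fit(X, y)
W = clf.coef_[0]       # weight vector {w1, ..., wn}
b = clf.intercept_[0]  # bias

print("W =", W, " b =", b)
print("margin 2/||W|| =", 2.0 / np.linalg.norm(W))

# Support vectors satisfy W·X + b ≈ ±1, i.e. they fall on H1 or H2.
for sv in clf.support_vectors_:
    print(sv, "->", W @ sv + b)
```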
High Dimensional Data
 The complexity of the trained classifier is characterized by the number of support
vectors rather than by the dimensionality of the data
 The support vectors are the essential or critical training examples; they lie closest
to the decision boundary (MMH)
 If all other training examples are removed and the training is repeated, the same
separating hyperplane would be found
 The number of support vectors found can be used to compute an (upper) bound on the
expected error rate of the SVM classifier, which is independent of the data
dimensionality
 Thus, an SVM with a small number of support vectors can have good generalization,
even when the dimensionality of the data is high
 Classification using SVM
 Maximal Margin Hyperplane: d(XT) = ∑i=1..l yi αi (Xi · XT) + b0
 Xi – support vectors (l of them), XT – test tuple
 Class depends on the sign of the result
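This decision function can be evaluated by hand from a fitted model. A sketch (scikit-learn's dual_coef_ stores the products yi·αi for each support vector; the test point is invented):

```python
# Compute d(X) = Σ_i yi·αi·(Xi · X) + b0 directly and check it against
# the library's decision_function; the predicted class is sign(d).
import numpy as np
from sklearn import svm

X = np.array([[1, 2], [2, 3], [3, 3], [2, 0.5], [3, 1], [4, 1.5]], dtype=float)
y = np.array([1, 1, 1, -1, -1, -1])
clf = svm.SVC(kernel="linear", C=1e3).fit(X, y)

x_test = np.array([2.5, 2.0])
d = clf.dual_coef_[0] @ (clf.support_vectors_ @ x_test) + clf.intercept_[0]

print("by hand :", d)
print("sklearn :", clf.decision_function([x_test])[0])
print("class   :", 1 if d > 0 else -1)  # class depends on the sign
```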
SVM—Linearly Inseparable
 Transform the original input data into a higher dimensional space
 Search for a linear separating hyperplane in the new space
 Example: a 3-D vector X = (x1, x2, x3) is mapped into a 6-D space using φ1(X) = x1, φ2(X) = x2,
φ3(X) = x3, φ4(X) = x1², φ5(X) = x1x2, φ6(X) = x1x3
 Decision hyperplane in the new space: d(Z) = w1x1 + w2x2 + w3x3 + w4x1² + w5x1x2 + w6x1x3 + b,
i.e., d(Z) = w1z1 + w2z2 + w3z3 + w4z4 + w5z5 + w6z6 + b, which is linear in Z
 Issues: choosing the nonlinear mapping, and the expensive computation of dot products Xi · XT
in the higher-dimensional space (see the sketch below)
[Figure: linearly inseparable training data plotted on attribute axes A1 and A2]
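A small sketch of this explicit lifting (the map φ follows the slide; the random data and the label rule are invented so that the classes are nonlinear in X but linear in Z):

```python
# Lift 3-D inputs with φ(X) = (x1, x2, x3, x1², x1x2, x1x3) and fit a
# *linear* separator in the 6-D Z-space.
import numpy as np
from sklearn.svm import LinearSVC

def phi(X):
    x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([x1, x2, x3, x1**2, x1*x2, x1*x3])

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = np.where(X[:, 0]**2 + X[:, 0]*X[:, 1] > 0.5, 1, -1)  # nonlinear in X, linear in Z

lin = LinearSVC(C=10.0, max_iter=20000).fit(phi(X), y)
print("training accuracy in Z-space:", lin.score(phi(X), y))
```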
SVM—Kernel functions
 Instead of computing the dot product on the transformed data tuples, it is
mathematically equivalent to apply a kernel function K(Xi, Xj) to the original
data, i.e., K(Xi, Xj) = Φ(Xi) · Φ(Xj)
 Typical kernel functions
 Polynomial kernel of degree h: K(Xi, Xj) = (Xi · Xj + 1)^h
 Sigmoid kernel: K(Xi, Xj) = tanh(κ Xi · Xj − δ)
 Gaussian radial basis function (RBF) kernel: K(Xi, Xj) = e^(−||Xi − Xj||² / 2σ²)
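The identity K(Xi, Xj) = Φ(Xi) · Φ(Xj) can be checked numerically. For the degree-2 polynomial kernel an explicit 6-D map is known; a sketch in plain NumPy (the two test points are arbitrary):

```python
import numpy as np

def K_poly2(x, y):
    # Degree-2 polynomial kernel evaluated on the *original* 2-D data.
    return (x @ y + 1.0) ** 2

def phi(x):
    # Explicit 2-D -> 6-D feature map whose dot product reproduces K_poly2.
    x1, x2 = x
    s = np.sqrt(2.0)
    return np.array([1.0, s * x1, s * x2, x1**2, s * x1 * x2, x2**2])

x, y = np.array([0.3, -1.2]), np.array([2.0, 0.7])
print(K_poly2(x, y))    # kernel on the original tuples
print(phi(x) @ phi(y))  # dot product in feature space: identical value
```

(As an aside, scikit-learn writes the RBF kernel as e^(−γ||Xi − Xj||²), so the σ above corresponds to γ = 1/(2σ²).)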
 SVM can also be used for classifying multiple (> 2) classes and for
regression analysis (with additional user parameters)
SVM vs. Neural Network
 SVM
 Relatively new concept
 Deterministic algorithm
 Nice Generalization
properties
 Hard to learn – learned in
batch mode using quadratic
programming techniques
 Using kernels can learn very
complex functions
 Neural Network
 Relatively old
 Nondeterministic algorithm
 Generalizes well but doesn’t
have strong mathematical
foundation
 Can easily be learned in
incremental fashion
 To learn complex functions—
use multilayer perceptron
Associative Classification
 Associative classification
 Association rules are generated and analyzed for use in classification
 Search for strong associations between frequent patterns (conjunctions of attribute-
value pairs) and class labels
 Classification: based on evaluating a set of rules of the form
p1 ∧ p2 ∧ … ∧ pl → “Aclass = C” (conf, sup)
 Effectiveness
 It explores highly confident associations among multiple attributes
 It overcomes some constraints of decision-tree induction, which considers only
one attribute at a time
 Associative classification has been found to be more accurate than some traditional
classification methods, such as C4.5
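A toy sketch of the rule-mining step described above (not CBA or CMAR themselves; the records, attribute names, and thresholds are invented) enumerates small cond-sets and keeps rules that pass support and confidence thresholds:

```python
# Mine one- and two-attribute class association rules "cond-set -> class".
from itertools import combinations
from collections import Counter

data = [  # (attribute-value pairs, class label)
    ({"outlook=sunny", "windy=no"},  "play"),
    ({"outlook=sunny", "windy=yes"}, "stop"),
    ({"outlook=rain",  "windy=no"},  "play"),
    ({"outlook=rain",  "windy=yes"}, "stop"),
    ({"outlook=sunny", "windy=no"},  "play"),
]
MIN_SUP, MIN_CONF = 0.2, 0.8
n = len(data)

items = sorted({it for row, _ in data for it in row})
for size in (1, 2):
    for cond in map(set, combinations(items, size)):
        covered = [label for row, label in data if cond <= row]
        if len(covered) / n < MIN_SUP:   # cond-set itself is infrequent
            continue
        for label, cnt in Counter(covered).items():
            sup, conf = cnt / n, cnt / len(covered)
            if sup >= MIN_SUP and conf >= MIN_CONF:
                print(f"{' ^ '.join(sorted(cond))} -> class={label} "
                      f"(conf={conf:.2f}, sup={sup:.2f})")
```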
Typical Associative Classification Methods
 CBA (Classification By Association)
 Mine possible rules in the form of
 Cond-set (a set of attribute-value pairs) → class label
 Iterative Approach similar to Apriori
 Build classifier: Organize rules according to decreasing precedence based on
confidence and then support
 CMAR (Classification based on Multiple Association Rules)
 Uses variant of FP-Growth
 Mined rules are stored and retrieved using tree structure
 Rule Pruning is carried out – Specialized rules with low confidence are pruned
 Classification: Statistical analysis on multiple rules
 CPAR (Classification based on Predictive Association Rules)
 Generation of predictive rules (FOIL-like analysis)
 High efficiency, accuracy similar to CMAR
 Uses best-k rules of each group to predict class label
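A minimal sketch of the CBA-style "build classifier" step (the rules are hypothetical, e.g. the output of a miner like the sketch above): order rules by decreasing confidence and then support, and classify with the first rule that covers the tuple:

```python
# Rank rules by (confidence, support) precedence and classify by first match.
rules = [  # (cond-set, class label, confidence, support)
    ({"outlook=sunny", "windy=no"}, "play", 1.00, 0.40),
    ({"windy=yes"},                 "stop", 1.00, 0.40),
    ({"outlook=rain"},              "stop", 0.50, 0.20),
]
rules.sort(key=lambda r: (-r[2], -r[3]))  # decreasing precedence

def classify(tuple_items, default="play"):
    for cond, label, _conf, _sup in rules:
        if cond <= tuple_items:  # the rule's cond-set covers the tuple
            return label
    return default               # no rule fires: fall back to a default class

print(classify({"outlook=sunny", "windy=no"}))   # -> play
print(classify({"outlook=rain",  "windy=yes"}))  # -> stop
```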