Support Vector Machines (SVM)
Presented by:
Eisa Jafari Amirbandi
Eisa.Jafari.Amirbandi@gmail.com
What are Support Vector Machines (SVM)?
SVM is a classifier that belongs to the Kernel Methods branch of machine learning.
SVM was introduced in 1995 by Vapnik:
http://link.springer.com/article/10.1007/BF00994018
Cited by 13557
SVM's fame comes from its success in handwritten character recognition, where it matches carefully tuned neural networks.
Outline:
Introduction (what's SVM?, applications)
Flashback
Reminder
Types of SVMs
Issues
Summary
Conclusion
Example
Applications
 SVM is widely used in object detection and recognition, content-based image retrieval, text recognition, biometrics, speech recognition, etc.
 It is also used for regression (SVR), which will not be covered today.
SVM can be used in any problem where linear or non-linear separators are used for classification.
Flashback (ml)
SVM is a classifier that belongs to the Kernel Methods branch of machine learning.
What is learning?
What is a classifier?
Flashback (ml)
 Supervised classification
perceptron, support vector machine, loss functions, kernels, random forests
 Supervised regression
ridge regression, lasso regression, SVM regression
 Unsupervised learning
graphical models, sequential Monte Carlo, PCA, Gaussian Mixture
Models, probabilistic PCA, hidden Markov models
Reference: Hilary Term 2014, A. Zisserman
Flashback (ml)
1-Regression - supervised
estimate parameters, e.g. of weight vs height
2-Classification - supervised
estimate class, e.g. handwritten digit classification
Reference: Hilary Term 2014, A. Zisserman
Flashback (ml)
3- Unsupervised learning – model the data
clustering
dimensionality reduction
Intuition
Question: which of the separators should we choose?
Question: which one is better?
Which of the separators should we choose?
If there are two classes that are linearly separable from each other, what is the best separator of these two classes?
Various algorithms, including the perceptron, can perform this separation.
Do all of these algorithms handle the task equally well?
Reference image: Wikipedia
Which of the separators should we choose?
Is H2 better, or H3?
Reference image: Wikipedia
Which of the separators should we choose?
Is H2 better, or H3?
The better separation is the one with the larger margin, and a larger margin means a higher confidence factor.
Reference image: Wikipedia
Margin
• The hyperplane with the largest margin has equal
distances to the nearest sample of both classes
• SVM finds the solution with maximum margin
What is a support vector?
The training data points closest to the separating hyperplanes are called support vectors.
[Figure: the maximum-margin separator with its support vectors on the margin boundaries, compared with a narrower-margin alternative.]
Reminder (equation of a line in the plane)
The implicit equation of a line in 2D space: Ax + By + C = 0
Change of notation: w1·x1 + w2·x2 + b = 0
Matrix form: with W = [w1; w2] and X = [x1; x2], this becomes Wᵀ X + b = 0
W is perpendicular to the separating line, and its direction points toward the positive side of the separator.
Reminder (Linear Discriminant Function)
We can generalize from 2D space to n-dimensional space, so we have:
Σ_i w_i x_i + b = 0, i.e. Wᵀ X + b = 0
Reference image: Wikipedia
Types of SVM
 Hard margin linear SVM
 Soft margin linear SVM
 Non-linear SVM
Hard Margin SVM
Suppose the training data are pairs ⟨X_i, Y_i⟩, i = 1, 2, …, n, with X_i ∈ R^d and Y_i ∈ {1, −1}, for which we require:
Wᵀ X_i + b ≥ 1 if Y_i = 1
Wᵀ X_i + b ≤ −1 if Y_i = −1
for i = 1, 2, …, n
Hard Margin SVM
By the definition of SVM we are looking for the separator with the larger margin. For this we need a formula that gives the margin in terms of the model parameters. We also know that the margin is the distance between the two parallel hyperplanes Wᵀ X + b = 1 and Wᵀ X + b = −1 on either side of the separator, so we have:
d = 2 / ‖w‖
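As a quick check of this formula (standard geometry, not spelled out on the slide): the distance from a point x₀ to the hyperplane Wᵀx + b = 0 is |Wᵀx₀ + b| / ‖w‖, so each margin hyperplane Wᵀx + b = ±1 lies at distance 1/‖w‖ from the separator, and
\[
d \;=\; \frac{1}{\lVert w\rVert} + \frac{1}{\lVert w\rVert} \;=\; \frac{2}{\lVert w\rVert}.
\]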
Hard Margin SVM
Problem definition:
In SVM we want to maximize the margin d = 2 / ‖w‖, subject to the condition that the following relations hold:
Wᵀ X_i + b ≥ 1 if Y_i = 1
Wᵀ X_i + b ≤ −1 if Y_i = −1
for i = 1, 2, …, n
Hard Margin SVM
To increase the margin d = 2 / ‖w‖ we must minimize ‖w‖ (the denominator of the fraction), and the constraint of the problem can then be written as:
Y_i (Wᵀ X_i + b) ≥ 1, for i = 1, 2, …, n
Hard Margin SVM
For convenience, and so that we can use linear algebra, instead of minimizing ‖w‖ we minimize the function (1/2)‖w‖². So we have:
Minimize (1/2)‖w‖²
such that Y_i (Wᵀ X_i + b) ≥ 1, for i = 1, 2, …, n
Reminder: Lagrange functions and the KKT conditions
If we want to minimize a function F(x) with respect to x subject to the constraint G(x) ≥ 0, we can use the Lagrangian
L(x, u) = F(x) − u·G(x), with u ≥ 0,
in which we must minimize over x and maximize over u.
The KKT conditions are:
∂L(x, u) / ∂x = 0
u·G(x) = 0
Hard Margin SVM
Minimize (1/2)‖w‖²
such that Y_i (Wᵀ X_i + b) − 1 ≥ 0
By the Lagrange theorem:
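For reference, the standard hard-margin Lagrangian that this step refers to has the form:
\[
L(w, b, \alpha) \;=\; \tfrac{1}{2}\lVert w\rVert^{2} \;-\; \sum_{i=1}^{n} \alpha_i \left[\, Y_i (W^{\top} X_i + b) - 1 \,\right], \qquad \alpha_i \ge 0 .
\]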
Solving the Optimization Problem
α_i [ Y_i (Wᵀ X_i + b) − 1 ] = 0
Only the support vectors (SVs) will have non-zero α_i.
Karush-Kuhn-Tucker (KKT) conditions:
α_i ≥ 0
α_i [ Y_i (Wᵀ X_i + b) − 1 ] = 0
KKT condition: α_i [ Y_i (Wᵀ X_i + b) − 1 ] = 0
[Figure: two classes of points (Class 1, Class 2) and the separating hyperplane; only the support vectors carry non-zero multipliers, α1 = 0.8, α6 = 1.4, α8 = 0.6, and all the other α_i are 0.]
Solving the Optimization Problem
Substituting back into the Lagrangian gives the dual objective Q(α). By the duality theorem, to maximize this function with respect to α_i we can multiply it by −1 and turn the problem into a minimization.
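For reference, the standard form of the hard-margin dual objective Q(α) is:
\[
Q(\alpha) \;=\; \sum_{i=1}^{n} \alpha_i \;-\; \tfrac{1}{2} \sum_{i=1}^{n}\sum_{j=1}^{n} \alpha_i \alpha_j Y_i Y_j X_i^{\top} X_j ,
\qquad \text{maximized subject to } \alpha_i \ge 0 \ \text{ and } \ \sum_{i=1}^{n} \alpha_i Y_i = 0 .
\]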
Solving the Optimization Problem
Our problem has turned into a quadratic program, which can be solved in MATLAB with the quadprog command to obtain the values of a (the multipliers α_i).
To use the quadprog command we must define two matrices:
H, with entries h_ij = y_i y_j x_iᵀ x_j
f = −1 (a vector of −1s)
a = quadprog(H,f)
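For context, quadprog minimizes ½·aᵀHa + fᵀa, and the negated dual has exactly that shape, which is why H and f are chosen this way:
\[
\min_{a}\ \tfrac{1}{2}\, a^{\top} H a + f^{\top} a , \qquad H_{ij} = Y_i Y_j X_i^{\top} X_j , \quad f = -\mathbf{1} .
\]
In a fuller call, the dual constraints Σᵢ aᵢYᵢ = 0 and aᵢ ≥ 0 would be passed through quadprog's Aeq/beq and lb arguments; the quadprog(H,f) call above is a simplified sketch.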
Hard Margin SVM
Once we have obtained the value of a, we can obtain w from the relation below; only the value of b remains, and it must satisfy the second KKT condition.
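For reference, the standard expressions for recovering w and b from the multipliers are:
\[
w \;=\; \sum_{i=1}^{n} \alpha_i Y_i X_i , \qquad b \;=\; Y_k - W^{\top} X_k \ \text{ for any support vector } X_k .
\]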
Hard Margin SVM
KKT condition: α_i [ Y_i (Wᵀ X_i + b) − 1 ] = 0, so for each point either
α_i = 0 (the point is not a support vector, NSV), or
Y_i (Wᵀ X_i + b) − 1 = 0 (the point is a support vector, SV), in which case b = Y_i − Wᵀ X_i.
Only the SVs will have non-zero α_i.
KKT condition: α_i [ Y_i (Wᵀ X_i + b) − 1 ] = 0
[Figure: the same two-class example as before; only the support vectors have non-zero multipliers (α1 = 0.8, α6 = 1.4, α8 = 0.6), all other α_i = 0.]
Summary: Solving the Optimization Problem
Step 1: take the initial dataset {X_i, Y_i}; the goal is to obtain the values of w and b.
Step 2: build the matrices H and f, with h_ij = y_i y_j x_iᵀ x_j, f = −1, H = [h_ij], and use the MATLAB function a = quadprog(H,f) to obtain the value of a.
Step 3: obtain the values of w and b from the formulas below.
Step 4: our linear separator becomes the following relation:
output = sign(Wᵀ X_i + b)
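A minimal MATLAB sketch of these four steps, assuming an n-by-d data matrix X and labels y in {−1, +1}; the equality constraint and lower bound passed to quadprog are the standard dual constraints and go beyond the slide's simplified quadprog(H,f) call:
% Hard-margin SVM training via the dual QP (sketch).
n = size(X,1);
H = (y*y') .* (X*X');                              % h_ij = y_i*y_j*x_i'*x_j
f = -ones(n,1);                                    % so 1/2*a'*H*a + f'*a is the negated dual
a = quadprog(H, f, [], [], y', 0, zeros(n,1), []); % sum_i a_i*y_i = 0 and a_i >= 0 (assumed)
sv = a > 1e-6;                                     % support vectors have non-zero a_i
w  = X' * (a .* y);                                % w = sum_i a_i*y_i*x_i
b  = mean(y(sv) - X(sv,:)*w);                      % b = y_k - w'*x_k, averaged over the SVs
output = sign(X*w + b);                            % output = sign(W'X + b)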
The Quadratic Programming Problem
Soft Margin SVM
 What if the data are not linearly separable? (noisy data, outliers, etc.)
 Slack variables ξ_i can be added to allow misclassification of difficult or noisy data points.
y_i (⟨w, x_i⟩ + b) ≥ 1 − ξ_i,  ξ_i ≥ 0
Soft Margin SVM
By introducing the variables ξ_i, i = 1, 2, …, N, the previous constraints are relaxed, and the relation y_i (⟨w, x_i⟩ + b) ≥ 1 changes to:
y_i (⟨w, x_i⟩ + b) ≥ 1 − ξ_i,  ξ_i ≥ 0
In the ideal case, all of these variables ξ_i should be zero.
Soft Margin SVM
Our hard-margin problem changes so that we now minimize the function below, such that the relaxed constraints hold.
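For reference, the standard soft-margin primal that this minimization corresponds to is:
\[
\min_{w,\,b,\,\xi}\ \tfrac{1}{2}\lVert w\rVert^{2} + C \sum_{i=1}^{n} \xi_i
\qquad \text{such that} \qquad Y_i (W^{\top} X_i + b) \ge 1 - \xi_i ,\ \ \xi_i \ge 0 .
\]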
Soft Margin SVM
C trades off margin width against misclassifications.
SVM tries to keep the slack variables ξ_i at zero while maximizing the margin.
The parameter C can be viewed as a way to control over-fitting.
Soft Margin SVM
Subject to the relaxed constraints above. By the Lagrange theorem:
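For reference, the standard soft-margin Lagrangian is:
\[
L(w,b,\xi,\alpha,\beta) \;=\; \tfrac{1}{2}\lVert w\rVert^{2} + C\sum_{i}\xi_i
\;-\; \sum_{i} \alpha_i \left[\, Y_i (W^{\top} X_i + b) - 1 + \xi_i \,\right]
\;-\; \sum_{i} \beta_i \xi_i , \qquad \alpha_i \ge 0,\ \beta_i \ge 0 .
\]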
Soft Margin SVM
KKT conditions (from the Lagrange theorem):
Soft Margin SVM
KKT conditions:
0 ≤ α_i ≤ C
0 ≤ β_i ≤ C
Soft Margin SVM
Substituting back into the Lagrangian, simplifying, and using the dual theorem (as in the hard-margin case), we obtain the same quadratic program that was described in the hard-margin section, and the value of a is again obtained with MATLAB's quadprog command.
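The only change relative to the hard-margin dual (a standard fact, stated here because the slide's equations are not reproduced) is that the multipliers become box-constrained:
\[
0 \le \alpha_i \le C \qquad \text{instead of} \qquad \alpha_i \ge 0 .
\]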
Soft Margin SVM
Given the value of a, the value of w is obtained from the relation below.
Soft Margin SVM
To obtain b we examine the last KKT condition. From the conditions above we have:
1)
(C − α_i) ξ_i = 0 — not a support vector
Soft Margin SVM
2)
3)
(C − α_i) ξ_i = 0, where U is the set of unbounded support vectors
Summary: Soft Margin SVM
Step 1: take the initial dataset {X_i, Y_i} and choose a value for the parameter C; the goal is to obtain the values of w and b.
Step 2: build the matrices H and f, with h_ij = y_i y_j x_iᵀ x_j and f = −1, and use the MATLAB function a = quadprog(H,f,1,c) to obtain the value of a.
Step 3: obtain the values of w and b from the formulas below.
Step 4: our linear separator becomes the following relation:
output = sign(Wᵀ X_i + b)
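A MATLAB sketch of the soft-margin version of these steps; the box constraints 0 ≤ a_i ≤ C and the equality constraint are the standard way to express the dual in quadprog and go beyond the slide's shorthand quadprog(H,f,1,c) call:
% Soft-margin SVM: same dual QP, but with 0 <= a_i <= C.
C = 1;                                      % example value for the regularization parameter
n = size(X,1);
H = (y*y') .* (X*X');
f = -ones(n,1);
a = quadprog(H, f, [], [], y', 0, zeros(n,1), C*ones(n,1));
w = X' * (a .* y);
U = (a > 1e-6) & (a < C - 1e-6);            % unbounded support vectors (the set U)
b = mean(y(U) - X(U,:)*w);                  % b averaged over U
output = sign(X*w + b);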
Comparison of soft margin and hard margin
 Soft margin is more robust to outliers.
 Hard margin does not require guessing the cost parameter (it requires no parameters at all).
Nonlinear SVMs
Nonlinear SVMs
When the data are not linearly separable, the main idea is to map the samples into a higher-dimensional feature space in which they can be separated linearly.
Nonlinear SVMs
The mapping requires applying a kernel function.
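A small MATLAB sketch of how a kernel (Gram) matrix could replace the plain inner products x_iᵀx_j used so far; the RBF width sigma and the polynomial order shown are illustrative assumptions, not values taken from the slides:
% Gram matrix K(i,j) = k(x_i, x_j) replaces x_i'*x_j in H.
n = size(X,1);
sq = sum(X.^2, 2);                                 % squared norms of the rows of X
D2 = sq*ones(1,n) + ones(n,1)*sq' - 2*(X*X');      % pairwise squared distances
sigma = 1;                                         % assumed RBF width
K = exp(-D2 / (2*sigma^2));                        % RBF kernel
% K = (1 + X*X').^3;                               % alternative: polynomial kernel of order 3
H = (y*y') .* K;                                   % used in the same quadprog problem as before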
Nonlinear SVMs
Introducing the kernels
 Linear Kernels
 Polynomial Kernels
 Radial Basis Function Kernels
 Three-Layer Neural Network Kernels
 Normalizing Kernels
Nonlinear SVMs (kernel mode)
Original function: G(x) = wᵀ x + b
Making it nonlinear: G(x) = wᵀ φ(x) + b, still solved as a quadratic program.
Nonlinear SVMs
Original function: G(X) = wᵀ X + b
Making it nonlinear: G(X) = wᵀ φ(X) + b
Final function (the output value for a sample X):
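For reference, the standard kernel form of this final decision function is:
\[
G(x) \;=\; \sum_{i \in SV} \alpha_i Y_i \, K(X_i, x) \;+\; b , \qquad \text{output} = \operatorname{sign}\!\big(G(x)\big) .
\]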
Summary: Nonlinear SVMs
Step 1: take the initial dataset {X_i, Y_i} and choose a value for the parameter C; the goal is to obtain the values of b and a.
Step 2: build the matrices H and f, with h_ij = y_i y_j K(x_i, x_j) (the kernel now replaces the inner product x_iᵀ x_j) and f = −1, and use the MATLAB function a = quadprog(H,f,1,c) to obtain the value of a.
Step 4: our separator becomes the following relation:
C is a regularization parameter:
a small C allows the constraints to be easily ignored;
a large C makes the constraints hard to ignore.
RBF Kernel SVM Example
The data is not linearly separable in the original feature space.
http://www.robots.ox.ac.uk/~az/lectures/ml/
RBF Kernel SVM Example — examining the effect of varying sigma
Multi-Class Classification: build from binary classifiers
Some Issues (Choice of kernel)
 A Gaussian or polynomial kernel is the default.
 If that is ineffective, more elaborate kernels are needed.
 Domain experts can assist in formulating an appropriate similarity measure.
Some Issues (Optimization criterion)
 Hard margin vs. soft margin
 Choosing and tuning them typically takes a lengthy series of experiments in which various parameters are tested.
The SVM algorithm
1) Choose a kernel function.
2) Specify the value of C.
3) Solve the quadratic programming problem (many software packages are available).
4) Construct the SVM decision function from the obtained parameters.
Summary: Support Vector Machine
 Soft Margin Classifier: better generalization ability and less over-fitting.
 The Kernel Trick: map data points to a higher-dimensional space in order to make them linearly separable; since only the dot product is used, we do not need to represent the mapping explicitly.
Strengths
 No local minima
 Robustness to outliers
 Training is relatively easy
 Good generalization in theory and practice
 Work well with few training instances
 Finds the globally best model (no local optima), unlike neural networks
 It scales relatively well to high dimensional data
 Tradeoff between classifier complexity and error can be
controlled explicitly
 Non-traditional data like strings and trees can be used as
input to SVM, instead of feature vectors
 Notice: SVM does not minimize the number of misclassifications (an NP-complete problem) but the sum of distances from the margin hyperplanes.
Weakness
 Selection of parameters (need to choose a “good” kernel function)
 Extension to multiclass problems
Reference: Advances in Pattern Recognition, Chapter 2, Section 2.5, “Advantages and Disadvantages”, Shigeo Abe
SVMs vs. Neural Networks
 The kernel maps to a high-dimensional space; hidden layers map to lower-dimensional spaces.
 The SVM search space has a unique minimum; the neural-network search space has multiple local minima.
 SVM training is extremely efficient; neural-network training is expensive.
 Classification is extremely efficient for both.
 SVM has two parameters to select (the kernel and the cost); a neural network requires choosing the number of hidden units and layers.
 Both give very good accuracy in typical domains.
 SVMs are extremely robust; neural networks could be robust.
Reference: Advances in Pattern Recognition, Chapter 2, Section 2.6, “Characteristics of Solutions”, Shigeo Abe
Conclusion
SVM finds the optimal linear separator.
Unlike other algorithms, which minimize the modeling error, SVM takes the structural risk as its objective function.
It selects the hyperplane that has the largest margin.
Kernels make the SVM algorithm nonlinear.
When used properly, the SVM algorithm has good generalization power.
Despite high dimensionality, it avoids overfitting; this property comes from the algorithm's optimization.
Data compression: it uses the support vectors instead of all the training data.
Reference: Advances in Pattern Recognition, Chapter 2, Section 2.6, “Characteristics of Solutions”, Shigeo Abe
SVM code
 http://www.kernel-machines.org/software.html
 http://www.csie.ntu.edu.tw/~cjlin/libsvm
 http://svmlight.joachims.org/
 A Practical Guide to Support Vector Classification (Chih-Wei Hsu, Chih-Chung Chang, and Chih-Jen Lin)
MATLAB
svmtrain is used to build the SVM structure:
 Syntax
SVMStruct = svmtrain(Training,Group)
SVMStruct = svmtrain(Training,Group,Name,Value)
svmclassify is used to classify test samples:
 Syntax
Group = svmclassify(SVMStruct,Sample)
Group = svmclassify(SVMStruct,Sample,'Showplot',true)
MATLAB
 svmtrain(Training,Group, 'kernel_function',Value)
 Value:
 'linear' — Linear kernel, meaning dot product.
 'quadratic' — Quadratic kernel.
 'polynomial' — Polynomial kernel (default order 3). Specify
another order with the polyorder name-value pair.
 'rbf' — Gaussian Radial Basis Function kernel with a default
scaling factor, sigma, of 1. Specify another value for sigma
with the rbf_sigma name-value pair.
 'mlp' — Multilayer Perceptron kernel with default scale [1 –1].
Specify another scale with the mlp_params name-value pair.
 Default: 'linear'
Example 1 (fisheriris):
 load fisheriris
 xdata = meas(51:end,3:4);
 group = species(51:end);
 svmStruct = svmtrain(xdata,group,'ShowPlot',true);
Example 1: classifying a test sample
 species = svmclassify(svmStruct,[5 2],'ShowPlot',true)
 hold on;
 plot(5,2,'ro','MarkerSize',12);
 hold off
References
 Advances in Pattern Recognition (Support Vector Machines for Pattern Classification), Professor Dr Shigeo Abe
 Slides by A. Zisserman
 The MATLAB Workshop 2 file from www.david-lindsay.co.uk
 The video lectures of Andrew Ng's machine learning course
The End