4. Statistical Learning
Supervised Learning: we have both features and responses.
• Algorithm: Support Vector Machine (SVM)
Unsupervised Learning: we have only features, no responses.
• Algorithm: Principal Component Analysis (PCA)
7. Support Vector Classifier
The band in the figure is M units away from the hyperplane on either side, and hence 2M units wide. It is called the
margin.
Separable Case
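A minimal sketch of the separable case, assuming scikit-learn and synthetic two-class data (neither is named in the notes). A very large C approximates the hard-margin classifier, and the half-width M of the band is recovered as 1/||w||:

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Two well-separated 2-D point clouds (illustrative data, not from the notes)
X = np.vstack([rng.normal(loc=-2.0, size=(20, 2)),
               rng.normal(loc=+2.0, size=(20, 2))])
y = np.array([0] * 20 + [1] * 20)

clf = SVC(kernel="linear", C=1e6).fit(X, y)  # huge C ~ hard margin

w = clf.coef_[0]
M = 1.0 / np.linalg.norm(w)   # distance from the hyperplane to the band edge
print("margin width 2M =", 2 * M)
print("number of support vectors:", len(clf.support_vectors_))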
12. We plug two-dimensional data into the kernel; the kernel implicitly maps it to six-dimensional data, and we run a simple SVC in the six-dimensional space.
13. A linear problem in the higher dimension = a non-linear problem in the lower dimension: the linear boundary found in the mapped space corresponds to a non-linear boundary in the original space.
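The count of six dimensions in slide 12 matches the degree-2 polynomial kernel k(x, z) = (1 + x·z)^2 on 2-D inputs; that specific kernel is an assumption consistent with the notes. A sketch of its explicit feature map:

import numpy as np

def phi(x):
    """Explicit 6-D feature map for k(x, z) = (1 + <x, z>)^2, x in R^2."""
    x1, x2 = x
    return np.array([1.0,
                     np.sqrt(2) * x1, np.sqrt(2) * x2,
                     x1**2, x2**2,
                     np.sqrt(2) * x1 * x2])

def k(x, z):
    return (1.0 + x @ z) ** 2

x = np.array([1.0, 2.0])
z = np.array([3.0, -1.0])
# The kernel computes the 6-D inner product without ever forming phi:
assert np.isclose(k(x, z), phi(x) @ phi(z))
print(k(x, z), phi(x) @ phi(z))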
14. Principal Component Analysis
We will always assume that the data are centered.
• Sample-covariance matrix: dim(C) = p x p
• A single data point: dim(x) = p x 1
• Eigenvector of C: dim(u) = p x 1, with p eigenvectors in total
• If we take the top k eigenvectors, stacked as rows of U_k: dim(U_k) = k x p
• Projection y = U_k x: dim(y) = k x 1
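A minimal NumPy sketch of these shapes (the values of n, p, k below are illustrative; the notes fix no numbers):

import numpy as np

rng = np.random.default_rng(0)
n, p, k = 100, 5, 2
X = rng.normal(size=(n, p))
X = X - X.mean(axis=0)                 # center the data, as assumed above

C = (X.T @ X) / n                      # sample covariance, dim(C) = p x p
eigvals, eigvecs = np.linalg.eigh(C)   # eigh: C is symmetric
order = np.argsort(eigvals)[::-1]      # sort eigenvalues descending
Uk = eigvecs[:, order[:k]].T           # top-k eigenvectors as rows, k x p

x = X[0]                               # one sample, dim(x) = p x 1
y = Uk @ x                             # projection, dim(y) = k x 1
print(C.shape, Uk.shape, y.shape)      # (5, 5) (2, 5) (2,)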
17. Kernel Principal Component Analysis
If PCA applied directly to some data xi does not perform well, we can map xi to a higher dimension through a non-linear transformation; KPCA performs PCA in that space implicitly, through a proper choice of kernel, and can then reduce the dimension.
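A sketch of that idea, assuming scikit-learn's KernelPCA and the classic concentric-circles example (both are my choices, not from the notes): the two classes overlap along every linear PCA direction but separate along the first kernel component.

import numpy as np
from sklearn.datasets import make_circles
from sklearn.decomposition import PCA, KernelPCA

X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

lin = PCA(n_components=1).fit_transform(X)
kpc = KernelPCA(n_components=1, kernel="rbf", gamma=10).fit_transform(X)

# Compare class-conditional means: nearly identical under linear PCA,
# clearly separated under kernel PCA.
for name, Z in [("PCA", lin), ("KPCA", kpc)]:
    print(name, Z[y == 0].mean(), Z[y == 1].mean())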
19. Results from:
Kernel Principal Component Analysis and its Applications in Face Recognition and Active Shape Models