Support Vector Machine (SVM)
• Machine Learning Algorithm
• Prepared for Lecture
Learning Objectives
• Understand the SVM concept
• Learn hyperplane & margin
• Understand kernel functions
• Kernel selection
Introduction to SVM
• Supervised learning algorithm
• Used for classification & regression
• Finds the optimal separating hyperplane
Why SVM?
• Works well in high dimensions
• Relatively robust to overfitting
• Decision depends only on support vectors
SVM Classification
• Separates data using a hyperplane
• Maximizes the margin
• Support vectors define the boundary
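To make the idea concrete, here is a minimal linear SVM trained by sub-gradient descent on the hinge loss. This is an illustrative sketch on synthetic data, not the quadratic-programming solver used by production SVM libraries; the learning rate and regularization values are assumptions chosen for the demo.

```python
import numpy as np

def train_linear_svm(X, y, lr=0.01, lam=0.01, epochs=200):
    """X: (n, d) features; y: labels in {-1, +1}.
    Sub-gradient descent on the regularized hinge loss (illustrative)."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        for i in range(n):
            margin = y[i] * (X[i] @ w + b)
            if margin < 1:                    # inside margin: hinge loss is active
                w -= lr * (lam * w - y[i] * X[i])
                b += lr * y[i]
            else:                             # correctly classified with margin
                w -= lr * lam * w
    return w, b

# Two well-separated synthetic clusters
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)
w, b = train_linear_svm(X, y)
preds = np.sign(X @ w + b)
print("training accuracy:", np.mean(preds == y))
```

On clearly separable data like this, the learned hyperplane classifies the training set essentially perfectly.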
Hyperplane
• Decision boundary
• Line (2D), plane (3D)
• w·x + b = 0
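The equation w·x + b = 0 turns directly into a classification rule: the sign of w·x + b tells you which side of the hyperplane a point lies on. The weight vector and bias below are illustrative values, not learned from data.

```python
import numpy as np

w = np.array([1.0, -1.0])   # illustrative weight vector (assumption)
b = 0.5                     # illustrative bias (assumption)

def classify(x):
    # +1 on one side of the hyperplane w·x + b = 0, -1 on the other
    return int(np.sign(w @ x + b))

print(classify(np.array([3.0, 1.0])))   # w·x + b = 3 - 1 + 0.5 = 2.5 → +1
print(classify(np.array([0.0, 4.0])))   # w·x + b = 0 - 4 + 0.5 = -3.5 → -1
```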
Margin
• Distance between the hyperplane and the closest points
• Larger margin = better generalization
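For a canonical hyperplane, scaled so that the support vectors satisfy |w·x + b| = 1, the margin width is 2/‖w‖. Shrinking ‖w‖ widens the margin, which is exactly what SVM training maximizes.

```python
import numpy as np

def margin_width(w):
    # Margin of a canonical hyperplane: support vectors satisfy |w·x + b| = 1,
    # so the full width between the two margin boundaries is 2 / ||w||.
    return 2.0 / np.linalg.norm(w)

print(margin_width(np.array([3.0, 4.0])))   # ||w|| = 5 → margin 0.4
```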
Optimal Separating Hyperplane
• Hyperplane with maximum margin
• Depends only on support vectors
Non-Linearly Separable Data
• Real-world data is often non-linear
• A linear boundary fails
• Use kernel functions
Kernel Trick
• Implicitly maps data to higher dimensions
• Enables linear separation there
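The trick is that a kernel computes an inner product in the higher-dimensional space without ever constructing that space. For the degree-2 polynomial kernel k(x, z) = (x·z)² on 2D inputs, the explicit feature map is φ(x) = (x₁², √2·x₁x₂, x₂²), and the two computations agree exactly:

```python
import numpy as np

def phi(x):
    # Explicit degree-2 feature map for 2D input
    return np.array([x[0]**2, np.sqrt(2) * x[0] * x[1], x[1]**2])

def poly_kernel(x, z):
    # Same inner product, computed in the original 2D space
    return (x @ z) ** 2

x = np.array([1.0, 2.0])
z = np.array([3.0, 4.0])
print(phi(x) @ phi(z))      # explicit 3-D inner product → 121.0
print(poly_kernel(x, z))    # kernel value, no feature map needed → 121.0
```

For kernels like the RBF, the corresponding feature space is infinite-dimensional, so the kernel shortcut is not just convenient but essential.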
Types of Kernels
• Linear
• Polynomial
• RBF (Gaussian)
• Sigmoid
Linear Kernel
• Simple & fast
• Common for text classification
Polynomial Kernel
• Handles curved boundaries
• Used in image processing
RBF Kernel
• Most popular default choice
• Handles complex data patterns
Kernel Selection
• Linear: simple data
• Polynomial: moderate complexity
• RBF: complex data
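In practice, kernel selection is often done empirically: compare kernels by cross-validated accuracy. The sketch below assumes scikit-learn is available; the concentric-circles dataset and its parameters are illustrative. Because the circles are not linearly separable, the RBF kernel should clearly beat the linear one here.

```python
from sklearn.datasets import make_circles
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Concentric circles: a classic non-linearly-separable dataset
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

scores = {}
for kernel in ("linear", "poly", "rbf"):
    clf = SVC(kernel=kernel, gamma="scale")
    scores[kernel] = cross_val_score(clf, X, y, cv=5).mean()

for kernel, acc in scores.items():
    print(f"{kernel}: {acc:.3f}")
```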
Advantages of SVM
• High accuracy
• Works with small datasets
• Effective in high dimensions
Limitations of SVM
• Computationally expensive on large datasets
• Kernel and hyperparameter tuning required
Applications of SVM
• Spam detection
• Image recognition
• Fraud detection
• Medical diagnosis
Thank You
• Questions & Discussion
