SUPERVISED LEARNING
Presented By
Assistant Professor
AI&DS
Rathinam Technical Campus
kowsysara.soc@rathinam.in
Supervised Learning
Labeled Data: Supervised learning uses labeled datasets to train models.
Prediction & Classification: It helps in tasks like regression (predicting values) and classification
(categorizing data).
Supervised Learning
Definition:
Supervised Learning is a type of machine learning where an algorithm is trained on labeled data,
meaning each input is paired with the correct output. The model learns patterns from this data to
make predictions or classifications on new, unseen data.
Types of Supervised Learning in Machine Learning
Regression Algorithms (For Predicting
Numbers)
Linear Regression – Finds a straight-line relationship.
Eg: Predicts house prices based on size.
Polynomial Regression – Fits curves instead of straight lines.
Eg: Predicts car mileage based on engine size.
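As a minimal sketch (not from the slides), both can be fitted with scikit-learn; the house-price and y = x² numbers below are made up purely for illustration:

```python
# Sketch: linear vs. polynomial regression with scikit-learn (illustrative data).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Linear regression: house price (in $1000s) vs. size (in 100 sq ft)
size = np.array([[10], [15], [20], [25], [30]])
price = np.array([200, 280, 360, 440, 520])  # follows price = 16*size + 40
lin = LinearRegression().fit(size, price)
pred_linear = lin.predict([[22]])[0]

# Polynomial regression: expand features to degree 2, then fit a line in them
x = np.array([[1], [2], [3], [4], [5]])
y = np.array([1, 4, 9, 16, 25])              # y = x^2, a curve a line cannot fit
poly = PolynomialFeatures(degree=2)
quad = LinearRegression().fit(poly.fit_transform(x), y)
pred_quad = quad.predict(poly.transform([[6]]))[0]
```

The key idea: polynomial regression is still a linear model, just fitted on expanded features (1, x, x²).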
Regression Algorithms (For Predicting
Numbers)
Ridge Regression – Prevents overfitting using regularization.
Eg: Predicting house prices where all features (size, location, no. of rooms)
have some impact.
Lasso Regression – Shrinks the coefficients of less important features to zero.
Eg: In medical diagnosis, Lasso can select only the most relevant
symptoms while ignoring minor ones.
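A short sketch of the contrast, on synthetic data (the features and coefficients are invented for illustration): Ridge shrinks all coefficients but keeps them nonzero, while Lasso can zero out a feature that carries no signal.

```python
# Sketch: Ridge vs. Lasso on synthetic data where only feature 0 matters.
import numpy as np
from sklearn.linear_model import Ridge, Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=100)  # feature 1 is pure noise

ridge = Ridge(alpha=1.0).fit(X, y)   # L2 penalty: shrinks both coefficients
lasso = Lasso(alpha=0.5).fit(X, y)   # L1 penalty: can zero out weak features
```

After fitting, `ridge.coef_` keeps both weights nonzero, while `lasso.coef_` pushes the irrelevant one to (or very near) zero.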
Classification Algorithms (For Predicting
Categories)
Logistic Regression – Classifies data into two or more categories.
Eg: Predicts if an email is spam or not.
Linear Discriminant Analysis (LDA) – Mainly used when there are multiple categories that need to be separated efficiently.
Eg: Credit Card Fraud Detection – Identifies fraudulent transactions by distinguishing normal
vs. suspicious behavior.
Naïve Bayes : A probability-based classifier that assumes features are independent.
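A tiny sketch of the Naïve Bayes idea, with made-up "spam" feature vectors (counts of suspicious words and links are invented for illustration):

```python
# Sketch: Gaussian Naive Bayes on toy spam-like data (illustrative numbers).
import numpy as np
from sklearn.naive_bayes import GaussianNB

X = np.array([[8, 5], [7, 6], [9, 4],    # spam-like: many suspicious words/links
              [1, 0], [0, 1], [2, 0]])   # ham-like: few
y = np.array([1, 1, 1, 0, 0, 0])         # 1 = spam, 0 = not spam

clf = GaussianNB().fit(X, y)             # learns per-class feature distributions
pred = clf.predict([[8, 5]])[0]          # classify a new email's features
```

The "naïve" assumption is that the two features are independent within each class, which makes the probability model very cheap to fit.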
Classification Algorithms (For Predicting Categories)
Perceptron Algorithm : A simple model that learns to classify data into two groups using a
straight-line decision boundary.
✅ Handwritten Digit Recognition – Identifies if a digit is "0" or "1".
✅ Spam Detection – Decides if an email is spam or not.
✅ Disease Prediction – Classifies if a person has a disease or not.
Classification Algorithms (For Predicting Categories)
Perceptron Algorithm : The Perceptron Learning Algorithm is a binary classification
algorithm that adjusts weights associated with input features iteratively based on
misclassifications, aiming to find a decision boundary that separates classes.
Classification Algorithms (For Predicting Categories)
Perceptron Algorithm : The perceptron is a type of neural network that performs binary classification, mapping input features to an output decision and assigning data to one of two categories, such as 0 or 1.
Regression and Classification Algorithms
Over-Fitting – Under-Fitting
Linear Regression Models
 Linear Regression is a supervised learning algorithm used to predict continuous values
based on input features.
 A linear regression model is a statistical method used to predict a dependent variable (Y)
based on one or more independent variables (X).
 It assumes a linear relationship between the variables and fits a straight line to minimize
errors.
X-axis → Represents the independent variable (e.g., study hours).
Y-axis → Represents the dependent variable (e.g., exam scores).
Line of Best Fit → A straight line that best represents the data trend.
Linear Regression Models
Simple Example
Imagine you own a car dealership and want to predict the price of a car based on its age.
Independent Variable (Input): Age of the car (in years)
Dependent Variable (Output): Price of the car (in dollars)
• When you plot this data on a graph, older cars
tend to have lower prices, forming a downward
trend.
• Linear regression finds the best-fit straight line
that represents this trend.
Linear Regression Models
Types of Linear Regression Models
1. Simple Linear Regression
Definition: Uses one independent variable to predict the outcome.
Example: Predicting car price based on age.
2. Multiple Linear Regression
Definition: Uses multiple independent variables to predict the outcome.
Example: Predicting house price based on square footage, number of bedrooms, and location.
Linear Regression Models
• Both Simple Linear Regression and Multiple Linear Regression use the
Least Squares Method to find the best-fit line by minimizing the error
(difference between actual and predicted values).
• Least Squares Method is a mathematical technique used to find the best-
fitting line or curve by minimizing the sum of squared differences between
actual and predicted values.
• It is widely used in regression analysis, data science, and AI to make accurate predictions from given data.
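The least squares idea can be sketched directly with NumPy's normal-equation solver; the age/price numbers below are made up (they lie exactly on y = 12 − 2x) so the recovered line is easy to check:

```python
# Sketch of the least squares method: find the intercept and slope that
# minimize the sum of squared errors between actual and predicted values.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # e.g., car age in years
y = np.array([10.0, 8.0, 6.0, 4.0, 2.0])  # e.g., price in $1000s (y = 12 - 2x)

# Design matrix with a column of ones for the intercept
A = np.column_stack([np.ones_like(x), x])
# lstsq minimizes ||A @ [intercept, slope] - y||^2
(intercept, slope), *_ = np.linalg.lstsq(A, y, rcond=None)
```

On these noiseless points the method recovers the line exactly; with noisy data it returns the closest line in the squared-error sense.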
Linear Regression Models
Least Squares Method - Finding The “Line Of Best Fit”
Linear Regression Models
Libraries for Single Linear Regression, Multiple Linear Regression, and Least Squares Calculation:
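One plausible selection of libraries for these tasks (the slide's original list did not survive; this is an assumption based on common practice):

```python
# Commonly used Python libraries for linear regression and least squares:
import numpy as np                                 # arrays; np.linalg.lstsq for least squares
import pandas as pd                                # loading and handling tabular data
from sklearn.linear_model import LinearRegression  # simple & multiple linear regression
from sklearn.metrics import mean_squared_error     # evaluating the fitted model
```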
Linear Regression Models
Python code for Single Linear Regression, Multiple Linear Regression, and Least Squares Calculation:
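The slide's code did not survive conversion; a minimal sketch of single linear regression, using the deck's car-dealership example with made-up numbers (prices follow 20000 − 2000·age exactly):

```python
# Sketch: simple (single-feature) linear regression with scikit-learn.
import numpy as np
from sklearn.linear_model import LinearRegression

age = np.array([[1], [2], [3], [4], [5]])              # car age in years
price = np.array([18000, 16000, 14000, 12000, 10000])  # price in dollars

model = LinearRegression().fit(age, price)
predicted = model.predict([[6]])[0]                    # price of a 6-year-old car
```

`model.coef_` holds the slope (price change per year) and `model.intercept_` the price at age zero.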
Linear Regression Models
Python code for Single Linear Regression, Multiple Linear Regression, and Least Squares Calculation:
print(f"Equation: Price = {model.intercept_:.2f} + {model.coef_[0]:.4f}(Size) + {model.coef_[1]:.2f}(Bedrooms)")
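Only the print statement above survived from the slide's code; a minimal reconstruction around it, assuming a scikit-learn model with two features (Size, Bedrooms) and made-up house data:

```python
# Sketch: multiple linear regression, reconstructed around the slide's print line.
import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative data: [size in sq ft, number of bedrooms] -> price in dollars
X = np.array([[1000, 2], [1500, 3], [2000, 3], [2500, 4], [3000, 5]])
y = np.array([200000, 270000, 330000, 400000, 470000])

model = LinearRegression().fit(X, y)
print(f"Equation: Price = {model.intercept_:.2f} + "
      f"{model.coef_[0]:.4f}(Size) + {model.coef_[1]:.2f}(Bedrooms)")
```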
Linear Regression Models
• Bayesian Linear Regression and Gradient Descent play different roles in building a linear regression model; each uses a different technique to find the best-fit line.
Bayesian Linear Regression:
 Think of it like making predictions with confidence.
 Instead of finding just one best-fit line, it gives a range of possible lines based on prior knowledge and new data.
 It helps in understanding uncertainty in predictions.
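The bullets above can be sketched with the standard conjugate Gaussian update (a textbook formulation, not from the slides); the data, prior precision `alpha`, and noise precision `beta` are hand-picked assumptions:

```python
# Sketch: Bayesian linear regression with a Gaussian prior over the weights.
# Posterior precision: S_inv = alpha*I + beta*X^T X; posterior mean: beta*S*X^T y.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0                          # noiseless data from y = 1 + 2x
X = np.column_stack([np.ones_like(x), x])  # design matrix [1, x]

alpha, beta = 1e-4, 100.0                  # weak prior, confident likelihood
S_inv = alpha * np.eye(2) + beta * X.T @ X # posterior precision
S = np.linalg.inv(S_inv)                   # posterior covariance (uncertainty)
m = beta * S @ X.T @ y                     # posterior mean of [intercept, slope]

# Predictive variance at a new point: grows as we move away from the data
x_new = np.array([1.0, 10.0])              # design vector for x = 10
pred_var = 1.0 / beta + x_new @ S @ x_new
```

Rather than a single line, the model keeps a full distribution over lines: `m` is the most probable line and `S` quantifies the uncertainty the bullets describe.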
Linear Regression Models
Bayesian Linear Regression: based on Bayes' theorem
Linear Regression Models
Gradient Descent – An Optimization Algorithm
 A gradient is like a direction and speed guide that tells us how to change something to get a better result.
 "Descent" simply means moving downward or going lower.
 Gradient Descent is an optimization algorithm used to minimize a cost function by iteratively adjusting parameters.
 It helps machine learning models learn by updating weights in the direction that reduces error.
 Commonly used in linear regression, logistic regression, and neural networks.
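The steps above can be sketched as plain batch gradient descent on a one-feature linear model; the data, learning rate, and iteration count are hand-picked for illustration:

```python
# Sketch: gradient descent fitting y = w*x + b by minimizing mean squared error.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.0, 5.0, 7.0, 9.0])   # true line: y = 2x + 1

w, b = 0.0, 0.0                      # start from an arbitrary guess
lr = 0.05                            # learning rate (step size)
for _ in range(5000):
    error = (w * x + b) - y          # predicted minus actual
    grad_w = 2 * np.mean(error * x)  # gradient of MSE w.r.t. w
    grad_b = 2 * np.mean(error)      # gradient of MSE w.r.t. b
    w -= lr * grad_w                 # step downhill: the "descent"
    b -= lr * grad_b
```

Each iteration nudges `w` and `b` in the direction that reduces the error, converging toward the least squares solution.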
Linear Classification Models
 A linear classification model is a machine learning model that separates data into categories using a straight line (or
a plane in higher dimensions).
 It classifies data points by drawing a boundary (called a decision boundary) based on a linear equation.
Linear Classification Models
DISCRIMINANT FUNCTION
 A discriminant function is a statistical tool that classifies data into groups.
 In machine learning, a discriminant function separates data points into different classes.
 It is used in pattern classifiers to determine the class of an input, and also appears in medical research and other fields.
Linear Classification Models
PERCEPTRON ALGORITHM
 The Perceptron Algorithm is the simplest type of artificial
neural network used for binary classification (deciding
between two categories like "Yes/No" or "Spam/Not Spam").
 It works by adjusting weights to find the best decision
boundary that separates two classes.
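The weight-adjustment idea can be shown in a few lines; the toy points below are made up and linearly separable, so the classic perceptron rule converges:

```python
# Sketch: the perceptron learning rule on linearly separable toy data.
# Labels are +1/-1; weights change only when a point is misclassified.
import numpy as np

X = np.array([[2.0, 2.0], [3.0, 3.0], [-1.0, -1.5], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])

w = np.zeros(2)
b = 0.0
for _ in range(100):                 # epochs (far more than needed here)
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:   # misclassified (or on the boundary)
            w += yi * xi             # nudge the boundary toward this point
            b += yi

predictions = np.sign(X @ w + b)
```

The final `w` and `b` define the straight-line decision boundary that separates the two classes.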
Linear Classification Models
PROBABILISTIC DISCRIMINATIVE AND GENERATIVE MODELS
 A Probabilistic Discriminative Model - Finds the boundary between classes and directly
predicts the probability of a class given an input.
 Example: Spam Filter – Predicts how likely an email is spam (e.g., 80% Spam, 20% Not Spam).
 Probabilistic Generative Model → Learns how data is generated for each class and then uses
that to classify new data.
 Example: Face Generator – Learns the features of human faces and can generate new faces
based on the learned patterns.
Logistic Regression
 Logistic Regression is a machine learning algorithm used for classification problems.
 It predicts whether something belongs to one category or another (e.g., Yes/No, Spam/Not Spam).
 Instead of giving a direct "Yes" or "No," it gives a probability and applies a threshold (usually 0.5).
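A minimal sketch of the probability-then-threshold behavior, on made-up hours-studied vs. pass/fail data:

```python
# Sketch: logistic regression returns a probability, then a 0.5 threshold
# turns it into a Yes/No decision (illustrative data).
import numpy as np
from sklearn.linear_model import LogisticRegression

hours = np.array([[1], [2], [3], [7], [8], [9]])  # hours studied
passed = np.array([0, 0, 0, 1, 1, 1])             # 0 = fail, 1 = pass

clf = LogisticRegression().fit(hours, passed)
prob_pass = clf.predict_proba([[8]])[0, 1]        # P(pass | 8 hours of study)
label = int(prob_pass >= 0.5)                     # apply the usual 0.5 threshold
```

`predict_proba` exposes the probability directly; `clf.predict` applies the same 0.5 threshold internally.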

UNIT II SUPERVISED LEARNING - Introduction