Supervised Learning
Aneeq Hussain
1
Machine learning (overview diagram):
• Supervised learning
• Unsupervised learning
• Semi-supervised learning
• Reinforcement learning
Supervised learning splits into Regression and Classification (Binary and Multiclass).
• Regression: Linear Regression, Ridge Regression, Polynomial Regression, Lasso Regression, ...
• Classification: Logistic Regression, Support Vector Machines (SVM), Naive Bayes, Perceptron, Ridge Classifier, Categorical Naive Bayes, Decision Trees, Random Forest, K-Nearest Neighbors (KNN), Neural Networks, Gradient Boosting
2
What is Supervised Learning?
Supervised learning is a type of machine learning where an algorithm learns from labeled
examples to predict or classify future unlabeled data.
• Labeled Data:
– It involves using a dataset with input-output pairs, where inputs are features, and outputs are
known labels or target values.
• Learning Objective:
– The algorithm's goal is to learn a mapping or function that can predict the correct labels for
new, unseen data.
• Training:
– The model iteratively learns from the labeled data, adjusting its parameters to minimize
prediction errors (usually defined by a loss function).
• Validation:
– The model's performance is assessed on a separate validation dataset to ensure it
generalizes well and doesn't overfit.
• Testing:
– The final model is tested on another independent dataset to evaluate its real-world
performance. 3
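A minimal sketch of the training / validation / testing workflow described above (not part of the original slides; it assumes scikit-learn and a made-up synthetic dataset):

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

# Toy labeled data: inputs (features) X and known target values y.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))
y = 3.0 * X[:, 0] + 5.0 + rng.normal(0, 1.0, size=200)

# Split into training, validation, and test sets (roughly 60/20/20).
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

# Training: fit parameters that minimize the loss (least squares here).
model = LinearRegression().fit(X_train, y_train)

# Validation: check that the model generalizes before settling on it.
print("validation MSE:", mean_squared_error(y_val, model.predict(X_val)))

# Testing: final estimate of real-world performance on held-out data.
print("test MSE:", mean_squared_error(y_test, model.predict(X_test)))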
4
Types of Supervised Learning Algorithms
5
Supervised learning (overview diagram):
• Regression: Linear Regression, Ridge Regression, Lasso Regression, Elastic Net Regression, Polynomial Regression, Support Vector Regression (SVR), Decision Tree Regression, Random Forest Regression
• Classification (Binary and Multiclass): Logistic Regression, Support Vector Machines (SVM), Naive Bayes, Perceptron, Ridge Classifier, Categorical Naive Bayes, Decision Trees, Random Forest, K-Nearest Neighbors (KNN), Neural Networks, Gradient Boosting Algorithms, Linear Discriminant Analysis (LDA), Quadratic Discriminant Analysis (QDA)
Regression
• Regression is a method that helps us understand the relationship
between a dependent variable and one or more independent variables.
• It describes how one variable (the dependent variable) changes as
another variable (the independent variable) changes.
• Dependent variable: the variable being predicted (Y).
• Independent variables: the variables used to predict or explain the change in
the dependent variable (X).
• Examples: predicting a student's exam score, salary
prediction, etc.
6
Regression algorithms
• Linear Regression: Establishes a linear relationship between input features
and the output variable.
• Ridge Regression: Linear regression with L2 regularization to prevent
overfitting.
• Lasso Regression: Linear regression with L1 regularization for feature
selection.
• Elastic Net Regression: Combines L1 and L2 regularization in linear
regression.
• Polynomial Regression: Models non-linear relationships by using polynomial
terms.
• Support Vector Regression (SVR): Applies support vector machines to
regression problems. 7
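As a hedged illustration of how the L1/L2 regularization variants above differ (not from the slides; it assumes scikit-learn and made-up data):

import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso, ElasticNet

# Made-up data: only the first two of five features actually matter.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(0, 0.1, size=100)

models = {
    "Linear":      LinearRegression(),
    "Ridge (L2)":  Ridge(alpha=1.0),                     # shrinks all coefficients toward zero
    "Lasso (L1)":  Lasso(alpha=0.1),                     # can zero out irrelevant coefficients
    "Elastic Net": ElasticNet(alpha=0.1, l1_ratio=0.5),  # mixes the L1 and L2 penalties
}
for name, model in models.items():
    model.fit(X, y)
    print(name, np.round(model.coef_, 2))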
• Decision Tree Regression: Uses decision trees to model non-
linear relationships.
• Random Forest Regression: Ensemble of decision trees for
improved accuracy.
• Gradient Boosting Regression: Boosting technique that combines
weak learners into a strong regressor.
• K-Nearest Neighbors Regression (KNN): Predicts based on the
average of the target values of the k nearest neighbors.
• Neural Network Regression: Utilizes artificial neural networks for
regression tasks.
8
• Gaussian Process Regression: Models regression as a Gaussian process.
• Bayesian Ridge Regression: Applies Bayesian methods to linear regression.
• Principal Component Regression (PCR): Uses principal components for
dimensionality reduction.
• Partial Least Squares Regression (PLS): Finds linear combinations of input
features to predict the output.
• Huber Regression: Robust regression technique that reduces the influence of
outliers.
• Quantile Regression: Estimates quantiles of the conditional distribution of the
response variable.
9
Linear Regression
• Linear Regression is a fundamental supervised machine learning
algorithm used for predicting output based on input features.
• It assumes a linear relationship between the features and the
output, represented by a straight line in two dimensions or a
hyperplane in higher dimensions.
10
Linear Regression
11
Linear Regression
Equation of linear regression: Y = mx + b
• Y represents the dependent variable.
• x represents the independent variable.
• m represents the slope of the line.
• b is the intercept.
• m = Σ(x − x̄)(Y − Ȳ) / Σ(x − x̄)^2, i.e., the sum of the products of the deviations of x and Y
divided by the sum of the squared deviations of x.
• b = Ȳ − m·x̄, i.e., the mean of Y minus m times the mean of x.
12
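A short sketch (not from the slides) that computes m and b directly from these formulas, assuming NumPy and a made-up set of (x, Y) pairs:

import numpy as np

# Made-up data: hours studied (x) vs. exam score (Y).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([52.0, 58.0, 61.0, 68.0, 73.0])

x_mean, Y_mean = x.mean(), Y.mean()

# m = sum of the products of the deviations / sum of the squared deviations of x
m = np.sum((x - x_mean) * (Y - Y_mean)) / np.sum((x - x_mean) ** 2)

# b = mean of Y - (m * mean of x)
b = Y_mean - m * x_mean

print(f"Y = {m:.2f}x + {b:.2f}")           # fitted regression line
print("prediction for x = 6:", m * 6 + b)  # predict an unseen value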
Example
• The model learns coefficients that minimize the difference between predicted
and actual values, making it a simple and interpretable tool for tasks like
predicting house prices, stock prices, or any other numeric outcome.
13
(Example charts on the slide: predicting house prices, stock prices.)
Polynomial regression
• Polynomial regression is a type of regression analysis that models the
relationship between the independent variable (predictor) and the dependent
variable (target) as an nth-degree polynomial.
• Unlike linear regression, which assumes a linear relationship between the
variables, polynomial regression allows for a more flexible and curved
relationship
14
Polynomial regression
• Polynomial Equation: In polynomial regression, the
relationship between the input variable (X) and the output
variable (Y) is represented by a polynomial equation of
the form:
Y = β0 + β1X + β2X^2 + β3X^3 + ... + βnX^n + ε
• Here, Y is the predicted output, X is the input feature, β0
to βn are the coefficients of the polynomial terms, n is the
degree of the polynomial (an integer), and ε represents
the error term.
15
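A minimal sketch of fitting such a polynomial (not part of the slides; it assumes NumPy's polyfit and synthetic curved data):

import numpy as np

# Synthetic data with a curved (quadratic) relationship plus noise.
rng = np.random.default_rng(2)
X = np.linspace(-3, 3, 50)
Y = 1.0 + 2.0 * X + 0.5 * X**2 + rng.normal(0, 0.3, size=X.size)

# Fit a degree-2 polynomial; coefficients come back highest degree first: [β2, β1, β0].
coeffs = np.polyfit(X, Y, deg=2)
print("fitted coefficients (β2, β1, β0):", np.round(coeffs, 2))

# Predict with the fitted polynomial.
Y_pred = np.polyval(coeffs, X)
print("first few predictions:", np.round(Y_pred[:3], 2))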
Example
• Stock Market Analysis: In finance, you might want to
predict the future price of a stock based on historical data.
Stock prices often exhibit nonlinear behavior, and
polynomial regression can be used to model these
fluctuations
16
Classification
• Classification in supervised learning is a machine learning task
where the goal is to assign data points to predefined categories or
classes based on their features.
• It involves training a model using labeled data to learn patterns
and relationships between features and classes, allowing it to
make predictions on new, unseen data.
• The model essentially learns to classify or categorize input data
into one of several predefined classes, making it a fundamental
tool for tasks like spam detection, image recognition, and medical diagnosis. 17
types of classification
1. Binary:
– The first type of classification.
– The goal is to predict one of two possible classes or outcomes.
– The two classes are often labeled as "positive" (class 1) and "negative" (class 0), or simply as
"yes" and "no."
– Examples: spam email detection, medical diagnosis, etc.
18
2. Multiclass:
– The second type of classification.
– The goal is to classify data points into one of more than two possible classes or categories.
– There are more than two distinct classes to which the algorithm needs to assign each data
point.
– Examples: image recognition, natural language processing, etc.
19
classification algorithms
• Logistic Regression: Suitable for binary classification problems.
• Decision Trees: Can handle both binary and multiclass
classification tasks and are easy to visualize.
• Random Forest: An ensemble method that combines multiple
decision trees for improved accuracy and generalization.
• Support Vector Machines (SVM): Effective for binary and
multiclass classification, particularly in high-dimensional spaces.
• Naive Bayes: A probabilistic algorithm based on Bayes' theorem;
commonly used for text classification.
20
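As one hedged, concrete example of the algorithms above (not from the slides), a tiny Naive Bayes text classifier with scikit-learn on made-up spam/ham messages:

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Made-up training messages and labels (1 = spam, 0 = not spam).
messages = [
    "win a free prize now",
    "limited offer click here",
    "meeting at noon tomorrow",
    "please review the attached report",
]
labels = [1, 1, 0, 0]

vectorizer = CountVectorizer()        # bag-of-words features
X = vectorizer.fit_transform(messages)

clf = MultinomialNB().fit(X, labels)  # Naive Bayes classifier

new = vectorizer.transform(["free prize offer"])
print("predicted class:", clf.predict(new)[0])  # expected: 1 (spam)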
cont..
• K-Nearest Neighbors (KNN): Classifies data points based on the majority
class among their nearest neighbors.
• Neural Networks: Deep learning models with multiple layers of neurons; can
handle complex classification tasks with large datasets.
• Gradient Boosting Algorithms (e.g., XGBoost, LightGBM): Ensemble methods
that sequentially build decision trees to improve accuracy.
• Linear Discriminant Analysis (LDA): Reduces dimensionality while preserving
class separability.
• Quadratic Discriminant Analysis (QDA): Similar to LDA but doesn't assume
equal covariance matrices for classes.
21
cont..
• Perceptron: A simple linear classifier used for binary classification tasks.
• AdaBoost: An ensemble method that combines weak classifiers to create a
strong classifier.
• Gradient Descent Algorithms: Used in training neural networks and deep
learning models for classification.
• Categorical Naive Bayes: An extension of Naive Bayes for categorical data.
• Gaussian Processes: Probabilistic models used for classification tasks.
• Ridge Classifier: A linear classifier built on ridge (L2-regularized least-squares) regression.
• Multilayer Perceptron (MLP): A type of artificial neural network with multiple
hidden layers.
22
Logistic Regression
• Explanation:
• Logistic regression is a statistical method used for binary classification, where the goal is to
predict one of two possible outcomes (e.g., yes/no, 1/0, spam/ham) based on one or more
independent variables (features).
• Logistic regression is a classification algorithm, not a regression algorithm. It uses the logistic
function (also called the sigmoid function) to model the probability of the binary outcome:
• p = 1 / (1 + e^(-z)), where z is a linear combination of the input features (z = β0 + β1x1 + ... + βnxn).
23
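A small sketch of the sigmoid and of logistic regression in practice (not from the slides; it assumes scikit-learn and a made-up one-feature pass/fail dataset):

import numpy as np
from sklearn.linear_model import LogisticRegression

def sigmoid(z):
    # p = 1 / (1 + e^(-z)): maps any real z to a probability in (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

print("sigmoid(0) =", sigmoid(0.0), " sigmoid(3) =", round(sigmoid(3.0), 3))

# Made-up data: hours studied -> pass (1) or fail (0).
X = np.array([[0.5], [1.0], [1.5], [3.0], [3.5], [4.0]])
y = np.array([0, 0, 0, 1, 1, 1])

clf = LogisticRegression().fit(X, y)
print("P(pass | 2.5 hours studied):", round(clf.predict_proba([[2.5]])[0, 1], 3))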
Example
• Spam Detection: Logistic regression is used in email filtering
systems to classify emails as spam or not spam based on
the content, sender information, and other features.
• Image Classification: In computer vision, logistic
regression can be used as a simple classification
algorithm to distinguish between different objects or
categories in images.
24
Decision Tree
• Used for both regression and classification.
• It works by splitting the dataset into subsets based on the most significant
attribute or feature, ultimately creating a tree-like structure of decision nodes
and leaf nodes.
• decision node
• leaf node
• splitting
• entropy and information gain
• pruning
25
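A brief sketch of entropy and information gain for a candidate split (not from the slides; the labels below are made up):

import numpy as np

def entropy(labels):
    # Shannon entropy of a set of class labels: -sum(p * log2(p)).
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

# Made-up parent node: 10 samples, 6 positive and 4 negative.
parent = np.array([1, 1, 1, 1, 1, 1, 0, 0, 0, 0])

# A candidate split sends the samples to a left and a right child node.
left = np.array([1, 1, 1, 1, 0])
right = np.array([1, 1, 0, 0, 0])

# Information gain = parent entropy - weighted average of the child entropies.
n = len(parent)
gain = entropy(parent) - (len(left) / n) * entropy(left) - (len(right) / n) * entropy(right)
print("information gain of this split:", round(gain, 3))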
26