Ensemble Methods (Bagging and Boosting)
Agenda
Ensemble Learning
Bagging
Boosting
Conclusion
Ensemble Learning
• What is Ensemble Learning?
• How does Ensemble Learning work?
• How did Ensemble Learning come into existence?
Ensemble Learning
Ensemble learning refers to the technique of combining predictions from multiple models to improve overall performance and accuracy. By aggregating diverse models such as decision trees or neural networks, ensemble methods like bagging and boosting enhance robustness, reduce overfitting, and yield more accurate and stable predictions across a wide range of tasks.
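As a concrete illustration (not part of the original slides), here is a minimal sketch that combines three different base models by majority vote, assuming scikit-learn and a synthetic dataset:

```python
# Minimal ensemble sketch: three different base models combined by
# hard majority voting (assumes scikit-learn is installed).
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(random_state=0)),
        ("logreg", LogisticRegression(max_iter=1000)),
        ("knn", KNeighborsClassifier()),
    ],
    voting="hard",  # each model votes; the majority class wins
)
ensemble.fit(X_train, y_train)
print("ensemble accuracy:", ensemble.score(X_test, y_test))
```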
Bagging
What is Bagging?
• Bootstrapping
• Diverse Models
• Aggregation
• Reduction of Overfitting
[Diagram: the original Dataset is bootstrapped into Dataset 1 … Dataset n; Model 1 … Model n are trained on these subsets and their predictions are aggregated into the Ensemble Model.]
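The pipeline in the diagram can be sketched by hand. The following is a hedged illustration, assuming scikit-learn decision trees as the base models and a synthetic binary-classification dataset:

```python
# Bagging by hand: bootstrap sampling, independently trained models,
# and majority-vote aggregation (assumes scikit-learn is installed).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
n_models = 25
models = []
for _ in range(n_models):
    # Bootstrapping: sample the training set with replacement
    idx = rng.integers(0, len(X_train), size=len(X_train))
    model = DecisionTreeClassifier()
    model.fit(X_train[idx], y_train[idx])  # each model sees a different subset
    models.append(model)

# Aggregation: majority vote over the binary (0/1) predictions
all_preds = np.array([m.predict(X_test) for m in models])
ensemble_pred = (all_preds.mean(axis=0) >= 0.5).astype(int)
print("bagged accuracy:", np.mean(ensemble_pred == y_test))
```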
Advantages of Bagging
1. Improved Accuracy
2. Enhanced Robustness
3. Increased Stability
4. Reduction of Overfitting
5. Easy Parallelization
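Advantages 1 and 5 are easy to see with scikit-learn's BaggingClassifier, whose independent base models can be trained in parallel. A small sketch (scores will vary with the data):

```python
# Bagging with scikit-learn: 100 independent trees trained in parallel
# (n_jobs=-1 uses all CPU cores). A sketch assuming a recent scikit-learn,
# where the default base estimator is a decision tree.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

single_tree = DecisionTreeClassifier(random_state=0)
bagged_trees = BaggingClassifier(n_estimators=100, n_jobs=-1, random_state=0)

print("single tree :", cross_val_score(single_tree, X, y, cv=5).mean())
print("bagged trees:", cross_val_score(bagged_trees, X, y, cv=5).mean())
```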
What is Boosting?
Sequential Training of Models
• Boosting builds a series of weak learners sequentially.
• Each new model pays more attention to the data points that the previous models misclassified.
Focus on Misclassified Samples
• Boosting algorithms identify and prioritize misclassified samples during the training process.
• More emphasis is given to the data points that are difficult to classify correctly.
Weighted Aggregation of Predictions
• Predictions from individual models are combined with different weights.
• Models that perform well are given higher weights, while models that struggle with certain samples are given lower weights.
• The final prediction is a weighted sum of predictions from all models.
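A minimal AdaBoost-style sketch shows all three ideas together: sequential training, re-weighting of misclassified samples, and weighted aggregation. This is an illustrative implementation assuming scikit-learn decision stumps as the weak learners, not the exact algorithm from the slides:

```python
# Minimal AdaBoost-style boosting sketch for binary labels in {-1, +1}.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
y = np.where(y == 1, 1, -1)                    # AdaBoost uses +/-1 labels

n_rounds = 25
w = np.full(len(y), 1 / len(y))                # start with uniform sample weights
stumps, alphas = [], []

for _ in range(n_rounds):
    stump = DecisionTreeClassifier(max_depth=1)
    stump.fit(X, y, sample_weight=w)           # weak learner sees current weights
    pred = stump.predict(X)
    err = np.sum(w * (pred != y)) / np.sum(w)  # weighted error rate
    err = np.clip(err, 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)      # better learners get larger weights
    w *= np.exp(-alpha * y * pred)             # up-weight misclassified samples
    w /= w.sum()
    stumps.append(stump)
    alphas.append(alpha)

# Final prediction: weighted (alpha) vote over all weak learners
scores = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
print("training accuracy:", np.mean(np.sign(scores) == y))
```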
[Diagram: the Training Set is sampled into Subset 1 … Subset n; weak learners are trained one after another, with each new subset emphasizing the previous learner's false predictions, and the learners are combined into the Overall Prediction used at testing time.]
Advantages of Boosting
Handles Noisy Data
Boosting corrects misclassified samples, reducing the impact of noisy data on the final model's accuracy.
Feature Importance
Boosting reveals crucial features for accurate predictions, aiding in effective feature selection.
Adaptability to Different Data
Boosting algorithms adapt to diverse and complex datasets, making them suitable for various applications, from text classification to image recognition.
Reduction of Bias and Variance
Boosting strikes a balance between bias and variance, resulting in models that generalize well to new data by combining predictions of multiple weak learners.
Improved Accuracy
Boosting corrects errors made by previous models, leading to highly accurate predictions through sequential model refinement.
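As one concrete illustration of the feature-importance point, a gradient-boosted model exposes how much each feature contributed to its trees. A short sketch, assuming scikit-learn and synthetic data:

```python
# Boosting as a feature-selection aid: fit a gradient-boosted model and
# inspect which features it relied on (assumes scikit-learn is installed).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic data where only 5 of the 20 features are informative
X, y = make_classification(n_samples=1000, n_features=20,
                           n_informative=5, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X, y)

# Rank features by their contribution to the boosted trees
ranking = np.argsort(model.feature_importances_)[::-1]
for i in ranking[:5]:
    print(f"feature {i}: importance {model.feature_importances_[i]:.3f}")
```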
Differences Between Bagging and Boosting
• How predictions are combined: Bagging is the simplest way of combining predictions of the same type; Boosting combines predictions that can be of different types.
• Focus: Bagging aims to decrease variance, not bias; Boosting aims to decrease bias, not variance.
• Weights: In Bagging, each model receives equal weight; in Boosting, models are weighted according to their performance.
• How models are built: In Bagging, each model is built independently; in Boosting, new models are influenced by the performance of previously built models.
• Training of base classifiers: In Bagging, base classifiers are trained in parallel; in Boosting, they are trained sequentially.
• Example: The Random Forest model uses Bagging; AdaBoost uses Boosting.
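The example row can be tried directly with scikit-learn's built-in implementations; a hedged sketch (exact scores depend on the dataset and hyperparameters):

```python
# Side-by-side sketch: Random Forest (bagging) vs. AdaBoost (boosting),
# compared with 5-fold cross-validation on a synthetic dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

bagging_model = RandomForestClassifier(n_estimators=200, random_state=0)
boosting_model = AdaBoostClassifier(n_estimators=200, random_state=0)

print("Random Forest:", cross_val_score(bagging_model, X, y, cv=5).mean())
print("AdaBoost     :", cross_val_score(boosting_model, X, y, cv=5).mean())
```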
Real-World Applications of Bagging and Boosting
• Credit Scoring
• Customer Churn Prediction
• Stock Market Forecasting
• Medical Diagnosis
• Web Search Ranking
• Fraud Detection
• Face Detection
• Natural Language Processing
Summary
1. Enhanced Accuracy and Stability
2. Robustness to Data Challenges
3. Continuous Improvement and Exploration
Thank you
References:
• https://www.simplilearn.com/tutorials/machine-learning-tutorial/bagging-in-machine-learning
• https://www.analyticsvidhya.com/blog/2023/01/ensemble-learning-methods-bagging-boosting-and-stacking/
• https://www.simplilearn.com/tutorials/machine-learning-tutorial/what-is-boosting
• https://www.geeksforgeeks.org/bagging-vs-boosting-in-machine-learning/
• https://olympus.mygreatlearning.com/courses/61356


Editor's Notes

1. Bagging (Bootstrap Aggregating)
• Bootstrap Aggregating (Bagging) is an ensemble learning technique that combines predictions from multiple base models. It operates by creating multiple subsets of the original dataset through bootstrapping (sampling with replacement).
• Diverse Models: bagging uses a collection of diverse base models, such as decision trees, trained independently on these subsets. Each model learns different patterns from the data, enhancing overall model diversity.
• Aggregation: after training, predictions from all individual models are combined, often through averaging (for regression) or voting (for classification). This aggregation smooths out individual model errors and improves overall accuracy and robustness.
• Reduction of Overfitting: bagging reduces overfitting because it averages out the noise fitted by individual models, leading to a more generalized and reliable ensemble. It is a fundamental technique for enhancing predictive performance.
2. Advantages of Bagging
• Improved Accuracy: bagging reduces variance and overfitting by combining predictions from multiple models, which often leads to more accurate and reliable predictions than any individual model.
• Enhanced Robustness: training multiple models on different subsets of the data reduces the impact of outliers and noisy data points, making the overall model more robust and resistant to errors.
• Increased Stability: because predictions are averaged or combined, fluctuations in the training data are smoothed out, making the model's predictions more stable and consistent.
• Reduction of Overfitting: training multiple models on different subsets and combining their predictions reduces the chance that any single model memorizes noise or outliers in the training data, so the ensemble generalizes better to new, unseen data.
• Easy Parallelization: the base models are independent, so they can be trained simultaneously on different subsets of data, speeding up training; this is particularly useful for large datasets.
3. Boosting
• Boosting is an ensemble learning technique that improves accuracy by converting weak learners into strong learners through a sequential learning process. Unlike bagging, boosting trains models sequentially, where each model corrects the errors made by its predecessor, continually giving more attention to misclassified samples.
• Sequential Training of Models: boosting builds a series of weak learners (e.g., decision trees) sequentially; each new model pays more attention to the data points that the previous models misclassified.
• Focus on Misclassified Samples: boosting algorithms identify and prioritize misclassified samples during training, giving more emphasis to the data points that are difficult to classify correctly.
• Weighted Aggregation of Predictions: predictions from individual models are combined with different weights; models that perform well receive higher weights, and the final prediction is a weighted sum of the predictions from all models.
4. Summary
• Enhanced Accuracy and Stability: ensemble techniques like bagging and boosting improve model accuracy by combining diverse models, ensuring more reliable predictions even in complex scenarios.
• Robustness to Data Challenges: ensemble methods handle overfitting and outliers effectively, which is crucial for noisy or imbalanced datasets and makes them indispensable in real-world applications.
• Continuous Improvement and Exploration: regular experimentation with diverse base models, coupled with hyperparameter tuning, enables continuous learning and adaptation to the latest advances in ensemble techniques, ensuring optimal model performance.