Machine Learning
Group IX
What is machine learning?
● Gives computers the ability to learn, rather than hard-coding every state
● Draws on subfields of AI such as computational learning theory and pattern recognition
● Programs operate in two stages: “Train” and “Predict”
2
Machine learning vs conditional programming
Conditional programming uses simple if-then-else rules.
Problem: detect a flower's name from its features.
Conditional approach - write if-else rules for every possible state.
ML approach - train a model and let it predict the result.
3
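The contrast can be sketched side by side (a toy illustration; the petal measurements, thresholds, and species labels below are invented for the example):

```python
from sklearn.tree import DecisionTreeClassifier

# Conditional approach: hand-written rules for every known state.
def classify_flower_rules(petal_length, petal_width):
    if petal_length < 2.5:
        return "setosa"
    elif petal_width < 1.8:
        return "versicolor"
    else:
        return "virginica"

# ML approach: "Train" on labelled examples, then "Predict" unseen ones.
X_train = [[1.4, 0.2], [4.7, 1.4], [6.0, 2.5],
           [1.3, 0.3], [4.5, 1.5], [5.9, 2.1]]
y_train = ["setosa", "versicolor", "virginica",
           "setosa", "versicolor", "virginica"]

model = DecisionTreeClassifier(random_state=0)
model.fit(X_train, y_train)            # Train stage
print(model.predict([[1.5, 0.2]]))     # Predict stage
```

The rule-based version must be rewritten for every new flower or feature; the trained model only needs more labelled examples.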
Supervised learning
Supervised learning is the machine learning task of inferring a function from
labeled training data. The training data consist of a set of training examples.
4
Supervised learning algorithms
Decision trees
Naive Bayes
k-Nearest Neighbours
5
Train → Predict
Decision Tree
6
1. Decision Tree
A decision tree builds a
classification model in the form of
a tree structure.
It breaks the dataset down into
smaller and smaller subsets.
Finding the optimal decision
tree is NP-hard,
so a greedy technique is used.
7
Decision tree algorithm
1. Start with the whole training data.
2. Select the attribute or value along a dimension that gives the “best” split.
3. Create child nodes based on the split.
4. Recurse on each child using its data until a stopping criterion is reached:
• all examples have the same class (entropy is 0)
• too little data remains (fewer samples than min_samples_split)
• the tree grows too large
Problem: how do we choose the “best” attribute?
8
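One standard answer: choose the attribute with the highest information gain, i.e. the largest reduction in entropy. A minimal from-scratch sketch (the helper names are ours):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Entropy reduction obtained by splitting on the given attribute."""
    n = len(labels)
    groups = {}
    for row, label in zip(rows, labels):
        groups.setdefault(row[attr_index], []).append(label)
    remainder = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - remainder
```

On the weekend dataset of the next slide, splitting on Parents yields the largest information gain, which is why it becomes the root of the tree.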
Simple Example
Weekend (Example) Weather Parents Money Decision (Category)
W1 Sunny Yes Rich Cinema
W2 Sunny No Rich Tennis
W3 Windy Yes Rich Cinema
W4 Rainy Yes Poor Cinema
W5 Rainy No Rich Stay in
W6 Rainy Yes Poor Cinema
W7 Windy No Poor Cinema
W8 Windy No Rich Shopping
W9 Windy Yes Rich Cinema
W10 Sunny No Rich Tennis
9
Python code
10
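The original code screenshot is not preserved. A plausible reconstruction with scikit-learn on the weekend dataset, using the parameters listed on the next slide (the integer encoding of the features is an assumption, chosen to be consistent with slide 12):

```python
from sklearn.tree import DecisionTreeClassifier

# Assumed encodings: Weather Rainy=0 / Windy=1 / Sunny=2,
# Parents No=0 / Yes=1, Money Poor=0 / Rich=1.
X = [
    [2, 1, 1], [2, 0, 1], [1, 1, 1], [0, 1, 0], [0, 0, 1],  # W1-W5
    [0, 1, 0], [1, 0, 0], [1, 0, 1], [1, 1, 1], [2, 0, 1],  # W6-W10
]
y = ["Cinema", "Tennis", "Cinema", "Cinema", "Stay in",
     "Cinema", "Cinema", "Shopping", "Cinema", "Tennis"]

# Parameters from slide 11: entropy criterion, best splitter,
# min_samples_split = 2 (the default).
clf = DecisionTreeClassifier(criterion="entropy", splitter="best",
                             min_samples_split=2)
clf.fit(X, y)
print(clf.predict([[1, 0, 1]]))   # windy, parents away, rich
```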
Decision tree
The entropy of the full dataset (the Decision column) is
1.571; Parents gives the best first split.
Parameters
criterion = entropy*, gini (default)
splitter = best* (default), random
min_samples_split = 2* (default)
* - used here
11
How prediction works
Today is windy, I have money, and my parents are not
at home. Predict what I will do.
Weather = “Windy” → 1
Parents = “No” → 0
Money = “Rich” → 1
classified = [0, 1, 0, 0] (classes in alphabetical order: Cinema, Shopping, Stay in, Tennis) — I may go shopping!
12
Decision tree for large dataset
Sklearn iris data set
13
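A sketch of the same classifier applied to sklearn's iris dataset (the train/test split and accuracy check are additions for illustration):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.3, random_state=0)

clf = DecisionTreeClassifier(criterion="entropy", random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```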
Naive bayes
14
2. Naive bayes
It is a classification technique based on Bayes’ theorem, with an assumption
of independence among the predictors.
It is primarily used for text classification, which involves high-dimensional
training data sets.
Examples: spam filtering, sentiment analysis, and classifying news
articles.
Bayes’ theorem provides a way of calculating the posterior probability P(c|x) from
P(c), P(x) and P(x|c): P(c|x) = P(x|c) P(c) / P(x)
15
● P(c|x) is the posterior probability of the class (c, target) given the predictor (x,
attributes).
● P(c) is the prior probability of the class.
● P(x|c) is the likelihood: the probability of the predictor given the class.
16
How does the Naive Bayes algorithm work?
Example:
Take a training data set of weather observations and the corresponding target
variable ‘Play’ (whether play took place). Then classify whether players will play
or not based on the weather condition.
Let’s follow the steps below…
17
Steps:
1. Convert the data set into a frequency table.
2. Create a likelihood table by computing the probabilities (e.g. P(Overcast)
= 4/14 ≈ 0.29 and P(Yes) = 9/14 ≈ 0.64).
18
3. Use the naive Bayes equation to calculate the posterior probability for
each class. The class with the highest posterior probability is the outcome
of the prediction.
Problem: players will play if the weather is sunny. Is this statement correct?
Solution: compute the posterior probability.
P(Yes|Sunny)=P(Sunny|Yes)*P(Yes) / P(Sunny)
Here, P(Sunny|Yes)=3/9=0.33
P(Sunny)=5/14=0.36,
P(Yes)=9/14=0.64
P(Yes|Sunny)=0.33*0.64/0.36=0.60
19
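The calculation above, written out in Python:

```python
# Posterior P(Yes | Sunny) = P(Sunny | Yes) * P(Yes) / P(Sunny)
p_sunny_given_yes = 3 / 9    # ≈ 0.33
p_yes = 9 / 14               # ≈ 0.64
p_sunny = 5 / 14             # ≈ 0.36

posterior = p_sunny_given_yes * p_yes / p_sunny
print(round(posterior, 2))   # → 0.6
```

Since 0.6 > 0.5, the statement "players will play if the weather is sunny" holds for this data.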
Python Code
20
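The original code screenshot is not preserved. A sketch with scikit-learn's CategoricalNB, built on a single Weather feature whose counts match the likelihood table above (the integer encoding and the near-zero smoothing are assumptions, chosen so the result matches the hand calculation):

```python
import numpy as np
from sklearn.naive_bayes import CategoricalNB

# Assumed encoding: Overcast=0, Rainy=1, Sunny=2.
# Counts follow the likelihood table: Sunny 3 Yes / 2 No,
# Overcast 4 Yes, Rainy 2 Yes / 3 No (14 rows, 9 Yes).
X = np.array([[2]] * 5 + [[0]] * 4 + [[1]] * 5)
y = (["Yes"] * 3 + ["No"] * 2 +        # Sunny
     ["Yes"] * 4 +                     # Overcast
     ["Yes"] * 2 + ["No"] * 3)         # Rainy

clf = CategoricalNB(alpha=1e-10)   # near-zero smoothing to match the slide
clf.fit(X, y)
print(clf.predict([[2]]))                  # predicted class for Sunny
print(clf.predict_proba([[2]]).round(2))   # ≈ [0.40, 0.60] for [No, Yes]
```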
Output :
21
k - Nearest neighbour
22
3. k-Nearest Neighbour
Introduction
The KNN algorithm is a robust and versatile classifier that is often
used as a benchmark for more complex classifiers such as Artificial
Neural Networks (ANN) and Support Vector Machines (SVM).
Despite its simplicity, KNN can outperform more powerful classifiers
and is used in a variety of applications such as economic forecasting,
data compression and genetics.
23
What is KNN?
KNN falls in the supervised learning family of algorithms. Informally,
this means that we are given a labelled dataset consisting of training
observations (x, y) and would like to capture the relationship
between x and y. More formally, our goal is to learn a function
h : X → Y so that, given an unseen observation x, h(x) can
confidently predict the corresponding output.
● KNN is non-parametric, instance-based, and used in a supervised
learning setting.
● Minimal training but expensive testing. 24
How does KNN work?
The K-nearest neighbor algorithm essentially boils down to forming a majority vote
between the K most similar instances to a given “unseen” observation. Similarity is
defined by a distance metric between two data points. A popular choice
is the Euclidean distance, d(x, x′) = sqrt( Σᵢ (xᵢ − x′ᵢ)² ).
25
How it works(cont...)
1. Choose a value for k, preferably a small odd number.
2. Find the k closest points to the new observation.
3. Assign the new point the majority class among those k neighbours.
26
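The three steps above can be sketched from scratch (Euclidean distance, majority vote; the data points are invented for the example):

```python
from collections import Counter
from math import dist  # Euclidean distance (Python 3.8+)

def knn_predict(X_train, y_train, x_new, k=3):
    """Classify x_new by majority vote among its k nearest neighbours."""
    neighbours = sorted(zip(X_train, y_train),
                        key=lambda p: dist(p[0], x_new))
    votes = Counter(label for _, label in neighbours[:k])
    return votes.most_common(1)[0][0]

X_train = [(1, 1), (1, 2), (2, 1), (6, 6), (7, 6), (6, 7)]
y_train = ["A", "A", "A", "B", "B", "B"]
print(knn_predict(X_train, y_train, (2, 2), k=3))   # → A
```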
How it works(cont...)
27
When K is small, we are restraining the region of a given prediction and forcing
our classifier to be “more blind” to the overall distribution. A small value for K
provides the most flexible fit, which will have low bias but high variance.
Graphically, our decision boundary will be more jagged.
28
On the other hand, a higher K averages more voters in each prediction and hence
is more resilient to outliers. Larger values of K will have smoother decision
boundaries which means lower variance but increased bias.
29
Exploring KNN in Code
30
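The original code is not preserved; a typical sketch with scikit-learn's KNeighborsClassifier on the iris dataset:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.3, random_state=1)

knn = KNeighborsClassifier(n_neighbors=5)   # k = 5, a small odd number
knn.fit(X_train, y_train)
print("test accuracy:", knn.score(X_test, y_test))
```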
Clustering
31
Unsupervised learning - Clustering
● Organization of unlabeled data
into similarity groups
● Three families of clustering
techniques:
Hierarchical
Partitional
Bayesian
32
Clustering Algorithms
K-means
● A partitional clustering algorithm
● Choose k random data points (seeds) as the initial centroids
● Assign each data point to the closest centroid
33
K means
34
4. K-means
Algorithm
● Decide a value for k
● Initialize the k cluster centers
● Assign each object to its nearest cluster
● Re-estimate the cluster centers
● If no object changes its cluster membership, stop; otherwise repeat from the assignment step
35
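The loop above can be sketched with NumPy (an illustration, not sklearn's implementation; the initial seeds are chosen at random):

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Plain k-means: assign to nearest center, re-estimate, stop when stable."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]  # k random seeds
    for _ in range(n_iter):
        # Assign each point to its closest center (squared Euclidean distance).
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(axis=2), axis=1)
        # Re-estimate each center as the mean of its cluster.
        new_centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centers, centers):   # membership stable -> stop
            break
        centers = new_centers
    return labels, centers
```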
Step 1
36
Step 2
37
Step 3
38
Step 4
39
Step 5
40
Python Code
Output Labels [0 0 1 1]
Predicted Label [0]
41
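The printed output is consistent with a small four-point example like the following (the data points are assumed; note that cluster indices are arbitrary, which is why a second run can report the permuted labels shown on the next slide):

```python
import numpy as np
from sklearn.cluster import KMeans

X = np.array([[1, 2], [1.5, 1.8], [5, 8], [6, 9]])   # assumed sample points

kmeans = KMeans(n_clusters=2, n_init=10)
kmeans.fit(X)
print("Output Labels", kmeans.labels_)           # e.g. [0 0 1 1] or [1 1 0 0]
print("Predicted Label", kmeans.predict([[1, 1]]))
```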
Output Labels [1 1 0 0]
Predicted Label [1]
42