UNIT 4 AI
By: Surbhi Saroha
 Supervised and unsupervised learning.
 Decision trees.
 Statistical learning model
 Learning with complete data: Naïve Bayes
models
 Learning with hidden data: EM algorithm
 Reinforcement learning
 Supervised learning, as the name indicates, involves the presence of a supervisor acting as a teacher.
 In supervised learning we teach or train the machine using data that is well labeled, meaning each example is already tagged with the correct answer.
 After that, the machine is provided with a new set of examples (data); the supervised learning algorithm analyses the training data (the set of labeled training examples) and produces the correct output for the new, unseen data.
 For instance, suppose you are given a basket filled with different kinds of fruits.
 The first step is to train the machine on all the different fruits, one by one, like this:
 If the shape of the object is rounded with a depression at the top and its color is red, then it is labeled as Apple.
 If the shape of the object is a long curving cylinder and its color is green-yellow, then it is labeled as Banana.
 Now suppose that, after training, the machine is given a new fruit from the basket (say a banana) and asked to identify it.
 Since the machine has already learned these things from the previous data, it now has to use that knowledge wisely.
 It will first classify the fruit by its shape and color, confirm the fruit name as BANANA, and put it in the Banana category.
 Thus the machine learns from the training data (the basket of fruits) and then applies that knowledge to the test data (the new fruit); a toy version of this classifier is sketched below.
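A minimal, purely illustrative sketch of the fruit example as a supervised classifier, assuming scikit-learn is available; the numeric feature encoding and the tiny labeled basket are invented for this slide.

```python
# Toy supervised classifier for the fruit example (illustrative only).
# Features: [is_round, has_top_depression, color_code]; color_code: 0 = red, 1 = green-yellow.
from sklearn.tree import DecisionTreeClassifier

X_train = [
    [1, 1, 0],   # round, depression at top, red       -> Apple
    [1, 1, 0],
    [0, 0, 1],   # long curving cylinder, green-yellow -> Banana
    [0, 0, 1],
]
y_train = ["Apple", "Apple", "Banana", "Banana"]

model = DecisionTreeClassifier()
model.fit(X_train, y_train)       # learn from the labeled basket (training data)

new_fruit = [[0, 0, 1]]           # the new, unseen fruit
print(model.predict(new_fruit))   # -> ['Banana']
```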
 Supervised learning is classified into two categories of algorithms:
 Classification: a classification problem is one where the output variable is a category, such as “red” or “blue”, or “disease” and “no disease”.
 Regression: a regression problem is one where the output variable is a real value, such as “dollars” or “weight” (a small regression sketch follows).
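In contrast to the classification sketch above, here is a rough regression sketch, again assuming scikit-learn; the height/weight numbers are made up.

```python
# Regression: the output is a real value (here, weight predicted from height).
from sklearn.linear_model import LinearRegression

X = [[150], [160], [170], [180]]    # heights in cm
y = [50.0, 58.0, 66.0, 74.0]        # weights in kg

reg = LinearRegression().fit(X, y)
print(reg.predict([[175]]))         # predicted weight for an unseen height, roughly 70 kg
```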
 Supervised learning allows us to collect data and produce outputs based on previous experience.
 It helps to optimize performance criteria with the help of experience.
 Supervised machine learning helps to solve various types of real-world computation problems.
 Classifying big data can be challenging.
 Training a supervised learning model needs a lot of computation, so it can be very time-consuming.
 Unsupervised learning is the training of a machine using information that is neither classified nor labeled, allowing the algorithm to act on that information without guidance.
 Here the task of the machine is to group unsorted information according to similarities, patterns, and differences without any prior training on the data.
 Unlike supervised learning, no teacher is provided, which means no training labels are given to the machine.
 Therefore the machine must find the hidden structure in the unlabeled data by itself.
 For instance, suppose the machine is given an image containing both dogs and cats that it has never seen before.
 The machine has no idea about the features of dogs and cats, so it cannot categorize the image into “dogs” and “cats”.
 But it can categorize the animals according to their similarities, patterns, and differences, i.e., it can easily split the picture into two parts.
 The first part may contain all the pictures having dogs in them, and the second part may contain all the pictures having cats in them.
 Here nothing has been learned beforehand; there is no training data or labeled examples. The model works on its own to discover patterns and information that were previously undetected.
 It mainly deals with unlabelled data.
 Unsupervised learning is classified into two categories of algorithms:
 Clustering: a clustering problem is where you want to discover the inherent groupings in the data, such as grouping customers by purchasing behavior (a small clustering sketch follows).
 Association: an association rule learning problem is where you want to discover rules that describe large portions of your data, such as people who buy X also tend to buy Y.
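A minimal clustering sketch, assuming scikit-learn; the two-dimensional points are invented and no labels are given to the algorithm.

```python
# Unsupervised clustering: group unlabeled points purely by similarity.
from sklearn.cluster import KMeans

X = [[1, 2], [1, 1], [2, 2],      # one natural group
     [8, 8], [9, 8], [8, 9]]      # another natural group

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(kmeans.labels_)             # e.g. [0 0 0 1 1 1] -- discovered without any labels
```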
 A decision tree is a decision support tool that uses a tree-like model of decisions and their possible consequences, including chance event outcomes, resource costs, and utility.
 It is one way to display an algorithm that contains only conditional control statements.
 Decision Tree: the decision tree is one of the most powerful and popular tools for classification and prediction.
 A decision tree is a flowchart-like tree structure, where each internal node denotes a test on an attribute, each branch represents an outcome of the test, and each leaf node (terminal node) holds a class label (a printed example follows).
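To make the node/branch/leaf terminology concrete, a small sketch assuming scikit-learn; the built-in iris dataset simply stands in for any labeled data.

```python
# Print a fitted decision tree: internal nodes test an attribute, branches are test
# outcomes, and each leaf holds a class label.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

iris = load_iris()
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(iris.data, iris.target)

print(export_text(tree, feature_names=list(iris.feature_names)))
```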
 Statistical learning theory is a framework
for machine learning drawing from the fields
of statistics and functional analysis.
 Statistical learning theory deals with the
problem of finding a predictive function
based on data.
 Statistical learning theory has led to
successful applications in fields such
as computer vision, speech recognition,
and bioinformatics.
 The goals of learning are understanding and prediction.
 Learning falls into many categories, including supervised
learning, unsupervised learning, online learning,
and reinforcement learning.
 From the perspective of statistical learning theory,
supervised learning is best understood.
 Supervised learning involves learning from a training
set of data.
 Every point in the training set is an input-output pair, where the input maps to an output.
 The learning problem consists of inferring the function that maps between the input and the output, such that the learned function can be used to predict the output from future inputs (a small fitting sketch follows).
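A rough sketch of this setup, assuming NumPy: infer a predictive function from input-output training pairs, then apply it to a future input; the data points are invented.

```python
# Statistical-learning view: infer f from training pairs, predict outputs for new inputs.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])   # inputs
y = np.array([1.1, 2.9, 5.2, 7.1, 8.8])   # observed outputs (roughly y = 2x + 1)

# Fit f(x) = a*x + b by least squares on the training pairs.
a, b = np.polyfit(x, y, deg=1)

print(a * 5.0 + b)                         # predicted output for the unseen input x = 5
```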
 Naïve Bayes models
 The Naïve Bayes classifier is one of the simplest and most effective classification algorithms; it helps in building fast machine learning models that can make quick predictions.
 It is a probabilistic classifier, which means it predicts on the basis of the probability of an object.
 Naive Bayes classifiers are a collection of classification algorithms based on Bayes’ Theorem.
 It is not a single algorithm but a family of algorithms that share a common principle, i.e. every pair of features being classified is independent of each other.
 Bayes’ theorem has many uses in probability theory and statistics.
 There is very little chance that you have never heard of this theorem before.
 This theorem has found its way into the world of machine learning, where it forms the basis of one of the most widely used algorithms.
 In this unit, we will learn about the Naive Bayes algorithm, along with its variations for different purposes in machine learning.
 Naive Bayes is a simple technique for constructing classifiers:
models that assign class labels to problem instances, represented
as vectors of feature values, where the class labels are drawn from
some finite set.
 There is not a single algorithm for training such classifiers, but a
family of algorithms based on a common principle: all naive Bayes
classifiers assume that the value of a particular feature
is independent of the value of any other feature, given the class
variable.
 For example, a fruit may be considered to be an apple if it is red,
round, and about 10 cm in diameter.
 A naive Bayes classifier considers each of these features to
contribute independently to the probability that this fruit is an
apple, regardless of any possible correlations between the color,
roundness, and diameter features.
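A minimal sketch of the apple example as a naive Bayes classifier, assuming scikit-learn; the numeric encodings of color, roundness, and diameter are made up for illustration.

```python
# Naive Bayes treats each feature as contributing independently to P(class | features).
from sklearn.naive_bayes import GaussianNB

# Features: [redness (0-1), roundness (0-1), diameter in cm]
X = [[0.9, 0.95, 10.0],   # apples
     [0.8, 0.90,  9.0],
     [0.2, 0.30, 18.0],   # bananas
     [0.1, 0.25, 20.0]]
y = ["apple", "apple", "banana", "banana"]

nb = GaussianNB().fit(X, y)
print(nb.predict([[0.85, 0.9, 9.5]]))        # -> ['apple']
print(nb.predict_proba([[0.85, 0.9, 9.5]]))  # per-class probabilities
```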
 For some types of probability models, naive Bayes classifiers can be
trained very efficiently in a supervised learning setting.
 In many practical applications, parameter estimation for naive Bayes
models uses the method of maximum likelihood; in other words, one can
work with the naive Bayes model without accepting Bayesian
probability or using any Bayesian methods.
 Despite their naive design and apparently oversimplified assumptions,
naive Bayes classifiers have worked quite well in many complex real-
world situations.
 In 2004, an analysis of the Bayesian classification problem showed that
there are sound theoretical reasons for the apparently
implausible efficacy of naive Bayes classifiers.
 Still, a comprehensive comparison with other classification algorithms in
2006 showed that Bayes classification is outperformed by other
approaches, such as boosted trees or random forests.
 An advantage of naive Bayes is that it requires only a small amount of training data to estimate the parameters necessary for classification.
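To make maximum-likelihood parameter estimation concrete: for a categorical naive Bayes model the maximum-likelihood estimates are simply relative frequencies counted from the training data. The toy color/fruit pairs below are invented.

```python
# MLE for a categorical naive Bayes model amounts to counting:
# P(class) and P(feature value | class) are estimated as relative frequencies.
from collections import Counter

data = [("red", "apple"), ("red", "apple"), ("green", "apple"),
        ("yellow", "banana"), ("yellow", "banana"), ("green", "banana")]

class_counts = Counter(label for _, label in data)
pair_counts = Counter(data)

p_apple = class_counts["apple"] / len(data)                               # prior P(apple)
p_red_given_apple = pair_counts[("red", "apple")] / class_counts["apple"] # P(red | apple)
print(p_apple, p_red_given_apple)   # 0.5 0.666...
```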
 The EM algorithm is a very general algorithm used to learn probabilistic models in which variables are hidden; that is, some of the variables are not observed.
 Models with hidden variables are sometimes
called latent variable models.
 In statistics, the Expectation–Maximisation (EM) algorithm is an iterative method for finding (local) maximum likelihood or maximum a posteriori (MAP) estimates of parameters in statistical models where the model depends on unobserved latent variables.
 Each EM iteration alternates between an expectation (E) step, which constructs a function for the expectation of the log-likelihood evaluated using the current estimate of the parameters,
 and a maximisation (M) step, which computes the parameters that maximise the expected log-likelihood found in the E step. These parameter estimates are then used to determine the distribution of the latent variables in the next E step (a minimal sketch follows).
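A minimal sketch, assuming NumPy, of the E step and M step for a two-component one-dimensional Gaussian mixture; which component generated each point is the hidden (latent) variable, and all numbers are synthetic.

```python
# EM for a 1-D mixture of two Gaussians (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 200), rng.normal(5, 1, 200)])  # component is hidden

mu, var, pi = np.array([-1.0, 1.0]), np.array([1.0, 1.0]), np.array([0.5, 0.5])

def gauss(x, m, v):
    return np.exp(-(x - m) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)

for _ in range(50):
    # E step: responsibilities = posterior over the latent component for each point.
    r = pi * np.stack([gauss(x, mu[k], var[k]) for k in range(2)], axis=1)
    r /= r.sum(axis=1, keepdims=True)
    # M step: parameters that maximise the expected complete-data log-likelihood.
    nk = r.sum(axis=0)
    mu = (r * x[:, None]).sum(axis=0) / nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    pi = nk / len(x)

print(mu)   # approaches the true component means, roughly [0, 5]
```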
 This algorithm lies at the heart of many unsupervised clustering algorithms in the field of machine learning.
 It was explained, proposed, and given its name in a paper published in 1977 by Arthur Dempster, Nan Laird, and Donald Rubin. It is used to find the local maximum likelihood parameters of a statistical model in cases where latent variables are involved.
 It can be used to fill in missing data in a sample.
 It can be used as the basis of unsupervised learning of clusters.
 It can be used to estimate the parameters of a Hidden Markov Model (HMM).
 It can be used to find the values of latent variables.
 It has slow convergence.
 It converges only to a local optimum.
 It requires both the forward and backward probabilities (numerical optimisation requires only the forward probability).
 Reinforcement learning (RL) is an area of machine learning concerned with how software agents ought to take actions in an environment in order to maximize some notion of cumulative reward.
 Reinforcement learning is one of three basic
machine learning paradigms,
alongside supervised
learning and unsupervised learning.
 Reinforcement learning differs from supervised learning in not needing labelled input/output pairs to be presented, and in not needing sub-optimal actions to be explicitly corrected.
 Instead, the focus is on finding a balance between exploration (of uncharted territory) and exploitation (of current knowledge).
 The environment is typically stated in the form
of a Markov decision process (MDP), because
many reinforcement learning algorithms for this
context use dynamic programming techniques.
 The main difference between the classical
dynamic programming methods and
reinforcement learning algorithms is that the
latter do not assume knowledge of an exact
mathematical model of the MDP and they target
large MDPs where exact methods become
infeasible.
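As a rough illustration (not from the slides) of learning in an MDP without knowing its exact model, a minimal tabular Q-learning sketch with an epsilon-greedy exploration/exploitation trade-off; the five-state corridor environment is invented.

```python
# Tabular Q-learning on a tiny corridor MDP: reward 1 only at the rightmost state.
import random

N_STATES, GOAL = 5, 4
Q = [[0.0, 0.0] for _ in range(N_STATES)]   # Q[state][action]; actions: 0 = left, 1 = right
alpha, gamma, epsilon = 0.1, 0.9, 0.1

def step(s, a):
    s2 = max(0, s - 1) if a == 0 else min(N_STATES - 1, s + 1)
    return s2, (1.0 if s2 == GOAL else 0.0), s2 == GOAL

for _ in range(500):                        # episodes
    s, done = 0, False
    while not done:
        # Exploration vs exploitation: random action with probability epsilon, else greedy.
        a = random.randrange(2) if random.random() < epsilon else Q[s].index(max(Q[s]))
        s2, r, done = step(s, a)
        # Q-learning update toward the bootstrapped target r + gamma * max_a' Q(s', a').
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2

print([round(max(q), 2) for q in Q])        # learned values increase toward the goal
```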