Naive Bayes Classifier
Bayesian Methods
 Our focus this lecture
 Learning and classification methods based on
probability theory.
 Bayes theorem plays a critical role in probabilistic
learning and classification.
 Uses prior probability of each category given no
information about an item.
 Categorization produces a posterior probability
distribution over the possible categories given a
description of an item.
Basic Probability Formulas
 Product rule: $P(A \wedge B) = P(A \mid B)\,P(B) = P(B \mid A)\,P(A)$
 Sum rule: $P(A \vee B) = P(A) + P(B) - P(A \wedge B)$
 Bayes theorem: $P(h \mid D) = \dfrac{P(D \mid h)\,P(h)}{P(D)}$
 Theorem of total probability: if events $A_1, \dots, A_n$ are mutually exclusive and their probabilities sum to 1, then
$P(B) = \sum_{i=1}^{n} P(B \mid A_i)\,P(A_i)$
Bayes Theorem
 Given a hypothesis h and data D which bears on the
hypothesis:
 P(h): probability of h independent of the data: the prior probability
 P(D): probability of the data D independent of the hypothesis
 P(D|h): conditional probability of D given h: the likelihood
 P(h|D): conditional probability of h given D: the posterior probability
$$P(h \mid D) = \frac{P(D \mid h)\,P(h)}{P(D)}$$
Does patient have cancer or not?
 A patient takes a lab test and the result comes back positive. The test returns a correct positive result in only 99% of the cases in which the disease is present, and a correct negative result in only 95% of the cases in which it is absent. Furthermore, only 0.03 (3%) of the entire population has this disease.
1. What is the probability that this patient has cancer?
2. What is the probability that he does not have cancer?
3. What is the diagnosis?
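A worked sketch of the reasoning (assuming the 0.03 figure is the prior $P(cancer)$, and writing $+$ for a positive test result):

$P(+ \mid cancer) = 0.99$, $P(+ \mid \neg cancer) = 1 - 0.95 = 0.05$, $P(cancer) = 0.03$
$P(cancer \mid +) \propto P(+ \mid cancer)\,P(cancer) = 0.99 \times 0.03 = 0.0297$
$P(\neg cancer \mid +) \propto P(+ \mid \neg cancer)\,P(\neg cancer) = 0.05 \times 0.97 = 0.0485$
Normalizing: $P(cancer \mid +) = 0.0297 / (0.0297 + 0.0485) \approx 0.38$, so $P(\neg cancer \mid +) \approx 0.62$
Diagnosis: the more probable hypothesis is $\neg cancer$, even though the posterior for cancer is far above the 3% prior.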
Maximum A Posteriori
 Based on Bayes theorem, we can compute the Maximum A Posteriori (MAP) hypothesis for the data.
 We are interested in the best hypothesis in some space H given the observed training data D.
$$h_{MAP} = \arg\max_{h \in H} P(h \mid D) = \arg\max_{h \in H} \frac{P(D \mid h)\,P(h)}{P(D)} = \arg\max_{h \in H} P(D \mid h)\,P(h)$$
H: the set of all hypotheses.
Note that we can drop P(D) as the probability of the data is constant
(and independent of the hypothesis).
Maximum Likelihood
 Now assume that all hypotheses are equally probable a priori, i.e., P(hi) = P(hj) for all hi, hj in H.
 This is called assuming a uniform prior. It
simplifies computing the posterior:
 This hypothesis is called the maximum
likelihood hypothesis.
$$h_{ML} = \arg\max_{h \in H} P(D \mid h)$$
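A minimal Python sketch contrasting the two rules over a discrete hypothesis space (the priors and likelihoods are illustrative numbers, not from the lecture):

```python
# Illustrative two-hypothesis space; numbers are made up for the sketch.
priors = {"h1": 0.7, "h2": 0.3}        # P(h)
likelihoods = {"h1": 0.5, "h2": 0.9}   # P(D | h) for the observed data D

# MAP: maximize P(D | h) * P(h); P(D) is constant so it can be dropped.
h_map = max(priors, key=lambda h: likelihoods[h] * priors[h])

# ML: assume a uniform prior and maximize the likelihood alone.
h_ml = max(likelihoods, key=likelihoods.get)

print(h_map)  # h1 (0.5 * 0.7 = 0.35 beats 0.9 * 0.3 = 0.27)
print(h_ml)   # h2 (0.9 beats 0.5)
```

The two rules can disagree, as here: a strong prior for h1 outweighs h2's higher likelihood.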
Desirable Properties of Bayes Classifier
 Incrementality: with each training example,
the prior and the likelihood can be updated
dynamically: flexible and robust to errors.
 Combines prior knowledge and observed data: the prior probability of a hypothesis is multiplied by the likelihood of the data given that hypothesis.
 Probabilistic hypothesis: outputs not only a
classification, but a probability distribution
over all classes
Bayes Classifiers
Assumption: the training set consists of instances of different classes cj, each described as a conjunction of attribute values.
Task: classify a new instance d, given as a tuple of attribute values, into one of the classes cj ∈ C.
Key idea: assign the most probable class using Bayes theorem.
$$c_{MAP} = \arg\max_{c_j \in C} P(c_j \mid x_1, x_2, \dots, x_n) = \arg\max_{c_j \in C} \frac{P(x_1, x_2, \dots, x_n \mid c_j)\,P(c_j)}{P(x_1, x_2, \dots, x_n)} = \arg\max_{c_j \in C} P(x_1, x_2, \dots, x_n \mid c_j)\,P(c_j)$$
Parameter estimation
 P(cj)
 Can be estimated from the frequency of classes in the training examples.
 P(x1,x2,…,xn|cj)
 O(|X|^n · |C|) parameters.
 Could only be estimated if a very, very large number of training examples was available.
 Independence assumption: attribute values are conditionally independent given the target value: naïve Bayes.
$$P(x_1, x_2, \dots, x_n \mid c_j) = \prod_i P(x_i \mid c_j)$$
$$c_{NB} = \arg\max_{c_j \in C} P(c_j) \prod_i P(x_i \mid c_j)$$
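A minimal sketch of this estimation step in Python, assuming categorical attributes and a list of (attributes, label) training pairs (the function name and data layout are illustrative, not from the lecture):

```python
from collections import Counter, defaultdict

def train_naive_bayes(examples):
    """Estimate P(c_j) and P(x_i | c_j) as relative frequencies.

    examples: list of (attribute_dict, class_label) pairs.
    """
    n = len(examples)
    class_counts = Counter(label for _, label in examples)
    priors = {c: k / n for c, k in class_counts.items()}  # P(c_j)

    # counts[c][attribute][value] = number of class-c examples with that value
    counts = defaultdict(lambda: defaultdict(Counter))
    for attrs, label in examples:
        for attr, value in attrs.items():
            counts[label][attr][value] += 1

    # P(x_i | c_j) = count(x_i, c_j) / count(c_j)
    likelihoods = {c: {attr: {v: k / class_counts[c]
                              for v, k in vals.items()}
                       for attr, vals in attr_counts.items()}
                   for c, attr_counts in counts.items()}
    return priors, likelihoods
```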
Properties
 Estimating $P(x_i \mid c_j)$ instead of $P(x_1, x_2, \dots, x_n \mid c_j)$ greatly reduces the number of parameters (and the data sparseness).
 The learning step in Naïve Bayes consists of estimating $P(c_j)$ and $P(x_i \mid c_j)$ based on the frequencies in the training data.
 An unseen instance is classified by computing the class that maximizes the posterior.
 When conditional independence is satisfied, Naïve Bayes corresponds to MAP classification.
Example. ‘Play Tennis’ data
Day Outlook Temperature Humidity Wind PlayTennis
Day1 Sunny Hot High Weak No
Day2 Sunny Hot High Strong No
Day3 Overcast Hot High Weak Yes
Day4 Rain Mild High Weak Yes
Day5 Rain Cool Normal Weak Yes
Day6 Rain Cool Normal Strong No
Day7 Overcast Cool Normal Strong Yes
Day8 Sunny Mild High Weak No
Day9 Sunny Cool Normal Weak Yes
Day10 Rain Mild Normal Weak Yes
Day11 Sunny Mild Normal Strong Yes
Day12 Overcast Mild High Strong Yes
Day13 Overcast Hot Normal Weak Yes
Day14 Rain Mild High Strong No
Question: For the day <sunny, cool, high, strong>, what’s
the play prediction?
Naive Bayes solution
Classify any new datum instance x = (a1, …, aT) as:
$$h_{NB} = \arg\max_h P(h)\,P(\mathbf{x} \mid h) = \arg\max_h P(h) \prod_t P(a_t \mid h)$$
 To do this, we need to estimate the parameters from the training examples:
 For each target value (hypothesis) h: $\hat{P}(h)$, an estimate of $P(h)$
 For each attribute value at of each datum instance: $\hat{P}(a_t \mid h)$, an estimate of $P(a_t \mid h)$
Based on the examples in the table, classify the following datum x:
x=(Outl=Sunny, Temp=Cool, Hum=High, Wind=strong)
 That means: Play tennis or not?
 Working:
$h_{NB} = \arg\max_{h \in \{yes,\,no\}} P(h)\,P(\mathbf{x} \mid h) = \arg\max_{h \in \{yes,\,no\}} P(h) \prod_t P(a_t \mid h)$
$= \arg\max_{h \in \{yes,\,no\}} P(h)\,P(Outlook = sunny \mid h)\,P(Temp = cool \mid h)\,P(Humidity = high \mid h)\,P(Wind = strong \mid h)$

Estimates from the table:
$P(PlayTennis = yes) = 9/14 = 0.64$
$P(PlayTennis = no) = 5/14 = 0.36$
$P(Wind = strong \mid PlayTennis = yes) = 3/9 = 0.33$
$P(Wind = strong \mid PlayTennis = no) = 3/5 = 0.60$
etc.

Plugging in:
$P(yes)\,P(sunny \mid yes)\,P(cool \mid yes)\,P(high \mid yes)\,P(strong \mid yes) = 0.0053$
$P(no)\,P(sunny \mid no)\,P(cool \mid no)\,P(high \mid no)\,P(strong \mid no) = 0.0206$

Answer: $PlayTennis(\mathbf{x}) = no$
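A short Python check of these numbers, with each factor read directly off the table above:

```python
# x = (Outlook=sunny, Temp=cool, Humidity=high, Wind=strong)
# Each factor is a relative frequency taken from the 14-day table.
p_yes = 9/14 * 2/9 * 3/9 * 3/9 * 3/9  # P(yes) P(sunny|yes) P(cool|yes) P(high|yes) P(strong|yes)
p_no  = 5/14 * 3/5 * 1/5 * 4/5 * 3/5  # P(no)  P(sunny|no)  P(cool|no)  P(high|no)  P(strong|no)

print(f"{p_yes:.4f}")                   # 0.0053
print(f"{p_no:.4f}")                    # 0.0206
print("yes" if p_yes > p_no else "no")  # no
```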
Underflow Prevention
 Multiplying lots of probabilities, which are
between 0 and 1 by definition, can result in
floating-point underflow.
 Since log(xy) = log(x) + log(y), it is better to
perform all computations by summing logs of
probabilities rather than multiplying
probabilities.
 Class with highest final un-normalized log
probability score is still the most probable.
$$c_{NB} = \arg\max_{c_j \in C} \Big[ \log P(c_j) + \sum_{i \in \text{positions}} \log P(x_i \mid c_j) \Big]$$
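A minimal sketch of the same Play Tennis decision computed in log space:

```python
import math

# Same factors as the worked example, but summed in log space.
factors_yes = [9/14, 2/9, 3/9, 3/9, 3/9]
factors_no  = [5/14, 3/5, 1/5, 4/5, 3/5]

log_yes = sum(math.log(p) for p in factors_yes)
log_no  = sum(math.log(p) for p in factors_no)

# log is monotonic, so the class with the highest un-normalized
# log score is still the most probable.
print("yes" if log_yes > log_no else "no")  # no
```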