1
Pattern Recognition
Probability Review
2
Why Bother About Probabilities?
• Accounting for uncertainty is a crucial component
in decision making (e.g., classification) because of
ambiguity in our measurements.
• Probability theory is the proper mechanism for
accounting for uncertainty.
• Need to take into account reasonable preferences
about the state of the world, for example:
"If the fish was caught in the Atlantic ocean, then
it is more likely to be salmon than sea-bass
3
Definitions
• Random experiment
– An experiment whose result is not certain in advance
(e.g., throwing a die)
• Outcome
– The result of a random experiment
• Sample space
– The set of all possible outcomes (e.g., {1,2,3,4,5,6})
• Event
– A subset of the sample space (e.g., obtain an odd
number in the experiment of throwing a die = {1,3,5})
4
Intuitive Formulation
• Intuitively, the probability of an event a could be defined as the relative
frequency with which a occurs, where N(a) is the number of times that event a
happens in n trials (see the formula below).
• Assumes that all outcomes in the sample space are equally
likely (Laplacian definition)
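A minimal statement of the relative-frequency definition referred to above, assuming the formula lost from the slide is the usual limiting ratio:

$P(a) = \lim_{n \to \infty} \dfrac{N(a)}{n}$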
5
Axioms of Probability
(Slide figure: axioms A1–A4, shown with a Venn diagram of an event B inside the sample space S.)
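For reference, a standard formulation of the axioms, on the assumption that A1–A4 correspond to the usual Kolmogorov axioms plus the addition rule for arbitrary events:

A1: $0 \le P(A) \le 1$
A2: $P(S) = 1$
A3: if $A \cap B = \varnothing$, then $P(A \cup B) = P(A) + P(B)$
A4: for arbitrary events, $P(A \cup B) = P(A) + P(B) - P(A, B)$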
6
Prior (Unconditional) Probability
• This is the probability of an event prior to the arrival
of any evidence.
P(Cavity)=0.1 means that “in the absence of any
other information, there is a 10% chance that the
patient has a cavity”.
7
Posterior (Conditional) Probability
• This is the probability of an event given some
evidence.
P(Cavity/Toothache)=0.8 means that “there is an
80% chance that the patient has a cavity
given that he has a toothache”.
8
Posterior (Conditional) Probability
(cont’d)
• Conditional probabilities can be defined in terms of
unconditional probabilities:
$P(A/B) = \dfrac{P(A, B)}{P(B)}, \qquad P(B/A) = \dfrac{P(A, B)}{P(A)}$
• Conditional probabilities lead to the chain rule:
P(A,B)=P(A/B)P(B)=P(B/A)P(A)
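As a quick check that ties in the numbers used elsewhere in the deck, take the joint table given later (slide 26), where P(Cavity,Toothache)=0.04 and P(¬Cavity,Toothache)=0.01:

$P(Cavity/Toothache) = \dfrac{P(Cavity, Toothache)}{P(Toothache)} = \dfrac{0.04}{0.04 + 0.01} = 0.8$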
9
Law of Total Probability
• If A1, A2, …, An is a partition of mutually exclusive events
and B is any event, then
$P(B) = \sum_{j=1}^{n} P(B, A_j)$
• Special case (any two events A and B):
$P(A) = P(A, B) + P(A, \neg B) = P(A/B)\,P(B) + P(A/\neg B)\,P(\neg B)$
• Using the chain rule, we can rewrite the law of total
probability using conditional probabilities:
$P(B) = P(B/A_1)\,P(A_1) + P(B/A_2)\,P(A_2) + \dots + P(B/A_n)\,P(A_n) = \sum_{j=1}^{n} P(B/A_j)\,P(A_j)$
(Slide figure: Venn diagram of the event B overlapping the partition A1, A2, A3, A4 of the sample space S.)
10
Law of Total Probability: Example
• My mood can take one of two values
– Happy, Sad
• The weather can take one of three values
– Rainy, Sunny, Cloudy
• We can compute P(Happy) and P(Sad) as follows:
P(Happy)=P(Happy/Rainy)P(Rainy)+P(Happy/Sunny)P(Sunny)+P(Happy/Cloudy)P(Cloudy)
P(Sad)=P(Sad/Rainy)P(Rainy)+P(Sad/Sunny)P(Sunny)+P(Sad/Cloudy)P(Cloudy)
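A minimal numeric sketch of this computation; the weather prior and the conditional mood probabilities below are made-up illustration values, not taken from the slides.

```python
# Law of total probability: P(Happy) = sum_w P(Happy / w) P(w)
# All numbers below are hypothetical illustration values.
p_weather = {"Rainy": 0.3, "Sunny": 0.5, "Cloudy": 0.2}      # prior P(w), sums to 1
p_happy_given = {"Rainy": 0.4, "Sunny": 0.9, "Cloudy": 0.6}  # P(Happy / w)

p_happy = sum(p_happy_given[w] * p_weather[w] for w in p_weather)
p_sad = 1.0 - p_happy   # Mood takes only two values, so P(Sad) = 1 - P(Happy)
print(p_happy, p_sad)   # 0.69 0.31
```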
11
Bayes’ Theorem
• Conditional probabilities lead to Bayes’ rule:
$P(A/B) = \dfrac{P(B/A)\, P(A)}{P(B)}$
where
$P(B) = P(B, A) + P(B, \neg A) = P(B/A)\, P(A) + P(B/\neg A)\, P(\neg A)$
• Example: consider the probability of Disease given Symptom:
$P(Disease/Symptom) = \dfrac{P(Symptom/Disease)\, P(Disease)}{P(Symptom)}$
where
$P(Symptom) = P(Symptom/Disease)\, P(Disease) + P(Symptom/\neg Disease)\, P(\neg Disease)$
12
Bayes’ Theorem Example
• Meningitis causes a stiff neck 50% of the time.
• A patient comes in with a stiff neck – what is the probability
that he has meningitis?
• Need to know two things:
– The prior probability of a patient having meningitis (1/50,000)
– The prior probability of a patient having a stiff neck (1/20)
P(M/S)=0.0002
$P(M/S) = \dfrac{P(S/M)\, P(M)}{P(S)}$
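Plugging in the numbers from the bullets above (P(S/M)=0.5, P(M)=1/50,000, P(S)=1/20):

$P(M/S) = \dfrac{0.5 \times (1/50{,}000)}{1/20} = \dfrac{0.00001}{0.05} = 0.0002$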
13
General Form of Bayes’ Rule
• If A1, A2, …, An is a partition of mutually exclusive events
and B is any event, then the Bayes’ rule is given by:
$P(A_i/B) = \dfrac{P(B/A_i)\, P(A_i)}{P(B)}$
where
$P(B) = \sum_{j=1}^{n} P(B/A_j)\, P(A_j)$
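A minimal sketch of this computation in code; the priors and likelihoods below are hypothetical numbers chosen only to illustrate the formula.

```python
import numpy as np

# Hypothetical priors P(A_i) over a partition of three events, and
# likelihoods P(B / A_i) of the observed event B under each A_i.
prior = np.array([0.5, 0.3, 0.2])          # must sum to 1
likelihood = np.array([0.10, 0.40, 0.80])  # P(B / A_i)

evidence = np.sum(likelihood * prior)       # P(B) = sum_j P(B/A_j) P(A_j)
posterior = likelihood * prior / evidence   # P(A_i / B), Bayes' rule
print(evidence, posterior, posterior.sum()) # posteriors sum to 1
```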
14
Independence
• Two events A and B are independent iff:
P(A,B)=P(A)P(B)
• From the above formula, we can show:
P(A/B)=P(A) and P(B/A)=P(B)
• A and B are conditionally independent given C iff:
P(A/B,C)=P(A/C)
e.g., P(WetGrass/Season,Rain)=P(WetGrass/Rain)
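A quick concrete check, using the joint table that appears later in the deck (slide 26):

$P(Cavity)\,P(Toothache) = 0.1 \times 0.05 = 0.005 \;\neq\; 0.04 = P(Cavity, Toothache)$

so Cavity and Toothache are not independent events.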
15
Random Variables
• In many experiments, it is easier to deal with a summary
variable than with the original probability structure.
• Example: in an opinion poll, we ask 50 people whether
they agree or disagree with a certain issue.
– Suppose we record a "1" for agree and a "0" for disagree.
– The sample space for this experiment has 2^50 elements.
– Suppose we are only interested in the number of people who agree.
– Define the variable X = number of "1"s recorded out of 50.
– Easier to deal with this sample space (it has only 51 elements).
16
Random Variables (cont’d)
• A random variable (r.v.) is the value we assign to the
outcome of a random experiment (i.e., a function that
assigns a real number to each outcome).
(Slide figure: the mapping X(·) from outcomes in the sample space to values on the real line.)
17
Random Variables (cont’d)
• How is the probability function of the random variable
defined from the probability function of the original
sample space?
– Suppose the sample space is S = {s1, s2, …, sn}
– Suppose the range of the random variable X is {x1, x2, …, xm}
– We observe X = xj iff the outcome of the random experiment is an
sj ∈ S such that X(sj) = xj, i.e.,
P(X = xj) = P({sj ∈ S : X(sj) = xj})
e.g., What is P(X=2) in the previous example?
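A sketch of that computation under the added assumption, not stated on the slide, that each of the 50 people answers independently and agrees with probability p; then X is binomial.

```python
from math import comb

# P(X = 2) for the 50-person poll, assuming each person independently
# answers "agree" with probability p (an assumption for illustration).
p = 0.5                      # hypothetical value
n, k = 50, 2
p_x_equals_2 = comb(n, k) * p**k * (1 - p)**(n - k)  # binomial pmf
print(p_x_equals_2)          # ~1.09e-12 for p = 0.5
```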
18
Discrete/Continuous Random
Variables
• A discrete r.v. can assume only a countable number of values.
• Consider the experiment of throwing a pair of dice
• A continuous r.v. can assume a range of values (e.g., sensor
readings).
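A small sketch of a discrete pmf for this experiment, taking X to be the sum of the two dice (our illustrative choice; the slide does not fix a particular r.v.).

```python
from collections import Counter
from itertools import product

# pmf of a discrete r.v.: here X = sum of two fair dice.
outcomes = list(product(range(1, 7), repeat=2))   # 36 equally likely outcomes
counts = Counter(a + b for a, b in outcomes)
pmf = {x: c / len(outcomes) for x, c in counts.items()}
print(pmf[7])              # 6/36 ~ 0.1667, the most likely sum
print(sum(pmf.values()))   # ~1.0
```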
19
Probability mass (pmf) and
density function (pdf)
• The pmf /pdf of a r.v. X assigns a probability for each
possible value of X.
• Warning: given two r.v.'s, X and Y, their pmf/pdf are denoted as
pX(x) and pY(y); for convenience, we will drop the subscripts and
denote them as p(x) and p(y), however, keep in mind that these
functions are different !
20
Probability mass (pmf) and
density function (pdf) (cont’d)
• Some properties of the pmf and pdf:
21
Probability Distribution Function
(PDF)
• With every r.v., we associate a function called probability
distribution function (PDF) which is defined as follows:
$F(x) = P(X \le x)$
• Some properties of the PDF are:
– (1) $0 \le F(x) \le 1$
– (2) F(x) is a non-decreasing function of x
• If X is discrete, its PDF can be computed as follows:
$F(x) = P(X \le x) = \sum_{k=0}^{x} P(X = k) = \sum_{k=0}^{x} p(k)$
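A minimal sketch of computing the PDF of a discrete r.v. from its pmf by cumulative summation; the pmf values below are hypothetical.

```python
import numpy as np

# PDF of a discrete r.v. from its pmf: F(x) = sum_{k <= x} p(k).
values = np.array([0, 1, 2, 3])
pmf = np.array([0.1, 0.4, 0.3, 0.2])   # hypothetical pmf, sums to 1

cdf = np.cumsum(pmf)                   # F evaluated at each value
print(dict(zip(values.tolist(), cdf.round(2).tolist())))  # {0: 0.1, 1: 0.5, 2: 0.8, 3: 1.0}
```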
22
Probability Distribution Function
(PDF) (cont’d)
23
Probability mass (pmf) and
density function (pdf) (cont’d)
• If X is continuous, its PDF can be computed as follows:
$F(x) = \int_{-\infty}^{x} p(t)\, dt \quad \text{for all } x$
• Using the above formula, it can be shown that:
$p(x) = \dfrac{dF(x)}{dx}$
24
Probability mass (pmf) and
density function (pdf) (cont’d)
• Example: the Gaussian pdf and PDF
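A brief sketch of the Gaussian example using scipy.stats.norm (a library choice of ours, not something the slides rely on), which also checks the relation p(x) = dF(x)/dx from the previous slide numerically.

```python
from scipy.stats import norm   # standard normal by default (mean 0, std 1)

x = 1.0
pdf_at_x = norm.pdf(x)         # Gaussian pdf p(x)
cdf_at_x = norm.cdf(x)         # Gaussian PDF F(x) = P(X <= x)

# F'(x) should recover p(x): check with a central finite difference.
h = 1e-5
deriv = (norm.cdf(x + h) - norm.cdf(x - h)) / (2 * h)
print(pdf_at_x, cdf_at_x, deriv)   # pdf ~ 0.2420, cdf ~ 0.8413, deriv ~ pdf
```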
25
Joint pmf (discrete r.v.)
• For n random variables, the joint pmf assigns a
probability for each possible combination of
values:
p(x1,x2,…,xn)=P(X1=x1, X2=x2, …, Xn=xn)
Warning: the joint pmf /pdf of the r.v.'s X1, X2, ..., Xn and Y1, Y2, ...,
Yn are denoted as pX1X2...Xn(x1,x2,...,xn) and pY1Y2...Yn(y1,y2,...,yn); for
convenience, we will drop the subscripts and denote them p(x1,x2,...,xn)
and p(y1,y2,...,yn), keep in mind, however, that these are two different
functions.
26
Joint pmf (discrete r.v.) (cont’d)
• Specifying the joint pmf requires an enormous number of
values
– k^n values, assuming n random variables where each one can assume one
of k discrete values.
– things become much simpler if we assume independence or conditional
independence …
P(Cavity, Toothache) is a 2 x 2 matrix (a joint probability; the entries sum to 1.0):

                Toothache    Not Toothache
Cavity            0.04            0.06
Not Cavity        0.01            0.89
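The same table in code: a small numpy sketch that checks the total and recovers the marginals and the conditional used on slide 7.

```python
import numpy as np

# Joint pmf P(Cavity, Toothache) from the slide, as a 2 x 2 array:
# rows: Cavity / Not Cavity, columns: Toothache / Not Toothache.
joint = np.array([[0.04, 0.06],
                  [0.01, 0.89]])

print(joint.sum())                  # 1.0  (probabilities sum to one)
p_cavity = joint[0].sum()           # marginal P(Cavity) = 0.10
p_toothache = joint[:, 0].sum()     # marginal P(Toothache) = 0.05
p_cavity_given_toothache = joint[0, 0] / p_toothache   # = 0.8, as on slide 7
print(p_cavity, p_toothache, p_cavity_given_toothache)
```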
27
Joint pdf (continuous r.v.)
For n random variables, the joint pdf assigns a probability density
to each possible combination of values:
$p(x_1, x_2, \dots, x_n) \ge 0$
$\int_{x_1} \cdots \int_{x_n} p(x_1, x_2, \dots, x_n)\, dx_1 \cdots dx_n = 1$
28
Discrete/Continuous
Probability Distributions
29
Interesting Properties
• The conditional pdf can be derived from the joint
pdf:
$p(y/x) = \dfrac{p(x, y)}{p(x)} \quad\text{or}\quad p(x, y) = p(y/x)\, p(x)$
• Conditional pdfs lead to the chain rule (general
form):
$p(x_1, x_2, \dots, x_n) = p(x_1/x_2, \dots, x_n)\, p(x_2/x_3, \dots, x_n) \cdots p(x_{n-1}/x_n)\, p(x_n)$
30
Interesting Properties (cont’d)
• Knowledge about independence between r.v.'s is
very powerful since it simplifies things a lot,
e.g., if X and Y are independent, then:
$p(x, y) = p(x)\, p(y)$
• The law of total probability:
$p(y) = \sum_{x} p(y/x)\, p(x)$
31
Marginalization
• From a joint probability, we can compute the probability of
any subset of the variables by marginalization:
– Example - case of joint pmf:
$p(x) = \sum_{y} p(x, y)$
– Examples - case of joint pdf:
$p(x) = \int_{-\infty}^{\infty} p(x, y)\, dy$
$p(x_1, \dots, x_{i-1}, x_{i+1}, \dots, x_n) = \int_{-\infty}^{\infty} p(x_1, x_2, \dots, x_n)\, dx_i$
$p(x_1, x_2) = \int_{x_3} \cdots \int_{x_n} p(x_1, x_2, \dots, x_n)\, dx_3 \cdots dx_n$
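A small numpy sketch of marginalization for a discrete joint pmf stored as an array; the joint values are randomly generated illustration data.

```python
import numpy as np

# Marginalization of a discrete joint pmf p(x, y, z) stored as a 3-D array.
rng = np.random.default_rng(0)
joint = rng.random((2, 3, 4))
joint /= joint.sum()               # normalize so the pmf sums to 1

p_xy = joint.sum(axis=2)           # p(x, y) = sum_z p(x, y, z)
p_x = joint.sum(axis=(1, 2))       # p(x)    = sum_y sum_z p(x, y, z)
print(p_xy.shape, p_x.shape, p_x.sum())   # (2, 3) (2,) ~1.0
```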
32
Why is the joint pmf
(or pdf) useful?
33
Why is the joint pmf
(or pdf) useful (cont’d)?
34
Probabilistic Inference
• If we could define all possible values for the
probability distribution, then we could read off any
probability we were interested in.
• In general, it is not practical to define all possible
entries for the joint probability function.
• Probabilistic inference consists of computing
probabilities that are not explicitly stored by the
reasoning system (e.g., marginals, conditionals).
35
Normal (Gaussian) Distribution
36
Normal (Gaussian) Distribution
(cont’d)
• Shape and parameters of Gaussian distribution
– number of parameters is $d + \dfrac{d(d+1)}{2}$
– shape determined by Σ
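For reference, the d-dimensional Gaussian density behind this count is

$p(\mathbf{x}) = \dfrac{1}{(2\pi)^{d/2}\, |\Sigma|^{1/2}} \exp\!\left[ -\tfrac{1}{2} (\mathbf{x} - \boldsymbol{\mu})^{t}\, \Sigma^{-1} (\mathbf{x} - \boldsymbol{\mu}) \right]$

so the total splits into d values for the mean μ plus d(d+1)/2 for the symmetric covariance matrix Σ.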
37
Normal (Gaussian) Distribution
(cont’d)
• Mahalanobis distance:
$r^2 = (\mathbf{x} - \boldsymbol{\mu})^{t}\, \Sigma^{-1} (\mathbf{x} - \boldsymbol{\mu})$
• If the variables are independent, the multivariate normal
distribution becomes a product of univariate normals:
$p(\mathbf{x}) = \prod_{i=1}^{d} p(x_i), \qquad p(x_i) = \dfrac{1}{\sqrt{2\pi}\, \sigma_i} \exp\!\left[ -\dfrac{(x_i - \mu_i)^2}{2\sigma_i^2} \right]$
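A short numpy sketch of the Mahalanobis distance computation; μ, Σ, and x below are made-up illustration values.

```python
import numpy as np

# Mahalanobis distance r^2 = (x - mu)^T Sigma^{-1} (x - mu).
mu = np.array([1.0, 2.0])
sigma = np.array([[2.0, 0.3],
                  [0.3, 1.0]])     # covariance matrix (symmetric, positive definite)
x = np.array([2.0, 0.5])

diff = x - mu
r_squared = diff @ np.linalg.inv(sigma) @ diff
print(r_squared)                   # squared Mahalanobis distance of x from mu
```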
38
Expected Value
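For reference, the standard definition behind this slide title, for a discrete and a continuous r.v. respectively:

$E[X] = \sum_{x} x\, p(x) \qquad\qquad E[X] = \int_{-\infty}^{\infty} x\, p(x)\, dx$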
39
Expected Value (cont’d)
40
Properties of the Expected Value
41
Variance and Standard Deviation
42
Covariance
43
Covariance Matrix
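For reference, the standard definition behind this slide title: for a random vector $\mathbf{x}$ with mean $\boldsymbol{\mu}$,

$\Sigma = E\!\left[ (\mathbf{x} - \boldsymbol{\mu})(\mathbf{x} - \boldsymbol{\mu})^{t} \right], \qquad \Sigma_{ij} = E\!\left[ (X_i - \mu_i)(X_j - \mu_j) \right] = \mathrm{Cov}(X_i, X_j), \qquad \Sigma_{ii} = \sigma_i^2$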
44
Uncorrelated r.v.’s
45
Properties of the covariance matrix
46
Covariance Matrix Decomposition
$\Phi^{-1} = \Phi^{t}$
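A small numpy sketch of this decomposition, Σ = Φ Λ Φᵗ with an orthonormal eigenvector matrix Φ (hence Φ⁻¹ = Φᵗ); the covariance matrix used here is a made-up illustration value.

```python
import numpy as np

# Eigendecomposition of a covariance matrix: Sigma = Phi @ Lambda @ Phi.T
sigma = np.array([[2.0, 0.3],
                  [0.3, 1.0]])

eigvals, phi = np.linalg.eigh(sigma)   # eigh: for symmetric matrices
lam = np.diag(eigvals)

print(np.allclose(phi @ lam @ phi.T, sigma))   # True: reconstruction
print(np.allclose(phi.T @ phi, np.eye(2)))     # True: Phi orthonormal, Phi^-1 = Phi^T
```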
47
Linear Transformations
48
Linear Transformations (cont’d)
Whitening transform
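A minimal sketch of the whitening transform named on the slide, using the standard form y = Λ^(-1/2) Φᵗ (x − μ), which gives the transformed data identity covariance; the mean, covariance, and samples below are simulated illustration values.

```python
import numpy as np

# Whitening transform: y = Lambda^{-1/2} Phi^T (x - mu) has identity covariance.
rng = np.random.default_rng(1)
mu = np.array([1.0, -2.0])
sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])
x = rng.multivariate_normal(mu, sigma, size=10000)   # simulated samples

eigvals, phi = np.linalg.eigh(sigma)
whiten = np.diag(1.0 / np.sqrt(eigvals)) @ phi.T     # Lambda^{-1/2} Phi^T
y = (x - mu) @ whiten.T

print(np.cov(y, rowvar=False).round(2))              # ~ identity matrix
```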
49
Moments of r.v.’s