DAT203
Probability Theory and its Applications
Prof. Mohamed Deriche
Professor of AI/ML
Part 1: Introduction and Fundamentals
• Course overview
• Logistics and admin matters
• Motivation and coverage
• Introduction to probabilities
• Approaches to probabilities
• Axiomatic approach
• Probabilities through sets.
• Multiple random experiments
• Permutations and combinations
• Basic models for repeated experiments (Bernoulli, Binomial,
Multinomial and Geometric Probability Laws)
• Examples
Motivation
• The world is not ideal nor deterministic (can you guess the weather tomorrow, the score of a game, or whether the dollar will go up or down?)
• Random signals (values) appear in various applications: e.g, effect of
noise (voice through wireless channels), reading measurements,
maximising chance for something to happen, predicting weather,
predicting faults, etc...
• One way to deal with such phenomena is to leave it to chance!
• Or we need mathematical tools to analyse such signals and extract
useful information so we can make better and more useful decisions.
• Ultimately, we need to use data efficiently and extract useful
information and finally make “good” decision (e.g share market)
Why study probabilities?
• Life is unpredictable. From guessing whether or not a given day will
be sunny or rainy to predicting the likelihood of when a deadly
disease may come
• Such uncertainties make all of us very nervous about the future.
• Fortunately, probability theory comes to save us. Essentially,
probability is a branch of mathematics which predicts through
calculations the various outcomes of a given event.
• Simply, it gives the ability to predict how future events may turn out.
What are probabilities?
• Consider the experiment of throwing a coin, what is
randomness and probability?
• "randomness" is a way of expressing what we don't know.
(forces, orientation, surface smoothness, etc..)
• When we say that something is random, we are saying our
knowledge about outcome is limited, so we can't be
certain what will happen.
• Since the coin is fair, the outcome is random. If we don't know anything about how it was flipped, the probability that it will land heads is 50%, or 1/2. What exactly do we mean?
What are probabilities?
• Since the coin is fair, if we don't know anything about how it was flipped, the probability that it will land heads is 50%, or 1/2. What exactly do we mean?
• One interpretation is in terms of relative frequency (if the experiment is repeated many times).
• Second interpretation of probability is that it quantifies
our degree of subjective belief that something will
happen
Probability (branch of math) is a measure of the likelihood of an event to
occur. Many events cannot be predicted with total certainty. We can predict
only the chance of an event to occur i.e. how likely they are to happen
Example: communication system
• Because of noise, we cannot
confirm that a 1 or 0 was sent
• Like throwing a coin, where the receiver guesses what was sent
• Knowing the probabilistic nature
of the noise, can help us in
performing a good guess on what
was sent
Examples of Real Life Probability
•Planning around the Weather
•Sports Strategies
•Insurance Options
•Business decisions
•Games and Recreational Activities
•Researching a Disease
•Space Exploration and Risk Assessment
•Telecommunications
•Failure and reliability analysis
Course Objectives (DAT 203)
•This course introduces you to Probability theory,
Random variables and Random processes.
•It provides the fundamentals of the theory.
•The course focuses on the application of different probability models to solve practical real-world problems in data analytics.
Course Learning Outcomes
a. Apply the principles and techniques of probability theory to solve data analytics problems.
b. Apply the concepts of discrete and continuous probability distributions.
c. Derive and use the probability density function of transformations of random variables.
d. Use methods from algebra and calculus to derive the moments of random variables and random vectors.
e. Use the concepts of continuous and discrete-time random processes.
f. Apply selected probability distributions to solve problems in random processes.
Coverage
• Introduction to probabilities
• Random variables
• Discrete and continuous probability distributions
• Operation on random variables
• Multiple random variables
• Some important laws of large numbers
• Random Processes
• Applications
Assessment Tools
• Class tests,
• Mid-Term,
• Final,
• Homework,
• Project
Part 1
Introduction to Probability Models
1.1 Introduction
• The world is not ideal nor deterministic
• Random signals (values) appear in various applications: e.g,
effect of noise, reading measurements, maximising chance for
something to happen, predicting weather, predicting faults,
etc...
• One way to deal with such phenomena is to leave it to chance!
• Or we need mathematical tools to analyse such signals and extract useful information so we can make better and more useful decisions.
• Ultimately, we need to use data efficiently and extract useful
information and finally make “good” decision (e.g share market)
Why study probabilities?
• Life is unpredictable. From guessing whether or not a given day
will be sunny or rainy to predicting the likelihood of when a
deadly disease may come
• Such uncertainties make all of us very nervous about the future.
• Fortunately, probability theory comes to save us. Essentially,
probability is a branch of mathematics which predicts through
calculations the various outcomes of a given event.
• Simply, it gives the ability to predict how future events may turn
out.
Probabilities versus Statistics
• Notation:
• Probability theory deals with models of unpredictable phenomena.
Probability theory is a branch of mathematics concerned with
probability. Probability is a numerical description of the likelihood of
an event.
• Statistics is concerned with collection and representation of data for
practical conclusions.
Statistics is a branch of mathematics that concerns the collection,
organization, displaying, analysis, interpretation and presentation of
data.
• The relationship between those two is that in statistics, we apply
probability (probability theory) to draw conclusions from data.
Example
• Probability example:
You have a fair coin (equal probability of heads or tails). You will toss
it 100 times. What is the probability of 60 or more heads? There is only a single correct answer, obtained by a standard probability computation.
• Statistics example:
You have a coin of unknown provenance. To investigate whether it is
fair you toss it 100 times and count the number of heads. Let’s say
you count 60 heads. Your job as a statistician is to draw a conclusion
(inference) from this data.
In this situation, different statisticians may draw different conclusions because they may frame their conclusions differently or may use different methods for estimating the probability (e.g. of landing heads).
Observed phenomena can be of two types
• Deterministic: happen the same way each time the experiment is repeated, e.g. x(t) = 10 sin(2t).
• Random: do not happen the same way even under the same conditions, e.g. tossing a coin.
Example of random signal
Sinusoid in noise: (Random signal)
y(t) = A cos(wot)+n(t)
• knowing A, wo can’t determine y(to) exactly (as n(t)
changes)
• However with no noise, we can predict y(t) at any time
instant with no error
Example: Defective chips
Consider a box of 500 chips with probability of defective chips:
P(0 defective)=0.02
P(1 defective)=0.11
P(2 defective)=0.16
P(3 defective)=0.21
P(4 defective)=0.13
P(5 defective)=0.08
P(6, 7, 8, …, or 500 defective): unknown
Claim: no more than 5 defective/box
What is the Prob. of claim being right?
P(correct) = P(0 defective) + P(1) + P(2) + P(3) + P(4) + P(5) = 0.71
On average 71% of boxes have no more than 5 defectives.
Decisions can be made based on this information
• Other examples: number of agents serving in a bank, number of TVs in stock, number of chairs for a certain number of people, overbooking in planes, etc.
1.2 Approaches to probability
• Personal
• relative frequency
• equally likely
• axiomatic
• Personal: each one of us has an intuitive notion of probability.
Ex. waiting to be served (can't use this to solve problems!)
• Relative frequency: for a given event A in a random experiment,
P(A) = lim(n→∞) nA/n = (# occurrences of A) / (# trials)
ex. tossing a coin (# heads)
(In many cases we can't repeat experiments many times, ex. hitting a target, nuclear experiments, vaccines, …)
• Equally Likely: Experiments seen as consisting of several outcomes that have
equal chance of occurrence.
Ex. Tossing 2 fair coins. Prob. Of at least 1 head.
Outcomes {H,H H,T T,H T,T}
prob. .25 .25 .25 .25
P(at least 1 head) = .25 + .25 +.25 = .75 (adding favourable outcomes)
Relative Frequency of Occurrence:
• Roll the die n = 100-10,000 times, count them up!
(classroom experiment, histogram).
• Q: can we always repeat an experiment 1000 times? (e.g. a nuclear bomb!)
• Advantages:
• Automatically accounts for non-uniform outcome probabilities.
• Can handle continuous outcomes.
• Disadvantages:
• We don’t have all day! (n << ∞)
• Thus, P[E] is not known exactly!
P[E] = lim(n→∞) nE / n
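A minimal sketch of the relative-frequency idea (illustrative code, not part of the original slides): simulate n fair-coin tosses and watch the fraction of heads approach 1/2 as n grows.

```python
# Illustrative sketch: estimating P(heads) by relative frequency.
# The estimate n_heads / n approaches 0.5 as the number of trials grows.
import random

def relative_frequency_of_heads(n_trials: int, seed: int = 0) -> float:
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n_trials))
    return heads / n_trials

for n in (10, 100, 10_000, 1_000_000):
    print(n, relative_frequency_of_heads(n))
```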
Equally likely approach
• A six sided die is rolled twice. What is the “probability” of the
sum of the two rolls being:
1, 2, 5, 7, or 11?
• Equally likely approach (ratio of favourable outcomes to total outcomes):
P[E] = NE / N
• Count them up, using the table of sums (rows: 1st die, columns: 2nd die):

1st \ 2nd:  1   2   3   4   5   6
1           2   3   4   5   6   7
2           3   4   5   6   7   8
3           4   5   6   7   8   9
4           5   6   7   8   9  10
5           6   7   8   9  10  11
6           7   8   9  10  11  12

P(sum = 1) = 0, P(sum = 2) = 1/36, P(sum = 5) = 4/36, P(sum = 7) = 6/36, P(sum = 11) = 2/36
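The same answers can be checked by brute-force enumeration of the 36 equally likely outcomes; the snippet below is an illustrative sketch, not part of the slides.

```python
# Enumerate the 36 equally likely outcomes of rolling a die twice and
# count the favourable ones for each target sum.
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 7), repeat=2))   # 36 ordered pairs
for target in (1, 2, 5, 7, 11):
    favourable = sum(1 for a, b in outcomes if a + b == target)
    print(f"P(sum = {target:2d}) = {Fraction(favourable, 36)}")
```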
• Advantages:
• Clear, mathematically sound definition.
• Based on model of the events rather than empirical observations
of experimental outcomes.
• Disadvantages:
• Assumes equally likely outcomes
- What about loaded dice? No way to handle it!
• Requires detailed enumeration of all outcomes. What do you do
with continuous data?
Question: how to assign probability in a
consistent and global way?
• How to give a numeric value to the probability of some events?
• Common sense tells us that the probability of getting “Head” in
tossing of a fair coin experiment is 1/2 or 1 out of two
possibilities.
• Also, the probability of getting “two dots” in tossing a fair die is 1/6, or 1 out of six possibilities.
• Solution: Define probability through rules!
1.3 Axiomatic Approach to Probabilities
Instead of defining what probability is, we define the properties that it must satisfy.
Notation:
Random experiment → can generate more than 1 outcome.
Rolled die → 1, 2, 3, 4, 5, or 6 points up.
The Sample Space S of an experiment is the set of all possible outcomes.
Above example: S = {1, 2, 3, 4, 5, 6}
Tossing 2 coins: S = {HH, HT, TH, TT}
Event A: a subset of S which collects outcomes of particular interest.
A → P(A): we assign a value which is a function of A.
For each event A in S, we assign P(A): Prob. of A which must
satisfy: (axioms of probability)
1. P(A) ≥ 0
2. P(S) = 1
3. If A & B are two events in S with no common outcomes
⇒ P(A or B) = P(A) + P(B).
The 3rd axiom is extended to N events:
if Am ∩ An = ∅ for all m ≠ n ⇒ P(∪n An) = Σn P(An).
More formally: Axiomatic Approach to Probabilities
For each event A in S, we assign P(A): the probability of A, which must satisfy:
Axiom 1: represents our desire to work with nonnegative numbers: P(A) ≥ 0.
Axiom 2: the sample space is the event with the highest probability (the certain event): P(S) = 1. The null event is the event with the lowest probability (the impossible event): P(∅) = 0.
Axiom 3: the probability of an event which is the union of mutually exclusive events is equal to the sum of the individual event probabilities:
P(∪(n=1..N) An) = Σ(n=1..N) P(An)  if  Am ∩ An = ∅ for m ≠ n.
1.4 Probabilities using set theory
Based on the axioms above, probability theory lends itself to the language of sets. Probability is defined and calculated for sets.
Set theory is the branch of mathematics which deals with the formal properties of sets as units (groups, objects, etc.), without regard to the nature of their individual constituents.
Intuitively a set is a collection of objects, which are called elements. Although this seems like a simple idea, it has some nice consequences and applications.
Set definition: A set is a collection of objects or elements.
Examples of sets
Some examples of sets: kitchen tools, electronic circuits, school bags, shopping malls, movies, houses, airplanes, chairs, students, etc.
Notation:
Sets are collection of objects (elements).
S = {HH, HT, TH, TT} (tossing two coins)
set of possible outcomes in tossing 2 coins.
i ∈ A: i is an element of set A.
i ∉ A: i is not contained in A.
B ⊂ A: B is a subset of A.
∅: null set, no elements.
S: universal set, the set of all elements.
E = A ∪ B: set of elements in A or B or both (can use the notation A+B).
E = A ∩ B: set of elements in both A and B (A.B).
We can use a Venn diagram for graphical visualisation of sets.
Aᶜ or Ā: the complement of A in S, contains the elements of S not in A.
Disjoint sets: A ∩ B = ∅, no intersection.
Other properties:
A ∪ ∅ = A
A ∪ S = S
A ∩ ∅ = ∅
A ∩ S = A
if B ⊂ A ⇒ A ∩ B = B, A ∪ B = A
[Venn diagram: sets A and B inside S]
If every element of a set A is also an element of another set B, A is said to be contained in B, or A is a subset of B: A ⊆ B.
If at least one element exists in B which is not in A, then A is a proper subset of B, denoted A ⊂ B.
The null set ∅ is a subset of all other sets.
Two sets A and B are mutually exclusive if they have no common elements.
Example: let the sets A and B be given as
A = { 1, 3, 5, 7 } and B = { 2, 4, 6, 8, 10, 12, 14 }
Other properties
A ∪ B = B ∪ A, A ∩ B = B ∩ A (commutativity)
(A ∪ B) ∪ C = A ∪ (B ∪ C), (A ∩ B) ∩ C = A ∩ (B ∩ C) (associativity)
(A ∪ B) ∩ C = (A ∩ C) ∪ (B ∩ C), (A ∩ B) ∪ C = (A ∪ C) ∩ (B ∪ C) (distributivity)
{A1, . . ., AM} form a partition of S
if Ai ∩ Aj = ∅ for i ≠ j and A1 ∪ A2 ∪ . . . ∪ AM = S
Cartesian product C = A x B (all possible pairs (i, j))
ex. A = {h1, t1}, B = {h2, t2}
A x B = {h1h2, h1t2, t1h2, t1t2}
De Morgan’s Laws:
(A ∪ B)ᶜ = Aᶜ ∩ Bᶜ
(A ∩ B)ᶜ = Aᶜ ∪ Bᶜ
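Python's built-in set type implements these operations directly; the small check below uses made-up example sets (not from the slides) and takes complements inside a universal set S.

```python
# Verify De Morgan's laws on a small example using Python sets.
S = set(range(1, 11))          # universal set
A = {1, 3, 5, 7}
B = {2, 4, 6, 8, 10}

assert S - (A | B) == (S - A) & (S - B)   # (A ∪ B)^c = A^c ∩ B^c
assert S - (A & B) == (S - A) | (S - B)   # (A ∩ B)^c = A^c ∪ B^c
print("De Morgan's laws hold for this example")
```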
Difference: consider two sets A and B.
The set A − B is the set containing all elements of A that are not present in B.
Example:
Let A = { 0.6 < a ≤ 1.6 } and B = { 1.0 ≤ b ≤ 2.5 }
Then A − B = { 0.6 < c < 1.0 }
Similarly, B − A = { 1.6 < d ≤ 2.5 } ≠ A − B
Union: the union of two sets A and B, call it C, is written C = A ∪ B (sometimes called the sum of the two sets).
Intersection: the intersection of two sets A and B, call it D, is written D = A ∩ B (sometimes called the product of the two sets).
The union and intersection can be repeated for N sets:
C = A1 ∪ A2 ∪ … ∪ AN = ∪(i=1..N) Ai
D = A1 ∩ A2 ∩ … ∩ AN = ∩(i=1..N) Ai
1.5 Derivation of probabilities from sets
• The idea is to use the concept of sets and add to it the concept of a function (the probability). This function needs to satisfy the axiomatic properties.
• A ∪ Aᶜ = S and A ∩ Aᶜ = ∅
Since A & Aᶜ are disjoint,
P(A ∪ Aᶜ) = P(A) + P(Aᶜ) = P(S) = 1
⇒ P(Aᶜ) = 1 − P(A)
Let A = S ⇒ Aᶜ = ∅ ⇒ P(∅) = 1 − P(S) = 1 − 1 = 0
• Splitting A ∪ B into disjoint pieces:
P(A ∪ B) = P(A) + P(Aᶜ ∩ B)
P(B) = P(A ∩ B) + P(Aᶜ ∩ B)
⇒ P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
Data Analytics Pipeline based on Probabilities
1.6 Conditional Probabilities
In some experiments, we need to know the probability of an event occurring given that (conditional on) another event has occurred.
We denote this conditional probability as P(A/B): the conditional probability of A given B.
We assume that P(B) > 0.
If A ∩ B = ∅ ⇒ P(A/B) = 0.
P(A/B) is defined as P(A/B) = P(A ∩ B)/P(B) (assuming P(B) > 0).
If B ⊂ A ⇒ P(A/B) = 1 (since A ∩ B = B).
If A ⊂ B ⇒ P(A/B) = P(A)/P(B) ≥ P(A).
Does P(A/B) satisfy the 3 axioms (is it a valid probability function)? Yes:
• P(A/B) ≥ 0
• P(S/B) = 1
• If D & E are mutually exclusive,
P(D ∪ E / B) = P((D ∪ E) ∩ B)/P(B)
= (P(D ∩ B) + P(E ∩ B))/P(B)
= P(D/B) + P(E/B)
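As a small sketch (event names and numbers are illustrative, not from the slides), the definition translates directly into code:

```python
# Conditional probability from the definition P(A|B) = P(A ∩ B) / P(B).
def conditional(p_a_and_b: float, p_b: float) -> float:
    if p_b <= 0:
        raise ValueError("P(B) must be positive")
    return p_a_and_b / p_b

# Example: P(A ∩ B) = 0.12 and P(B) = 0.4  ->  P(A|B) = 0.3
print(conditional(0.12, 0.4))
```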
Example
• The probability of selling a mobile on a given normal day may be only 30%.
• But if we consider that the given day falls on a long weekend, the chances of selling a mobile are much higher!
• The conditional probability of selling a mobile on a day, given that we have a long weekend break, might be 40%.
• We can represent these probabilities as:
• P(selling a mobile on a random day) = 30%.
• P(selling a mobile given that today is a long weekend) = 40%.
• So conditional probability helps data scientists get better results from a given data set, and for machine learning engineers it helps build better models for prediction.
Example
• Probability of defaulting on paying the loan based on age group.
• This will help predict how many people may default on paying the loans
when they apply for such loans.
1.7 Independent Events
Within a sample space, we call 2 events independent if the probability of occurrence of one is not affected by the occurrence of the other event.
If A & B are 2 independent events ⇒ P(A/B) = P(A).
However, since P(A/B) = P(A ∩ B)/P(B) ⇒ P(A ∩ B) = P(A)·P(B).
Note: P(A/B) = P(A) ⇔ P(B/A) = P(B) ⇔ P(A ∩ B) = P(A)·P(B).
The independence assumption simplifies statistical problems.
N events A1, …, AN are called statistically independent if
P(Ai ∩ Aj) = P(Ai)·P(Aj)
P(Ai ∩ Aj ∩ Ak) = P(Ai)·P(Aj)·P(Ak)
⋮
P(A1 ∩ A2 ∩ … ∩ AN) = P(A1)·P(A2)·…·P(AN)
are satisfied for all 1 ≤ i < j < … ≤ N.
Other properties:
If A1 & A2 are independent events ⇒ A1 & A2ᶜ are independent, A1ᶜ & A2 are independent, and A1ᶜ & A2ᶜ are independent.
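A quick way to see the product rule is to enumerate a small sample space; the sketch below (an assumed example, not from the slides) checks P(A ∩ B) = P(A)·P(B) for two fair coin flips.

```python
# A = "first flip is H", B = "second flip is H" over the sample space {HH, HT, TH, TT}.
from itertools import product

sample_space = set(product("HT", repeat=2))
A = {s for s in sample_space if s[0] == "H"}
B = {s for s in sample_space if s[1] == "H"}

def p(event):
    return len(event) / len(sample_space)

print(p(A & B), p(A) * p(B))   # both 0.25, so A and B are independent
```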
Independent Events
•You flip a coin and get a heads and you flip a second coin and
get a tails. The two coins don't influence each other.
•The probability of rain today and the probability of a pop quiz;
Non-independent Events
• The probability of snow today and the probability of a pop
quiz;
Snow causes school closings, in which case your teacher
can't give you a pop quiz.
• The chance that you are hungry right now and the chance
that you're eating right now. Obviously one leads to the
other eventually ...
Method to Identify Independent Events
Before applying probability formulas, one needs to identify whether the events are independent.
A few steps for checking whether events are dependent or independent:
Step 1: Check if it is possible for the events to happen in any order. If yes, go to Step 2; otherwise go to Step 3.
Step 2: Check if one event affects the outcome of the other event. If yes, go to Step 4; otherwise go to Step 3.
Step 3: The events are independent. Use the formula for independent events and get the answer.
Step 4: The events are dependent. Use the formula for dependent events and get the answer.
Example: A juggler has seven red, five green, and four blue
balls. During his stunt, he accidentally drops a ball and then
picks it up. As he continues, another ball falls. What is the
probability that the first ball that was dropped is blue, and the
second ball is green?
Since the juggler picks the dropped ball back up, the sample space for both drops contains all 16 balls, and the two events are independent.
The probability that the first ball is blue: P(blue) = 4/16.
The probability that the second ball is green: P(green) = 5/16.
The probability that the first ball is blue and the second ball is green:
P(blue and green) = P(blue) × P(green) = 4/16 × 5/16 = 5/64
Simple examples of dependent events:
• Robbing a bank and going to jail.
• Not paying your power bill on time and having your power cut off.
• Boarding a plane first and finding a good seat.
• Parking illegally and getting a parking ticket. Parking illegally
increases your odds of getting a ticket.
• Driving a car and getting in a traffic accident.
1.8 Total Probabilities
Consider an event B on the sample space S.
Consider also a partition {A1, . . ., AN} of S.
Recall: B = B ∩ S = B ∩ (A1 ∪ . . . ∪ AN)
= (B ∩ A1) ∪ . . . ∪ (B ∩ AN)
⇒ P(B) = P(B ∩ A1) + . . . + P(B ∩ AN)
However, recall P(B ∩ Ai) = P(B/Ai)·P(Ai)
⇒ P(B) = Σ(i=1..N) P(B/Ai)·P(Ai)
Example: 3 boxes of capacitors.
Experiment: randomly select a box, then randomly select a capacitor.

Capacitor \ Box     1     2     3   Total
C1 {0.01 F}        20    95    25     140
C2 {0.1 F}         55    35    75     165
C3 {1.0 F}         70    80   145     295
Total             145   210   245     600

Probability of selecting a 0.01 F capacitor given that box 2 was selected?
If a 0.01 F capacitor is selected, what is the probability that it comes from box 3?
P(C1/B2) = 95/210
P(B3/C1) = P(C1 ∩ B3)/P(C1) = P(C1/B3)·P(B3) / {P(C1/B1)·P(B1) + P(C1/B2)·P(B2) + P(C1/B3)·P(B3)}
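Numerically (a sketch using the counts in the table above; the variable names are mine, and each box is assumed equally likely to be chosen, as stated in the experiment):

```python
# Total probability and Bayes' rule for the capacitor example.
counts = {                 # rows: capacitor type, columns: boxes 1, 2, 3
    "C1": (20, 95, 25),
    "C2": (55, 35, 75),
    "C3": (70, 80, 145),
}
p_box = [1/3, 1/3, 1/3]                                   # box chosen at random
box_totals = [sum(counts[c][i] for c in counts) for i in range(3)]

# P(C1) = sum_i P(C1|Bi) P(Bi)   (total probability)
p_c1 = sum(counts["C1"][i] / box_totals[i] * p_box[i] for i in range(3))
# P(B3|C1) = P(C1|B3) P(B3) / P(C1)   (Bayes' rule)
p_b3_given_c1 = (counts["C1"][2] / box_totals[2]) * p_box[2] / p_c1
print(round(p_c1, 4), round(p_b3_given_c1, 4))            # ≈ 0.2308 and ≈ 0.1474
```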
Example: 2 boxes containing $1 & $2 coins.
Box 1: 8 $1 2 $2
Box 2: 5 $1 20 $2
Box selected at random and a coin is selected.
Prob. Of a $2 coin selected?
Assume P(B1) = P(B2) = 1/2.
P($1/B1) = 8/10, P($2/B1) = 2/10
P($1/B2) = 5/25, P($2/B2) = 20/25
P($2) = P($2/B1).P(B1) + P($2/B2).P(B2)
= 2/10*1/2 + 20/25*1/2
= 1/10 + 2/5 = 1/2.
1.9 Bayes Theorem (Posterior Probability)
How do we calculate the revised probability of Ai given B (bringing in additional information)?
⇒ P(A1/B)? P(A2/B)? … P(AN/B)?
P(Ai/B) = P(Ai ∩ B) / P(B)
= P(B/Ai)·P(Ai) / [P(B/A1)·P(A1) + … + P(B/AN)·P(AN)]
= P(B/Ai)·P(Ai) / Σ(j=1..N) P(B/Aj)·P(Aj)
Example: a mid-term exam in DAT203, prob. of student
studying = .7.
For those who study, prob. of passing = 0.9
For those who don’t study, prob. of passing = .05
Given student did not pass, what is prob. that he studied.
Event studied : ST, Event Passed: PA
P(ST/PAᶜ) = P(PAᶜ/ST)·P(ST) / [P(PAᶜ/ST)·P(ST) + P(PAᶜ/STᶜ)·P(STᶜ)]
= (0.1 × 0.7) / (0.1 × 0.7 + 0.95 × 0.3) ≈ 0.197
Example Corona Test
• Suppose that a test for COVID is 97% sensitive and 95% specific. That is, the test will
produce 97% true positive results for COVID patients and 95% true negative results for non-
COVID people.
• These are the pieces of data that any screening test will have from their history of tests.
Bayes’ rule allows us to use this kind of data-driven knowledge to calculate the final
probability. What is the probability that a randomly selected individual with a positive test
is COVID patient?
• P(TP/C) = 0.97 is the sensitivity (TP: Test Positive)
• P(TN/Cᶜ) = 0.95 is the specificity (TN: Test Negative)
• P(C) = 0.005 is the prevalence
• P(C/TP) = P(TP/C)·P(C)/P(TP) ≈ 0.089: only an 8.9% chance that someone has COVID even when the test is positive.
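The same calculation in code (a sketch using the slide's numbers; only the variable names are mine):

```python
# Bayes' rule for the COVID screening example.
sensitivity = 0.97    # P(TP | C)
specificity = 0.95    # P(TN | not C)
prevalence  = 0.005   # P(C)

p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
p_c_given_positive = sensitivity * prevalence / p_positive
print(round(p_c_given_positive, 3))   # ≈ 0.089, i.e. about 8.9%
```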
Bayes Theorem: Example
At AU: 30% of students are freshmen (Fr), of whom 10% own a car;
40% are sophomores (So), of whom 20% own a car;
20% are juniors (Ju), of whom 40% own a car;
10% are seniors (Se), of whom 60% own a car.
If a student has a car (event C), what is the probability that he is a junior?
P(C) = P(C/Fr)·P(Fr) + P(C/So)·P(So) + P(C/Ju)·P(Ju) + P(C/Se)·P(Se)
= 0.1×0.3 + 0.2×0.4 + 0.4×0.2 + 0.6×0.1 = 0.25 → 25%
P(Ju/C) = P(C/Ju)·P(Ju) / P(C) = (0.4×0.2) / 0.25 = 0.32
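The same total-probability and Bayes computation as a short sketch (class shares and car-ownership rates from the slide; variable names are mine):

```python
share = {"Fr": 0.30, "So": 0.40, "Ju": 0.20, "Se": 0.10}   # P(class)
p_car = {"Fr": 0.10, "So": 0.20, "Ju": 0.40, "Se": 0.60}   # P(car | class)

p_c = sum(p_car[k] * share[k] for k in share)              # total probability
p_ju_given_c = p_car["Ju"] * share["Ju"] / p_c             # Bayes' rule
print(round(p_c, 2), round(p_ju_given_c, 2))               # 0.25 and 0.32
```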
1.10 Multiple Random Experiments
• Most experiments considered till now assume outcomes from a single experiment.
• In some applications, we may need to combine experiments at the same time or in a sequence.
• Example: measuring pressure and wind at a location:
S1 = {L, M, H} and S2 = {NW, LW, MW, FW}
Combined sample space S = S1 × S2 = {(L, NW), (L, LW), …, (H, FW)}
• Example: flipping a coin and tossing a die: S = {H1, H2, H3, …, T6}
Repeating Experiments in Sequence
When repeating the same experiment in sequence, the outcome of the first may or may not affect the outcome of the second.
Example: 10 balls, 5 R, 2 G, 3 B. P(R) = 5/10 = 0.5.
When we do the experiment again, there are 2 scenarios:
• Scenario 1: assume we got a R ball and we put it back (with replacement) → P(R) = 5/10.
• Scenario 2: assume we got a R ball and we keep it out (without replacement) → P(R) = 4/9.
2 Issues are important in combined experiments
• Whether or not order is important?
• Whether or not we have a replacement?
1.11 Permutations and Combinations
Fundamental Counting Principle
The Fundamental Counting Principle can be used to determine the number of possible outcomes when there are two or more characteristics.
It states that if an event has m possible outcomes and another independent event has n possible outcomes, then there are m × n possible outcomes for the two events together.
Fundamental Counting Principle
Let's start with a simple example.
A student is to roll a die and flip a coin.
How many possible outcomes will there be?
1H 2H 3H 4H 5H 6H
1T 2T 3T 4T 5T 6T
12 outcomes
6*2 = 12 outcomes
Fundamental Counting Principle
For a college interview, Robert Ahmed is to choose what to wear from the following: 4
pants, 3 shirts, 2 shoes and 5 ties. How many
possible outfits does he have to choose
from?
4*3*2*5 = 120 outfits
Permutations
A Permutation is an arrangement
of items in a particular order.
Notice, Order Important!
To find the number of Permutations of
n items, we can use the Fundamental
Counting Principle or factorial notation.
Permutations
The number of ways to arrange the letters ABC: ___ ___ ___
Number of choices for the first blank: 3
Number of choices for the second blank: 2
Number of choices for the third blank: 1
3 × 2 × 1 = 6, i.e. 3! = 3·2·1 = 6
ABC ACB BAC BCA CAB CBA
Permutations
A combination lock will open when the right choice of three numbers (from 1 to 30, inclusive) is selected. How many different lock combinations are possible, assuming no number is repeated?
Practice:
30P3 = 30! / (30 − 3)! = 30! / 27! = 30 × 29 × 28 = 24360
Combinations
A Combination is an arrangement
of items in which order does not
matter.
ORDER DOES NOT MATTER!
Since the order does not matter in
combinations, there are fewer combinations
than permutations. The combinations are a
"subset" of the permutations.
Combinations
To find the number of Combinations of n items chosen r at a time, you can use the formula
nCr = n! / (r! (n − r)!),  where 0 ≤ r ≤ n.
Example:
5C3 = 5! / (3! (5 − 3)!) = (5·4·3·2·1) / ((3·2·1)·(2·1)) = 20/2 = 10
Consider repeating an experiment without replacement
• Example: a box of 3 balls numbered 1, 2, and 3.
Select 2 balls with no replacement: {12, 13, 21, 23, 31, 32}
The 1st ball has 3 choices but the 2nd ball has only 2 choices.
Rule: ordering r elements taken from n elements:
Number of possibilities = n × (n − 1) × … × (n − r + 1) = n! / (n − r)! = nPr
NOTE: here we have the case of: with order, no replacement.
If order is not important and there is no replacement, we have combinations.
⇒ Combination experiment without replacement
Rule: number of combinations of r objects from n = nCr = n! / (r! (n − r)!)
Permutations and Combinations
Example: choosing 2 out of 4 with no order → 4! / (2! 2!) = 6 possibilities:
{12, 13, 14, 23, 24, 34}
wherein, e.g., 2,1 is counted the same as 1,2 (combinations, not permutations).
• Consider the case of: with order and with replacement.
• Consider n objects:
• 1st object → n possibilities
• 2nd object → n possibilities
• …
• rth object → n possibilities
• n × n × … × n (r times) = nʳ ways of selecting r from n with order, with replacement.
• In case the order is not important → combinations with replacement (no order, with replacement):
(n + r − 1)! / (r! (n − 1)!) possible combinations
Summary
Permutation (with order), no replacement: n! / (n − r)!
Combination (no order), no replacement: n! / (r! (n − r)!)
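Python's standard library exposes these counting formulas directly (math.perm and math.comb, available since Python 3.8); the snippet below is an illustrative summary, with the with-replacement cases computed from the formulas above.

```python
import math

print(math.perm(30, 3))          # with order, no replacement: 30!/(30-3)! = 24360
print(math.comb(5, 3))           # no order, no replacement: 5!/(3!2!) = 10
print(6 ** 2)                    # with order, with replacement: n**r = 36
print(math.comb(6 + 2 - 1, 2))   # no order, with replacement: (n+r-1)!/(r!(n-1)!) = 21
```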
Summary Examples
Example
12 students are available and a team of 4 is to be selected from those 12; find the number of ways this team can be selected.
Combination or permutation? Combination, as the order is not important.
With replacement or without replacement? Without replacement: once selected, that is it.
12! / (4! 8!) = (12 × 11 × 10 × 9) / (4 × 3 × 2 × 1) = 495
Example
A student needs to answer 8 questions out of 10; find the number of ways to choose the questions if he must answer the first 3.
First 3 already selected → the remaining choice is 5 from 7: a combination, no replacement.
7! / (5! × 2!) = 21
• Find the number of committees of 5 with 1 chairman that can be selected from 12 people.
• 1st (chair): 12 possibilities
• Remaining 4 from 11 → combination (no order), without replacement:
11! / (4! 7!) = (11 × 10 × 9 × 8) / (4 × 3 × 2 × 1) = 330
• N = 12 × 330 = 3960
• Find the number of permutations of the letters in:
THOSE, UNUSUAL, SOCIOLOGICAL
• THOSE: 5 distinct letters → 5!/(5 − 5)! = 5! = 120 permutations
• UNUSUAL: n = 7 letters with U repeated 3 times → 7!/3! = 840
• SOCIOLOGICAL: 12 letters (S×1, O×3, C×2, I×2, L×2, G×1, A×1) →
12! / (1! 3! 2! 2! 2! 1! 1!)
Summary Examples
• Example: probability that n people have distinct birthdays (n ≤ 365).
• n people, 365 days → 365ⁿ ways the n people can have birthdays.
• If the n people have different birthdays:
• 1st person → 365 possible days
• 2nd person → 365 − 1 possible days, and so on.
• Probability that the n people all have different birthdays = 365·364·…·(365 − n + 1) / 365ⁿ
• Probability that 2 or more share a birthday: P = 1 − 365·364·…·(365 − n + 1) / 365ⁿ

n:  10    20    22    23    30    60
p: .117  .411  .476  .507  .706  .994

• In a class of 23 students it is more likely than not to have 2 or more sharing the same birthday!
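A short sketch of the exact birthday computation (not from the slides, but it reproduces the table above):

```python
def p_shared_birthday(n: int) -> float:
    """P(two or more of n people share a birthday), assuming 365 equally likely days."""
    p_distinct = 1.0
    for k in range(n):
        p_distinct *= (365 - k) / 365
    return 1 - p_distinct

for n in (10, 20, 22, 23, 30, 60):
    print(n, round(p_shared_birthday(n), 3))
```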
1.12 Basic Models of Repeated Random Experiments
Independent
Non-Independent
Basic examples of repeated experiments:
A. Independent Sub-Experiments:
When the outcome of one experiment does not affect what happens in the next trial, we say the experiments are independent.
Example: tossing 2 fair coins.
However, assume we have 2 coins (one fair, the second biased with, e.g., P(H) = 1/4).
We first choose a coin at random; if the outcome is H, we use the fair coin for the next trial; if T, we use the biased coin for the next trial. The trials are then no longer independent.
For independent trials: P(A) = P(A1 ∩ A2 ∩ … ∩ Am) = P(A1)·P(A2)·…·P(Am)
For dependent trials: P(A) = P(Am/Am−1 … A1)·P(Am−1/Am−2 … A1)·…·P(A2/A1)·P(A1)
P(Am) may depend on Am−1, …, A1 or only on Am−1.
(Ai: the event describing trial i, i.e. sub-experiment i.)
Binomial Probability Law
• When we generalize the Bernoulli trial (a single random trial with two possible outcomes) to N independent trials, we obtain the Binomial Probability Law.
• The aim, in this case, is usually to find the probability of k wins in N trials (or k successes, yes's, H's, 1's, …).
• Consider 2 wins in 3 trials: {WWL, WLW, LWW}
• This can be seen as a combination without replacement: n = 3, r = 2.
• Each sequence has probability P·P·(1 − P) = P²(1 − P), P being the probability of a win.
• Since all the events above are mutually exclusive:
P(2 wins) = 3·P²(1 − P) = [3! / (2! (3 − 2)!)]·P²(1 − P)
• The general expression for the probability of k wins in N independent trials with probability of success P is:
P(k) = C(N, k)·Pᵏ(1 − P)^(N−k) = [N! / (k! (N − k)!)]·Pᵏ(1 − P)^(N−k)
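A minimal sketch of the binomial law (the win probability 0.6 below is illustrative, not from the slides):

```python
import math

def binomial_pmf(k: int, n: int, p: float) -> float:
    """P(k successes in n independent trials with success probability p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

print(binomial_pmf(2, 3, 0.6))   # 3 * 0.6**2 * 0.4 = 0.432
```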
Multinomial Probability Law: Generalization of the Binomial (not covered in detail)
• The Bernoulli experiment can be extended to the case of more than 2 possible outcomes, repeated N times.
• Example: 3 possible outcomes with probabilities P1, P2, P3, repeated N = 6 times. A possible count is (k1 = 2, k2 = 3, k3 = 1), e.g. the sequence 2 2 1 3 1 2, with probability P1²·P2³·P3.
• Considering combinations without replacement, the number of such sequences is N! / (k1! k2! k3!)
• ⇒ P(k1, k2, k3) = [6! / (k1! k2! k3!)]·P1^k1·P2^k2·P3^k3
• For N trials with M possible outcomes, the probability of the counts k1, k2, …, kM is:
P(k1, …, kM) = [N! / (k1!·…·kM!)]·P1^k1·…·PM^kM,  with k1 + k2 + … + kM = N
• Example: a box with 5 red balls, 10 blue, 3 green (P1 = 5/18, P2 = 10/18, P3 = 3/18).
• Probability of selecting 2 R, 3 B and 1 G in N = 6 draws.
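A sketch of the multinomial law applied to the box example above (assuming the 6 draws are made with replacement so the trials are independent; the function name is mine):

```python
import math

def multinomial_pmf(counts, probs):
    """P(observing the given counts in sum(counts) independent trials)."""
    n = sum(counts)
    coeff = math.factorial(n)
    for k in counts:
        coeff //= math.factorial(k)
    p = float(coeff)
    for k, q in zip(counts, probs):
        p *= q ** k
    return p

# 2 red, 3 blue, 1 green out of N = 6 draws, with P = (5/18, 10/18, 3/18)
print(round(multinomial_pmf([2, 3, 1], [5/18, 10/18, 3/18]), 4))   # ≈ 0.1323
```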
Geometric Probability Law
• Another probability law related to the Bernoulli sequence concerns the appearance of the first success.
• Now, we fix the number of successes and count the number of trials.
• Event: f f … f s  (k − 1 failures followed by the first success)
• The probability of the first success occurring at trial k is:
P(k) = (1 − P)^(k−1)·P,  k = 1, 2, …
• Example: the probability of making a phone call successfully is P = 0.9.
Probability that the first successful call occurs at the k-th trial:
k:    1     2         3        …
P(k): 0.9   0.1×0.9   0.1²×0.9   (decreasing: each additional condition reduces the probability)
NOTE: the geometric law can be extended to the k-th success and finding the number of trials needed.
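A minimal sketch of the geometric law using the phone-call example (P = 0.9 from the slide):

```python
def geometric_pmf(k: int, p: float) -> float:
    """P(first success occurs at trial k) = (1-p)**(k-1) * p."""
    return (1 - p) ** (k - 1) * p

for k in (1, 2, 3, 4):
    print(k, geometric_pmf(k, 0.9))   # 0.9, 0.09, 0.009, ...
```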
B. Non-Independent Trials
Past outcomes affect the present outcome (the past can be one or more trials back). Common models used are based on Markov Chains.

More Related Content

Similar to Introduction to probabilities and radom variables

Module-2_Notes-with-Example for data science
Module-2_Notes-with-Example for data scienceModule-2_Notes-with-Example for data science
Module-2_Notes-with-Example for data sciencepujashri1975
 
SAMPLE SPACES and PROBABILITY (3).pptx
SAMPLE SPACES and PROBABILITY (3).pptxSAMPLE SPACES and PROBABILITY (3).pptx
SAMPLE SPACES and PROBABILITY (3).pptxvictormiralles2
 
chapter five.pptx
chapter five.pptxchapter five.pptx
chapter five.pptxAbebeNega
 
Topic 1 __basic_probability_concepts
Topic 1 __basic_probability_conceptsTopic 1 __basic_probability_concepts
Topic 1 __basic_probability_conceptsMaleakhi Agung Wijaya
 
Random Variable & Probability Distribution 1.pptx
Random Variable & Probability Distribution 1.pptxRandom Variable & Probability Distribution 1.pptx
Random Variable & Probability Distribution 1.pptxJAYARSOCIAS3
 
advanced_statistics.pdf
advanced_statistics.pdfadvanced_statistics.pdf
advanced_statistics.pdfGerryMakilan2
 
Statistical thinking
Statistical thinkingStatistical thinking
Statistical thinkingmij1120
 
Lecture 4 - probability distributions (2).pptx
Lecture 4 - probability distributions (2).pptxLecture 4 - probability distributions (2).pptx
Lecture 4 - probability distributions (2).pptxSinimol Aniyankunju
 
Basic statistics for algorithmic trading
Basic statistics for algorithmic tradingBasic statistics for algorithmic trading
Basic statistics for algorithmic tradingQuantInsti
 
Linear algebra and probability (Deep Learning chapter 2&3)
Linear algebra and probability (Deep Learning chapter 2&3)Linear algebra and probability (Deep Learning chapter 2&3)
Linear algebra and probability (Deep Learning chapter 2&3)Yan Xu
 
Game theory for neural networks
Game theory for neural networksGame theory for neural networks
Game theory for neural networksDavid Balduzzi
 
Statistics by DURGESH JHARIYA OF jnv,bn,jbp
Statistics by DURGESH JHARIYA OF jnv,bn,jbpStatistics by DURGESH JHARIYA OF jnv,bn,jbp
Statistics by DURGESH JHARIYA OF jnv,bn,jbpDJJNV
 

Similar to Introduction to probabilities and radom variables (20)

Machine learning mathematicals.pdf
Machine learning mathematicals.pdfMachine learning mathematicals.pdf
Machine learning mathematicals.pdf
 
Module-2_Notes-with-Example for data science
Module-2_Notes-with-Example for data scienceModule-2_Notes-with-Example for data science
Module-2_Notes-with-Example for data science
 
Lec13_Bayes.pptx
Lec13_Bayes.pptxLec13_Bayes.pptx
Lec13_Bayes.pptx
 
SAMPLE SPACES and PROBABILITY (3).pptx
SAMPLE SPACES and PROBABILITY (3).pptxSAMPLE SPACES and PROBABILITY (3).pptx
SAMPLE SPACES and PROBABILITY (3).pptx
 
U uni 6 ssb
U uni 6 ssbU uni 6 ssb
U uni 6 ssb
 
chapter five.pptx
chapter five.pptxchapter five.pptx
chapter five.pptx
 
Topic 1 __basic_probability_concepts
Topic 1 __basic_probability_conceptsTopic 1 __basic_probability_concepts
Topic 1 __basic_probability_concepts
 
Random Variable & Probability Distribution 1.pptx
Random Variable & Probability Distribution 1.pptxRandom Variable & Probability Distribution 1.pptx
Random Variable & Probability Distribution 1.pptx
 
advanced_statistics.pdf
advanced_statistics.pdfadvanced_statistics.pdf
advanced_statistics.pdf
 
Statistical thinking
Statistical thinkingStatistical thinking
Statistical thinking
 
Chapter7ppt.pdf
Chapter7ppt.pdfChapter7ppt.pdf
Chapter7ppt.pdf
 
Day 3.pptx
Day 3.pptxDay 3.pptx
Day 3.pptx
 
DEFINE PROBABILITY.pptx
DEFINE PROBABILITY.pptxDEFINE PROBABILITY.pptx
DEFINE PROBABILITY.pptx
 
Lecture 4 - probability distributions (2).pptx
Lecture 4 - probability distributions (2).pptxLecture 4 - probability distributions (2).pptx
Lecture 4 - probability distributions (2).pptx
 
Basic statistics for algorithmic trading
Basic statistics for algorithmic tradingBasic statistics for algorithmic trading
Basic statistics for algorithmic trading
 
Probability
ProbabilityProbability
Probability
 
Linear algebra and probability (Deep Learning chapter 2&3)
Linear algebra and probability (Deep Learning chapter 2&3)Linear algebra and probability (Deep Learning chapter 2&3)
Linear algebra and probability (Deep Learning chapter 2&3)
 
Prob
ProbProb
Prob
 
Game theory for neural networks
Game theory for neural networksGame theory for neural networks
Game theory for neural networks
 
Statistics by DURGESH JHARIYA OF jnv,bn,jbp
Statistics by DURGESH JHARIYA OF jnv,bn,jbpStatistics by DURGESH JHARIYA OF jnv,bn,jbp
Statistics by DURGESH JHARIYA OF jnv,bn,jbp
 

Recently uploaded

(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Serviceranjana rawat
 
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLSMANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLSSIVASHANKAR N
 
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...Dr.Costas Sachpazis
 
The Most Attractive Pune Call Girls Manchar 8250192130 Will You Miss This Cha...
The Most Attractive Pune Call Girls Manchar 8250192130 Will You Miss This Cha...The Most Attractive Pune Call Girls Manchar 8250192130 Will You Miss This Cha...
The Most Attractive Pune Call Girls Manchar 8250192130 Will You Miss This Cha...ranjana rawat
 
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCollege Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCall Girls in Nagpur High Profile
 
Microscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptxMicroscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptxpurnimasatapathy1234
 
KubeKraft presentation @CloudNativeHooghly
KubeKraft presentation @CloudNativeHooghlyKubeKraft presentation @CloudNativeHooghly
KubeKraft presentation @CloudNativeHooghlysanyuktamishra911
 
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur EscortsCall Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
AKTU Computer Networks notes --- Unit 3.pdf
AKTU Computer Networks notes ---  Unit 3.pdfAKTU Computer Networks notes ---  Unit 3.pdf
AKTU Computer Networks notes --- Unit 3.pdfankushspencer015
 
Processing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptxProcessing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptxpranjaldaimarysona
 
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete RecordCCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete RecordAsst.prof M.Gokilavani
 
Java Programming :Event Handling(Types of Events)
Java Programming :Event Handling(Types of Events)Java Programming :Event Handling(Types of Events)
Java Programming :Event Handling(Types of Events)simmis5
 
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...ranjana rawat
 
UNIT - IV - Air Compressors and its Performance
UNIT - IV - Air Compressors and its PerformanceUNIT - IV - Air Compressors and its Performance
UNIT - IV - Air Compressors and its Performancesivaprakash250
 
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escortsranjana rawat
 
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSAPPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSKurinjimalarL3
 
Extrusion Processes and Their Limitations
Extrusion Processes and Their LimitationsExtrusion Processes and Their Limitations
Extrusion Processes and Their Limitations120cr0395
 
Top Rated Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...
Top Rated  Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...Top Rated  Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...
Top Rated Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...Call Girls in Nagpur High Profile
 
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...Dr.Costas Sachpazis
 

Recently uploaded (20)

(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service
 
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
 
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLSMANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
 
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
 
The Most Attractive Pune Call Girls Manchar 8250192130 Will You Miss This Cha...
The Most Attractive Pune Call Girls Manchar 8250192130 Will You Miss This Cha...The Most Attractive Pune Call Girls Manchar 8250192130 Will You Miss This Cha...
The Most Attractive Pune Call Girls Manchar 8250192130 Will You Miss This Cha...
 
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCollege Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
 
Microscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptxMicroscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptx
 
KubeKraft presentation @CloudNativeHooghly
KubeKraft presentation @CloudNativeHooghlyKubeKraft presentation @CloudNativeHooghly
KubeKraft presentation @CloudNativeHooghly
 
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur EscortsCall Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
 
AKTU Computer Networks notes --- Unit 3.pdf
AKTU Computer Networks notes ---  Unit 3.pdfAKTU Computer Networks notes ---  Unit 3.pdf
AKTU Computer Networks notes --- Unit 3.pdf
 
Processing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptxProcessing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptx
 
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete RecordCCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
 
Java Programming :Event Handling(Types of Events)
Java Programming :Event Handling(Types of Events)Java Programming :Event Handling(Types of Events)
Java Programming :Event Handling(Types of Events)
 
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
 
UNIT - IV - Air Compressors and its Performance
UNIT - IV - Air Compressors and its PerformanceUNIT - IV - Air Compressors and its Performance
UNIT - IV - Air Compressors and its Performance
 
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
 
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSAPPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
 
Extrusion Processes and Their Limitations
Extrusion Processes and Their LimitationsExtrusion Processes and Their Limitations
Extrusion Processes and Their Limitations
 
Top Rated Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...
Top Rated  Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...Top Rated  Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...
Top Rated Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...
 
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
 

Introduction to probabilities and radom variables

  • 1. DAT203 Probability Theory and its Applications Prof. Mohamed Deriche Professor of AI/ML
  • 2. Part 1: Introduction and Fundamentals • Course overview • Logistics and admin matters • Motivation and coverage • Introduction to probabilities • Approaches to probabilities • Axiomatic approach • Probabilities through sets. • Multiple random experiments • Permutations and combinations • Basic models for repeated experiments (Bernoulli, Binomial, Multinomial and Geometric Probability Laws) • Examples
  • 3. Motivation • The world is not ideal nor deterministic(can you guess the whether tomorrow or a score of a game of whether the dollar goes up of down?) • Random signals (values) appear in various applications: e.g, effect of noise (voice through wireless channels), reading measurements, maximising chance for something to happen, predicting weather, predicting faults, etc... • One way to deal with such phenomena is to leave it to chance! • Or we need mathematical tools to analyse such signals and extract useful information so we can make better and more useful decisions. • Ultimately, we need to use data efficiently and extract useful information and finally make “good” decision (e.g share market)
  • 4. Why study probabilities? • Life is unpredictable. From guessing whether or not a given day will be sunny or rainy to predicting the likelihood of when a deadly disease may come • Such uncertainties make all of us very nervous about the future. • Fortunately, probability theory comes to save us. Essentially, probability is a branch of mathematics which predicts through calculations the various outcomes of a given event. • Simply, it gives the ability to predict how future events may turn out.
  • 5. What are probabilities? • Consider the experiment of throwing a coin, what is randomness and probability? • "randomness" is a way of expressing what we don't know. (forces, orientation, surface smoothness, etc..) • When we say that something is random, we are saying our knowledge about outcome is limited, so we can't be certain what will happen. • Since coin is fair, The outcome is random. If we don't know anything about how it was flipped, probability that it will be head is 50%, or 1/2. What exactly do we mean?
  • 6. What are probabilities? • Since coin is fair, if we don't know anything about how it was flipped, probability that it will be head is 50%, or 1/2. What exactly do we mean? • One interpretation is in terms of relative frequency? (if experiment is repeated many times) • Second interpretation of probability is that it quantifies our degree of subjective belief that something will happen Probability (branch of math) is a measure of the likelihood of an event to occur. Many events cannot be predicted with total certainty. We can predict only the chance of an event to occur i.e. how likely they are to happen
  • 7. Example: communication system • Because of noise, we cannot confirm that a 1 or 0 was sent • Like throwing a coin, and the received guesses what was sent • Knowing the probabilistic nature of the noise, can help us in performing a good guess on what was sent
  • 8. Examples of Real Life Probability •Planning around the Weather •Sports Strategies •Insurance Options •Business decisions •Games and Recreational Activities •Researching a Disease •Space Exploration and Risk Assessment •Telecommunications •Failure and reliability analysis
  • 9. Course Objectives (DAT 203) •This course introduces you to Probability theory, Random variables and Random processes. •It provides fundamentals of theory. •Courses focuses on the application of different probability models to solve practical real work problems in data anlytics
  • 10. Course Learning Outcomes a. Apply the principles and techniques of probability theory to solve data analytics problems. a. Apply the concepts of discrete and continuous probability distributions. a. Derive and use the probability density function of transformations of random variables a. Use methods from algebra and calculus to derive the moments of random variable and random vectors a. Use the concepts of continuous and discrete-time random processes. a. Apply selected probability distributions to solve problems in random processes
  • 11. Coverage • Introduction to probabilities • Random variables • Discrete and continuous probability distributions • Operation on random variables • Multiple random variables • Some important laws of large numbers • Random Processes • Applications
  • 12. Assessment Tools • Class tests, • Mid-Term, • Final, • Homework, • Project
  • 13. Part 1 Introduction to Probability Models
  • 14. 1.1 Introduction • The world is not ideal nor deterministic • Random signals (values) appear in various applications: e.g, effect of noise, reading measurements, maximising chance for something to happen, predicting weather, predicting faults, etc... • One way to deal with such phenomena is to leave it to chance! • Or we need mathematical tools to analyse such signals and extract useful information so we can make leaner and useful decision. • Ultimately, we need to use data efficiently and extract useful information and finally make “good” decision (e.g share market)
  • 15. Why study probabilities? • Life is unpredictable. From guessing whether or not a given day will be sunny or rainy to predicting the likelihood of when a deadly disease may come • Such uncertainties make all of us very nervous about the future. • Fortunately, probability theory comes to save us. Essentially, probability is a branch of mathematics which predicts through calculations the various outcomes of a given event. • Simply, it gives the ability to predict how future events may turn out.
  • 16. Probabilities versus Statistics • Notation: • Probability theory deals with models of unpredictable phenomena. Probability theory is a branch of mathematics concerned with probability. Probability is a numerical description of the likelihood of an event. • Statistics is concerned with collection and representation of data for practical conclusions. Statistics is a branch of mathematics that concerns the collection, organization, displaying, analysis, interpretation and presentation of data. • The relationship between those two is that in statistics, we apply probability (probability theory) to draw conclusions from data.
  • 17. Example • Probability example: You have a fair coin (equal probability of heads or tails). You will toss it 100 times. What is the probability of 60 or more heads? We can get only a single answer because of the standard computation strategy. • Statistics example: You have a coin of unknown provenance. To investigate whether it is fair you toss it 100 times and count the number of heads. Let’s say you count 60 heads. Your job as a statistician is to draw a conclusion (inference) from this data. In this situation, different Statisticians may draw different conclusions because they may use different conclusion forms or may use different methods for predicting the probability(e.g. of landing heads).
  • 18. Occurring Phenomena can be of two types • Deterministic: Happen same way each time experiment repeated. Eg, x(t)=10sin(2t). • Random: Do not happen same way even with same conditions. Eg, tossing a coin.
  • 19. Example of random signal Sinusoid in noise: (Random signal) y(t) = A cos(wot)+n(t) • knowing A, wo can’t determine y(to) exactly (as n(t) changes) • However with no noise, we can predict y(t) at any time instant with no error
  • 20. Example: Defective chips Consider a box of 500 chips with probability of defective chips: P(0 defective)=0.02 P(1 defective)=0.11 P(2 defective)=0.16 P(3 defective)=0.21 P(4 defective)=0.13 P(5 defective)=0.08 P(6, or 7, or 8, . . Or 500) unknown Claim: no more than 5 defective/box What is the Prob. of claim being right? P(correct) = P(0 defective) + P(1) + P(2) + P(3) + P(4) + P(5) = 0.71 On average 71% of boxes have no more than 5 defectives. Decisions can be made based on this information • Other examples: number of agents serving in bank, number of TVs in stock,, Number of chairs for a certain number of people, overbooking in planes, etc….
  • 21. 1.2 Approaches to probability • Personal • relative frequency • equally likely • axiomatic • Personal: each one of us has an intuitive notion of probability. Ex. Waiting to be served (cant use this to solve problems!) • Relative frequency: For a given event A in a random experiment P(A) = lim nA/n = # occurrence of A / # trials ex. Tossing a coin (# heads) (In may cases cant repeat experiments many times, ex. Hitting a target, nuclear experiments vaccine, …..) • Equally Likely: Experiments seen as consisting of several outcomes that have equal chance of occurrence. Ex. Tossing 2 fair coins. Prob. Of at least 1 head. Outcomes {H,H H,T T,H T,T} prob. .25 .25 .25 .25 P(at least 1 head) = .25 + .25 +.25 = .75 (adding favourable outcomes)
  • 22. Relative Frequency of Occurrence: • Roll the die n = 100-10,000 times, count them up! (classroom experiment, histogram). • Q: can we always repeat an experiment 1000 time (eg. (nuclear bomb! • Advantages: • Automatically accounts for non-uniform outcome probabilities. • Can handle continuous outcomes. • Disadvantages: • We don’t have all day! (n << ∞) • Thus, P[E] is not known exactly! n n E P E n    lim ] [
  • 23. Equally likely approach • A six sided die is rolled twice. What is the “probability” of the sum of the two rolls being: 1, 2, 5, 7, or 11? • equally likely , (ratio of total to favorable outcomes) • • Count them up N N E P E  ] [ 2nd 1st die die 1 2 3 4 5 6 1 2 3 4 5 6 7 2 3 4 5 6 7 8 3 4 5 6 7 8 9 4 5 6 7 8 9 10 5 6 7 8 9 10 11 6 7 8 9 10 11 12 2/36 6/36 1/36 4/36
  • 24. • Advantages: • Clear, mathematically sound definition. • Based on model of the events rather than empirical observations of experimental outcomes. • Disadvantages: • Assumes equally likely outcomes - What about loaded dice? No way to handle it! • Requires detailed enumeration of all outcomes. What do you do with continuous data?
• 25. Question: how do we assign probability in a consistent and global way? • How do we give a numeric value to the probability of some events? • Common sense tells us that the probability of getting "Head" when tossing a fair coin is 1/2, or 1 out of two possibilities. • Likewise, the probability of getting "Two dots" when tossing a fair die is 1/6, or 1 out of six possibilities. • Solution: define probability through rules!
• 26. 1.3 Axiomatic Approach to Probabilities Instead of defining what probability is, we define the properties that it must satisfy. Notation: a random experiment can generate more than one outcome, e.g. a rolled die can land with 1, 2, 3, 4, 5, or 6 points up. The sample space S of an experiment is the set of all possible outcomes. Above example: S = {1, 2, 3, 4, 5, 6}; tossing 2 coins: S = {HH, HT, TH, TT}. An event A is a subset of S which collects outcomes of particular interest. To each event A we assign a value P(A), which is a function of A.
• 27. For each event A in S, we assign P(A), the probability of A, which must satisfy the axioms of probability: 1. P(A) ≥ 0 2. P(S) = 1 3. If A and B are two events in S with no common outcomes, then P(A or B) = P(A) + P(B). The 3rd axiom extends to N events: if Am ∩ An = ∅ for all m ≠ n, then P(∪n An) = Σn P(An).
• 28. More formally: Axiomatic Approach to Probabilities For each event A in S, we assign P(A), the probability of A, which must satisfy: Axiom 1: P(A) ≥ 0 (represents our desire to work with nonnegative numbers). Axiom 2: P(S) = 1 (the sample space is the event with the highest probability, the certain event); consequently P(∅) = 0 (the null event is the event with the lowest probability, the impossible event). Axiom 3: the probability of an event which is the union of mutually exclusive events equals the sum of the individual event probabilities: P(∪_{n=1}^{N} An) = Σ_{n=1}^{N} P(An) if Am ∩ An = ∅ for m ≠ n.
• 29. 1.4 Probabilities using set theory Based on the axioms above, probability theory lends itself naturally to the language of sets: probabilities are defined and calculated for sets. Set theory is the branch of mathematics which deals with the formal properties of sets as units (groups, objects, etc.), without regard to the nature of their individual constituents. Intuitively, a set is a collection of objects, which are called elements. Although this seems like a simple idea, it has some very useful consequences and applications. Set definition: a set is a collection of objects or elements.
• 30. Examples of sets: kitchen tools, electronic circuits, school bags, shopping malls, movies, houses, airplanes, chairs, students, etc.
• 33. Notation: Sets are collections of objects (elements). S = {HH, HT, TH, TT} (tossing two coins): the set of possible outcomes when tossing 2 coins. i ∈ A: i is an element of set A. i ∉ A: i is not contained in A. B ⊂ A: B is a subset of A. ∅: the null set, containing no elements. S: the universal set, the set of all elements. E = A ∪ B: the set of elements in A or B or both (can also be written A + B). E = A ∩ B: the set of elements in both A and B (can also be written A·B).
• 34. We can use a Venn diagram for graphical visualisation of sets. A^c, the complement of A in S, contains the elements of S not in A. Disjoint sets: A ∩ B = ∅, no intersection. Other properties: A ∪ ∅ = A, A ∪ S = S, A ∩ ∅ = ∅, A ∩ S = A. If B ⊂ A, then A ∪ B = A and A ∩ B = B. (Venn diagram: sets A and B inside S, with overlap A ∩ B.)
• 35. If every element of a set A is also an element of another set B, A is said to be contained in B, or A is a subset of B: A ⊆ B. If at least one element exists in B which is not in A, then A is a proper subset of B, denoted A ⊂ B. The null set ∅ is a subset of every set: ∅ ⊆ A ⊆ B.
• 36. Two sets A and B are mutually exclusive if they have no common elements: A ∩ B = ∅. Example: let the sets A and B be given as A = {1, 3, 5, 7} and B = {2, 4, 6, 8, 10, 12, 14}.
• 37. Other properties A ∪ B = B ∪ A, A ∩ B = B ∩ A (commutativity) (A ∪ B) ∪ C = A ∪ (B ∪ C), (A ∩ B) ∩ C = A ∩ (B ∩ C) (associativity) (A ∪ B) ∩ C = (A ∩ C) ∪ (B ∩ C) (distributivity)
• 38. {A1, . . ., AM} form a partition of S if Ai ∩ Aj = ∅ for i ≠ j and A1 ∪ A2 ∪ . . . ∪ AM = S. Cartesian product: C = A × B (the set of all ordered pairs (i, j) with i ∈ A and j ∈ B). Example: A = {h1, t1}, B = {h2, t2} gives A × B = {h1h2, h1t2, t1h2, t1t2}. (Figure: partition of S into A1, A2, A3, A4, A5.)
• 39. De Morgan's Laws: (A ∪ B)^c = A^c ∩ B^c and (A ∩ B)^c = A^c ∪ B^c
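These identities are easy to check numerically with Python's built-in set type. A minimal sketch, with S, A and B chosen arbitrarily for illustration:

```python
# Verify De Morgan's laws with Python sets.
S = set(range(1, 11))          # universal set (arbitrary illustration choice)
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}

def complement(X):
    return S - X               # X^c relative to S

print((A | B) == (B | A))                                    # commutativity: True
print(complement(A | B) == complement(A) & complement(B))    # (A ∪ B)^c = A^c ∩ B^c: True
print(complement(A & B) == complement(A) | complement(B))    # (A ∩ B)^c = A^c ∪ B^c: True
```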
• 40. Difference: consider two sets A and B. The set A − B contains all elements of A that are not present in B. Example: let A = {0.6 < a ≤ 1.6} and B = {1.0 ≤ b ≤ 2.5}. Then A − B = {0.6 < c < 1.0}. Similarly, B − A = {1.6 < d ≤ 2.5} ≠ A − B. (Figure: the intervals A, B and A − B on the number line.)
• 41. Union: the union of two sets A and B, call it C, is written C = A ∪ B (sometimes called the sum of the two sets). Intersection: the intersection of two sets A and B, call it D, is written D = A ∩ B (sometimes called the product of the two sets).
• 42. The union and intersection can be repeated for N sets: C = A1 ∪ A2 ∪ ... ∪ AN = ∪_{i=1}^{N} Ai and D = A1 ∩ A2 ∩ ... ∩ AN = ∩_{i=1}^{N} Ai
• 44. 1.5 Derivation of probabilities from sets • The idea is to combine the concept of sets with the concept of a function (the probability), which must satisfy the axiomatic properties. • A ∪ A^c = S and A ∩ A^c = ∅; since A and A^c are disjoint, P(A ∪ A^c) = P(A) + P(A^c) = P(S) = 1, so P(A^c) = 1 − P(A). • Letting A = S gives A^c = ∅, so P(∅) = 1 − P(S) = 1 − 1 = 0. • A ∪ B = A ∪ (B ∩ A^c), so P(A ∪ B) = P(A) + P(B ∩ A^c); also B = (A ∩ B) ∪ (B ∩ A^c), so P(B) = P(A ∩ B) + P(B ∩ A^c). Combining the two gives P(A ∪ B) = P(A) + P(B) − P(A ∩ B). (Venn diagram: A, B and A ∩ B in S.)
  • 45. Data Analytics Pipeline based on Probabilities
• 46. 1.6 Conditional Probabilities In some experiments, we need the probability of an event occurring given that (conditional on) another event has occurred. We denote this conditional probability as P(A/B): the conditional probability of A given B, assuming P(B) > 0. P(A/B) is defined as P(A/B) = P(A ∩ B)/P(B) (assuming P(B) > 0). If A ∩ B = ∅, then P(A/B) = 0. If B ⊂ A, then P(A/B) = 1 (since A ∩ B = B). If A ⊂ B, then P(A/B) = P(A)/P(B) ≥ P(A). Does P(A/B) satisfy the 3 axioms (is it a valid probability function)? Yes: • P(A/B) ≥ 0 • P(S/B) = 1 • if D and E are mutually exclusive, P(D ∪ E / B) = P((D ∪ E) ∩ B)/P(B) = (P(D ∩ B) + P(E ∩ B))/P(B) = P(D/B) + P(E/B)
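The definition P(A/B) = P(A ∩ B)/P(B) can be checked by direct enumeration. A minimal sketch for two fair dice; the particular events A and B are arbitrary illustration choices:

```python
from itertools import product
from fractions import Fraction

# Two fair dice; A = "sum is at least 10", B = "first die shows a 6".
outcomes = list(product(range(1, 7), repeat=2))
A = {o for o in outcomes if sum(o) >= 10}
B = {o for o in outcomes if o[0] == 6}

def P(event):
    return Fraction(len(event), len(outcomes))

print("P(A)   =", P(A))              # 6/36 = 1/6
print("P(B)   =", P(B))              # 6/36 = 1/6
print("P(A/B) =", P(A & B) / P(B))   # (3/36)/(1/6) = 1/2, much larger than P(A)
```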
• 47. Example • The probability of selling a mobile phone on a given normal day may be only 30%. • But if that day falls on a long weekend, there is a much higher chance of selling a mobile! • The conditional probability of selling a mobile on a day, given that we have a long weekend break, might be 40%. • We can represent these probabilities as • P(mobile sold on a random day) = 30% • P(mobile sold given that today is a long weekend) = 40% • So conditional probability helps data scientists get better results from a given data set and helps machine learning engineers build better models for prediction.
  • 48. Example • Probability of defaulting on paying the loan based on age group. • This will help predict how many people may default on paying the loans when they apply for such loans.
• 49. 1.7 Independent Events Within a sample space, we call 2 events independent if the probability of occurrence of one is not affected by the occurrence of the other event. If A and B are two independent events, then P(A/B) = P(A); since P(A/B) = P(A ∩ B)/P(B), this gives P(A ∩ B) = P(A)·P(B). Note: P(A/B) = P(A) ⇔ P(B/A) = P(B) ⇔ P(A ∩ B) = P(A)·P(B). The independence assumption simplifies statistical problems. N events A1, ..., AN are called statistically independent if P(Ai ∩ Aj) = P(Ai)·P(Aj), P(Ai ∩ Aj ∩ Ak) = P(Ai)·P(Aj)·P(Ak), ..., P(A1 ∩ A2 ∩ ... ∩ AN) = P(A1)·P(A2)···P(AN) are satisfied for all 1 ≤ i < j < ... ≤ N. Other properties: if A1 and A2 are independent events, then A1 and A2^c are independent, A1^c and A2 are independent, and A1^c and A2^c are independent.
• 50. Independent Events • You flip a coin and get heads, and you flip a second coin and get tails. The two coins don't influence each other. • The probability of rain today and the probability of a pop quiz. Non-independent Events • The probability of snow today and the probability of a pop quiz: snow causes school closings, in which case your teacher can't give you a pop quiz. • The chance that you are hungry right now and the chance that you're eating right now: obviously one leads to the other eventually...
• 51. Method to Identify Independent Events Before applying probability formulas, one needs to identify whether the events are independent. A few steps for checking whether the events are dependent or independent: Step 1: Check whether the events can happen in any order. If yes, go to Step 2; otherwise go to Step 3. Step 2: Check whether one event affects the outcome of the other event. If yes, go to Step 4; otherwise go to Step 3. Step 3: The events are independent. Use the formula for independent events and get the answer. Step 4: The events are dependent. Use the formula for dependent events and get the answer.
• 52. Example: A juggler has seven red, five green, and four blue balls. During his stunt, he accidentally drops a ball and then picks it up. As he continues, another ball falls. What is the probability that the first ball that was dropped is blue, and the second ball is green? Since the first ball is picked up again by the juggler, the sample space for both drops contains all 16 balls, and the two events are independent. The probability that the first ball is blue: P(blue) = 4/16. The probability that the second ball is green: P(green) = 5/16. The probability that the first ball is blue and the second ball is green: P(blue and green) = P(blue) × P(green) = 4/16 × 5/16 = 5/64.
  • 53. Simple examples of dependent events: • Robbing a bank and going to jail. • Not paying your power bill on time and having your power cut off. • Boarding a plane first and finding a good seat. • Parking illegally and getting a parking ticket. Parking illegally increases your odds of getting a ticket. • Driving a car and getting in a traffic accident.
• 54. 1.8 Total Probabilities Consider an event B in the sample space S, and a partition {A1, ..., AN} of S. Recall: B = B ∩ S = B ∩ (A1 ∪ ... ∪ AN) = (B ∩ A1) ∪ ... ∪ (B ∩ AN), so P(B) = P(B ∩ A1) + ... + P(B ∩ AN). However, recall that P(B ∩ Ai) = P(B/Ai)·P(Ai), so P(B) = Σ_{i=1}^{N} P(B/Ai)·P(Ai). (Figure: B overlapping a partition A1, ..., A5 of S.)
• 55. Example: 3 boxes of capacitors. Experiment: randomly select a box, then randomly select a capacitor.
              Box 1   Box 2   Box 3   Total
C1 {0.01 F}      20      95      25     140
C2 {0.1 F}       55      35      75     165
C3 {1.0 F}       70      80     145     295
Total           145     210     245     600
Probability of selecting a 0.01 F capacitor given that box 2 was selected? P(C1/B2) = 95/210. If a 0.01 F capacitor is selected, what is the probability that it comes from box 3? P(B3/C1) = P(C1 ∩ B3)/P(C1) = P(C1/B3)·P(B3) / {P(C1/B1)·P(B1) + P(C1/B2)·P(B2) + P(C1/B3)·P(B3)}
• 56. Example: 2 boxes containing $1 and $2 coins. Box 1: 8 $1 coins, 2 $2 coins. Box 2: 5 $1 coins, 20 $2 coins. A box is selected at random and a coin is selected. Probability of a $2 coin being selected? Assume P(B1) = P(B2) = 1/2. P($1/B1) = 8/10, P($2/B1) = 2/10, P($1/B2) = 5/25, P($2/B2) = 20/25. P($2) = P($2/B1)·P(B1) + P($2/B2)·P(B2) = 2/10 × 1/2 + 20/25 × 1/2 = 1/10 + 2/5 = 1/2.
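A minimal sketch of the total probability rule applied to the two-box coin example above, using exact fractions:

```python
from fractions import Fraction

# Total probability: P(B) = sum_i P(B/Ai) * P(Ai), with the boxes as partition.
P_box = {"B1": Fraction(1, 2), "B2": Fraction(1, 2)}
P_2dollar_given_box = {"B1": Fraction(2, 10), "B2": Fraction(20, 25)}

P_2dollar = sum(P_2dollar_given_box[b] * P_box[b] for b in P_box)
print("P($2) =", P_2dollar)   # 1/2
```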
• 57. 1.9 Bayes Theorem (Posterior Probability) How do we calculate the revised probability of Ai given B (bringing in additional information)? P(A1/B)? P(A2/B)? ... P(AN/B)?
P(Ai/B) = P(Ai ∩ B)/P(B) = P(B/Ai)·P(Ai) / [P(B/A1)·P(A1) + ... + P(B/AN)·P(AN)] = P(B/Ai)·P(Ai) / Σ_{j=1}^{N} P(B/Aj)·P(Aj)
• 60. Example: for a mid-term exam in DAT203, the probability of a student studying is 0.7. For those who study, the probability of passing is 0.9. For those who don't study, the probability of passing is 0.05. Given that a student did not pass, what is the probability that he studied? Event studied: ST, event passed: PA.
P(ST/PA^c) = P(PA^c/ST)·P(ST) / [P(PA^c/ST)·P(ST) + P(PA^c/ST^c)·P(ST^c)] = (0.1 × 0.7) / (0.1 × 0.7 + 0.95 × 0.3) ≈ 0.197
• 61. Example: Corona Test • Suppose that a test for COVID is 97% sensitive and 95% specific. That is, the test produces 97% true positive results for COVID patients and 95% true negative results for non-COVID people. • These are the pieces of data that any screening test will have from its history of tests. Bayes' rule allows us to use this kind of data-driven knowledge to calculate the final probability. What is the probability that a randomly selected individual with a positive test is a COVID patient? • P(TP/C) = 0.97 is the sensitivity (TP: test positive) • P(TN/C^c) = 0.95 is the specificity (TN: test negative) • P(C) = 0.005 is the prevalence • P(C/TP) = P(TP/C)·P(C)/P(TP) = (0.97 × 0.005) / (0.97 × 0.005 + 0.05 × 0.995) ≈ 0.089: only an 8.9% chance that someone has COVID even when the test is positive!
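A minimal sketch of this Bayes' rule calculation in Python, using the sensitivity, specificity and prevalence quoted above:

```python
# Bayes' theorem for the screening-test example.
sensitivity = 0.97        # P(TP / C)
specificity = 0.95        # P(TN / C^c), so the false-positive rate is 0.05
prevalence  = 0.005       # P(C)

# Total probability of testing positive
p_tp = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)

# Posterior probability of COVID given a positive test
p_c_given_tp = sensitivity * prevalence / p_tp
print(f"P(C / TP) = {p_c_given_tp:.3f}")   # ≈ 0.089
```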
• 62. Bayes Theorem: Example At AU, 30% of students are freshmen (Fr), of whom 10% own a car; 40% are sophomores (So), of whom 20% own a car; 20% are juniors (Ju), of whom 40% own a car; and 10% are seniors (Se), of whom 60% own a car. If a student has a car, what is the probability that he is a junior?
P(C) = P(C/Fr)·P(Fr) + P(C/So)·P(So) + P(C/Ju)·P(Ju) + P(C/Se)·P(Se) = 0.1×0.3 + 0.2×0.4 + 0.4×0.2 + 0.6×0.1 = 0.25 → 25%
P(Ju/C) = P(C/Ju)·P(Ju) / P(C) = (0.4 × 0.2) / 0.25 = 0.32
• 63. 1.10 Multiple Random Experiments • Most experiments considered until now assume outcomes from a single experiment • In some applications, we may need to combine experiments performed at the same time or in a sequence • Example: measuring pressure and wind at a location: S1 = {L, M, H} and S2 = {NW, LW, MW, FW}. Combined sample space: S = S1 × S2 = {(L, NW), (L, LW), ..., (H, FW)} • Example: flipping a coin and tossing a die: S = {H1, H2, H3, ..., T6}
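Combined sample spaces are just Cartesian products, which can be enumerated directly. A minimal sketch following the pressure/wind example above (the category labels are kept as in the slide):

```python
from itertools import product

# Combined sample space S = S1 x S2 for two experiments performed together.
S1 = ["L", "M", "H"]                  # pressure categories
S2 = ["NW", "LW", "MW", "FW"]         # wind categories

S = list(product(S1, S2))             # all (pressure, wind) pairs
print(len(S))                         # 3 * 4 = 12 combined outcomes
print(S[:3], "...", S[-1])            # ('L','NW'), ('L','LW'), ..., ('H','FW')
```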
• 64. Repeating Experiments in Sequence When repeating the same experiment in sequence, the outcome of the first may or may not affect the outcome of the second. Example: 10 balls, 5 R, 2 G, 3 B, so P(R) = 5/10 = 0.5. When we repeat the experiment, there are 2 scenarios: Scenario 1 (with replacement): we got a R ball and we put it back → P(R) = 5/10 on the next draw. Scenario 2 (without replacement): we got a R ball and we leave it out → P(R) = 4/9 on the next draw. Two issues are important in combined experiments: • Whether or not order is important • Whether or not we have replacement
• 65. 1.11 Permutations and Combinations Fundamental Counting Principle The Fundamental Counting Principle can be used to determine the number of possible outcomes when there are two or more characteristics. It states that if an event has m possible outcomes and another independent event has n possible outcomes, then there are m·n possible outcomes for the two events together.
• 66. Fundamental Counting Principle Let's start with a simple example. A student is to roll a die and flip a coin. How many possible outcomes will there be? 1H 2H 3H 4H 5H 6H 1T 2T 3T 4T 5T 6T → 12 outcomes; 6 × 2 = 12 outcomes
• 67. Fundamental Counting Principle For a college interview, Robert Ahmed has to choose what to wear from the following: 4 pants, 3 shirts, 2 pairs of shoes and 5 ties. How many possible outfits does he have to choose from? 4 × 3 × 2 × 5 = 120 outfits
• 68. Permutations A Permutation is an arrangement of items in a particular order. Notice: order is important! To find the number of Permutations of n items, we can use the Fundamental Counting Principle or factorial notation.
• 69. Permutations The number of ways to arrange the letters ABC: ___ ___ ___. Number of choices for the first blank? 3. Number of choices for the second blank? 2. Number of choices for the third blank? 1. So 3 × 2 × 1 = 6, i.e. 3! = 3·2·1 = 6: ABC ACB BAC BCA CAB CBA
• 70. Permutations Practice: A combination lock will open when the right choice of three numbers (from 1 to 30, inclusive) is selected. How many different lock combinations are possible assuming no number is repeated? 30P3 = 30!/(30 − 3)! = 30!/27! = 30 × 29 × 28 = 24360
  • 71. Combinations A Combination is an arrangement of items in which order does not matter. ORDER DOES NOT MATTER! Since the order does not matter in combinations, there are fewer combinations than permutations. The combinations are a "subset" of the permutations.
• 72. Combinations To find the number of Combinations of n items chosen r at a time, you can use the formula nCr = n! / (r!(n − r)!), where 0 ≤ r ≤ n.
• 73. Combinations To find the number of Combinations of n items chosen r at a time, you can use the formula nCr = n!/(r!(n − r)!), where 0 ≤ r ≤ n. Example: 5C3 = 5!/(3!(5 − 3)!) = 5!/(3!2!) = (5 × 4 × 3 × 2 × 1)/((3 × 2 × 1)(2 × 1)) = (5 × 4)/(2 × 1) = 20/2 = 10
• 74. Permutations and Combinations Number of Permutations Consider repeating an experiment without replacement. • Example: a box of 3 balls numbered 1, 2 and 3. Select 2 balls with no replacement: {12, 13, 21, 23, 31, 32}. The 1st ball has 3 choices but the 2nd ball has only 2 choices. Rule: the number of orderings of r elements taken from n elements is n × (n − 1) × ... × (n − r + 1) = n!/(n − r)! = nPr. NOTE: here we have the case of: with order, no replacement. If order is not important (and still no replacement), we have combinations → combination experiment without replacement. Rule: the number of combinations of r objects from n is nCr = n!/(r!(n − r)!).
• 75. Example: choosing 2 out of 4 → 4!/(2!2!) = 6: {12, 13, 14, 23, 24, 34}, where e.g. 2,1 is counted the same as 1,2. • Consider the case of: with order and with replacement. • Consider n objects: the 1st selection has n possibilities, the 2nd selection has n possibilities, ..., so r selections give n × n × ... × n = n^r ways of selecting with order, with replacement. • If the order is not important → combinations with replacement (no order, with replacement): (n + r − 1)! / (r!(n − 1)!) possible combinations. Summary: Permutation (with order, no replacement): n!/(n − r)!. Combination (no order, no replacement): n!/(r!(n − r)!).
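The four counting cases above can be evaluated directly with Python's standard library. A minimal sketch, with n and r chosen arbitrarily for illustration (n = 4, r = 2, matching the "2 out of 4" example):

```python
from math import comb, perm, factorial

n, r = 4, 2   # arbitrary illustration values

print(perm(n, r))                     # with order, no replacement: n!/(n-r)! = 12
print(comb(n, r))                     # no order, no replacement: n!/(r!(n-r)!) = 6
print(n ** r)                         # with order, with replacement: n^r = 16
print(factorial(n + r - 1) // (factorial(r) * factorial(n - 1)))
                                      # no order, with replacement: (n+r-1)!/(r!(n-1)!) = 10
```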
• 76. Summary Examples Example: 12 students are available and a team of 4 is selected from those 12; find the number of ways this team can be selected. Combination or permutation? Combination, as the order is not important. With replacement or without replacement? Without replacement: once a student is selected, that is it. 12!/(4!8!) = (12 × 11 × 10 × 9)/(4 × 3 × 2) = 495. Example: A student needs to answer 8 questions out of 10; find the number of ways to choose the questions if he must answer the first 3. The first 3 are fixed, so the remaining 5 are chosen from 7: a combination, no replacement: 7!/(5! × 2!) = 21.
• 77. Summary Examples • Find the number of committees of 5 with 1 chairman that can be selected from 12 people. • The chairman: 12 possibilities. • The remaining 4 are chosen from the other 11: a combination (no order), without replacement: 11!/(4!7!) = (11 × 10 × 9 × 8)/(4 × 3 × 2) = 330. • N = 12 × 330 = 3960. • Find the number of distinct permutations of the letters of: THOSE, UNUSUAL, SOCIOLOGICAL. • THOSE: 5 distinct letters → 5!/(5 − 5)! = 5! = 120. • UNUSUAL: n = 7 letters with U repeated 3 times → 7!/3! = 840. • SOCIOLOGICAL: 12 letters (S, O×3, C×2, I×2, L×2, G, A) → 12!/(1! 3! 2! 2! 2! 1! 1!)
• 78. Summary Examples • Example: probability that n people have distinct birthdays (n ≤ 365). • n people, 365 days → 365^n ways the n people can have birthdays. • If the n people have different birthdays: the 1st has 365 possible days, the 2nd has 365 − 1 possible days, and so on, so the probability that all n birthdays are different is 365 × 364 × ... × (365 − n + 1) / 365^n. • The probability that 2 or more share a birthday is P = 1 − 365 × 364 × ... × (365 − n + 1) / 365^n. • n: 10, 20, 22, 23, 30, 60 → P: 0.117, 0.411, 0.476, 0.507, 0.706, 0.994. • In a class of 23 students it is more likely than not that 2 or more share the same birthday!
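A minimal sketch computing the birthday probabilities above in Python:

```python
# Birthday problem: P(2 or more of n people share a birthday)
#                 = 1 - 365*364*...*(365-n+1)/365^n
def p_shared_birthday(n: int) -> float:
    p_distinct = 1.0
    for k in range(n):
        p_distinct *= (365 - k) / 365
    return 1 - p_distinct

for n in (10, 20, 22, 23, 30, 60):
    print(f"n = {n:2}:  P = {p_shared_birthday(n):.3f}")
# Reproduces the table above: 0.117, 0.411, 0.476, 0.507, 0.706, 0.994
```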
• 79. 1.12 Basic Models of Repeated Random Experiments Repeated experiments can be Independent or Non-Independent. A. Independent sub-experiments: when the outcome of one experiment does not affect what happens in the next trial, we say the experiments are independent. Example: tossing 2 fair coins. However, assume we have 2 coins (one fair, one biased with e.g. P(H) = 1/4): we first choose a coin at random; if the outcome is H, we use the fair coin for the next trial, and if T, we use the biased coin for the next trial — the trials are then dependent. For independent trials: P(A) = P(A1 ∩ A2 ∩ ... ∩ Am) = P(A1)···P(Am). For dependent trials: P(A) = P(Am / Am−1, ..., A1) · P(Am−1 / Am−2, ..., A1) ··· P(A2 / A1) · P(A1), where Ai is the event describing trial (sub-experiment) i, and Am may depend on all of Am−1, ..., A1 or only on Am−1.
• 80. Binomial Probability Law • When we generalize the Bernoulli trial (a single random trial with two possible outcomes) to N independent trials, we obtain the Binomial Probability Law • The aim, in this case, is usually to find the probability of k wins in N trials (or k successes, yeses, heads, 1's, ...) • Consider 2 wins in 3 trials: {WWL, WLW, LWW}. The number of such sequences can be seen as a combination without replacement: n = 3, r = 2 • Each sequence has probability P·P·(1 − P), P being the probability of a win, i.e. P²(1 − P) • Since all the sequences above are mutually exclusive, P(2 wins) = 3·P²(1 − P) = [3!/(2!(3 − 2)!)]·P²(1 − P) • The general expression for the probability of k wins in N independent trials, each with probability of success P, is P(k) = C(N, k)·P^k·(1 − P)^(N−k) = [N!/(k!(N − k)!)]·P^k·(1 − P)^(N−k)
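A minimal sketch of the binomial formula in Python, also answering the "60 or more heads in 100 tosses" question from slide 17:

```python
from math import comb

# Binomial law: P(k) = C(N, k) * P**k * (1 - P)**(N - k)
def binom_pmf(k: int, N: int, p: float) -> float:
    return comb(N, k) * p**k * (1 - p)**(N - k)

print(binom_pmf(2, 3, 0.5))                                   # 2 wins in 3 fair trials = 0.375

# Probability of 60 or more heads in 100 tosses of a fair coin
print(sum(binom_pmf(k, 100, 0.5) for k in range(60, 101)))    # ≈ 0.028
```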
• 81. Generalization of the Binomial: Multinomial Probability Law (Not covered) • The Bernoulli experiment can be extended to the case of more than 2 possible outcomes, repeated N times • Example: 3 possible outcomes with probabilities P1, P2, P3, repeated N = 6 times. For a possible count (k1 = 2, k2 = 3, k3 = 1), one possible sequence is 1 1 2 2 2 3, with probability P1² P2³ P3¹ • Counting the orderings (a combination-type count), there are N!/(k1! k2! k3!) such sequences, so P(k1, k2, k3) = [6!/(k1! k2! k3!)] P1^k1 P2^k2 P3^k3 • For N trials with M possible outcomes, the probability of the counts k1, k2, ..., kM is: P(k1, ..., kM) = [N!/(k1!···kM!)] P1^k1 ··· PM^kM, with k1 + k2 + ... + kM = N • Example: a box with 5 red balls, 10 blue, 3 green (P1 = 5/18, P2 = 10/18, P3 = 3/18). Probability of selecting 2 R, 3 B and 1 G in N = 6 draws (with replacement, so the probabilities stay constant)
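A minimal sketch evaluating the multinomial formula for the box example above (2 red, 3 blue, 1 green in 6 draws):

```python
from math import factorial

# Multinomial law: P(k1,...,kM) = N!/(k1!...kM!) * P1^k1 * ... * PM^kM
def multinomial_pmf(counts, probs):
    N = sum(counts)
    coef = factorial(N)
    for k in counts:
        coef //= factorial(k)
    p = 1.0
    for k, prob in zip(counts, probs):
        p *= prob**k
    return coef * p

print(multinomial_pmf((2, 3, 1), (5/18, 10/18, 3/18)))   # ≈ 0.132
```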
• 82. Geometric Probability Law • Another probability law related to the Bernoulli sequence concerns the appearance of the first success • Now, we fix the number of successes and count the number of trials • Event: f f ... f s, with k − 1 failures followed by the first success • The probability of the first success at trial k is: P(k) = (1 − P)^(k−1)·P, where k = 1, 2, ... • Example: the probability of making a phone call successfully is P = 0.9. Probability of the first successful call at the k-th trial: k = 1: 0.9; k = 2: 0.1 × 0.9 = 0.09; k = 3: 0.1² × 0.9 = 0.009 (decreasing: each additional condition reduces the probability) NOTE: the geometric law can be extended to the k-th success and finding the number of trials needed. Non-Independent Trials: past outcomes affect the present outcome (the past can be one or more trials). Common models used are based on Markov Chains.
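A minimal sketch of the geometric law using the phone-call example above:

```python
# Geometric law: P(first success at trial k) = (1 - P)**(k - 1) * P
P = 0.9   # probability of a successful call, as in the example

for k in range(1, 5):
    print(f"P(first success at trial {k}) = {(1 - P)**(k - 1) * P:.4f}")
# 0.9000, 0.0900, 0.0090, 0.0009 — the probabilities decrease geometrically
```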