This presentation introduces the basic discrete distribution of a bivariate random variable, discusses the joint distribution together with marginal tables, and works through illustrative examples for ease of understanding. It is aimed at junior engineers covering the topic in their course.
3. ABSTRACT
Bivariate Discrete Distributions details the latest techniques of computer simulation for the distributions considered. It contains a general introduction to the structural properties of discrete distributions, including generating functions, moment relationships, the basic ideas of generalization, and much more.
5. INTRODUCTION
In this presentation we will consider two or more random variables defined
on the same sample space and discuss how to model the probability
distribution of the random variables jointly. We will begin with the discrete
case by looking at the joint probability mass function for two discrete
random variables.
6. DISCUSSION
JOINT DISTRIBUTION OF RANDOM BIVARIATES
Let X be a random variable assuming the values
X : x1 x2 x3 ……. xm
Let Y be a random variable assuming the following values, each corresponding to some xi:
Y : y1 y2 y3 ……. yn
In the table shown on the next slide there are mn pairs (xi, yj). These pairs are known as Bivariate Data.
Also, the quantities
pij = probability that (X,Y) assumes the pair (xi, yj)
= P { (xi, yj) }
= P ( X = xi, Y = yj )
are known as the Joint Probability Mass of the bivariate (X,Y).
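A minimal sketch (not from the slides) of how such a joint probability mass can be represented in code; the values below are hypothetical and only illustrate the data structure.

# Joint pmf of a discrete bivariate (X, Y), keyed by the pair (xi, yj).
# The probabilities are made-up illustration values; any non-negative
# table whose entries sum to 1 would do.
joint_pmf = {
    (1, 10): 0.10, (1, 20): 0.20,
    (2, 10): 0.30, (2, 20): 0.40,
}

def p(x, y):
    """Return P(X = x, Y = y); pairs not in the table have probability 0."""
    return joint_pmf.get((x, y), 0.0)

print(p(2, 10))                   # 0.3
print(sum(joint_pmf.values()))    # ≈ 1.0 (up to floating-point rounding)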
7. BIVARIATE JOINT DISTRIBUTION TABLE
The row-wise and column-wise totals are:
• pXi = pi1 + pi2 + …. + pin = Σⱼ pij  (summing over j = 1, …, n)
• pYj = p1j + p2j + …. + pmj = Σᵢ pij  (summing over i = 1, …, m)
The grand total:
• Σᵢ pXi = Σⱼ pYj = Σᵢ Σⱼ pij = 1
8. MARGINAL DISTRIBUTION
The following probability distribution of X is called the Marginal Distribution of X:
X : x1 x2 x3 …… xm Total
pXi : pX1 pX2 pX3 …… pXm 1
Similarly, the following probability distribution of Y is called the Marginal Distribution of Y:
Y : y1 y2 y3 …… yn Total
pYj : pY1 pY2 pY3 …… pYn 1
The row-wise totals pXi and the column-wise totals pYj are called the Marginal Probability Mass of
X and Y respectively.
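As a rough illustration (again not from the slides), the marginal distributions and the grand-total check can be computed directly from such a joint table; the numbers are the same hypothetical ones used above.

from collections import defaultdict

# Hypothetical joint pmf keyed by (xi, yj); values chosen only for illustration.
joint_pmf = {
    (1, 10): 0.10, (1, 20): 0.20,
    (2, 10): 0.30, (2, 20): 0.40,
}

# Row-wise totals pXi = sum over j of pij, column-wise totals pYj = sum over i of pij.
marginal_x = defaultdict(float)
marginal_y = defaultdict(float)
for (x, y), p in joint_pmf.items():
    marginal_x[x] += p
    marginal_y[y] += p

print(dict(marginal_x))   # ≈ {1: 0.3, 2: 0.7}
print(dict(marginal_y))   # ≈ {10: 0.4, 20: 0.6}

# Grand total: each marginal, like the full table, sums to 1.
print(sum(marginal_x.values()), sum(marginal_y.values()))   # ≈ 1.0 each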
9. INDEPENDENT RANDOM VARIABLES
Let (X,Y) be a pair of random variables having the joint distribution discussed in the previous slides. If
pij = P ( X = xi, Y = yj ) = P ( X = xi ) P ( Y = yj ) = pXi pYj
holds for all values of i ( 1 ≤ i ≤ m ) and j ( 1 ≤ j ≤ n ), then X and Y are called Independent Random
Variables.
Theorem:
If X and Y are independent random variables and A, B are two events then
P { X ∈ A, Y ∈ B } = P ( X ∈ A ) P ( Y ∈ B )
and vice versa.
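A small sketch (not from the source slides) of the pointwise check pij = pXi × pYj, applied to a hypothetical joint table stored as a dictionary.

import math
from collections import defaultdict

def is_independent(joint_pmf, tol=1e-9):
    """Return True if pij == pXi * pYj holds for every cell of the table."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in joint_pmf.items():
        px[x] += p
        py[y] += p
    return all(
        math.isclose(p, px[x] * py[y], abs_tol=tol)
        for (x, y), p in joint_pmf.items()
    )

# Hypothetical table in which every entry is the product of its marginals.
independent_table = {(0, 0): 0.35, (0, 1): 0.35, (1, 0): 0.15, (1, 1): 0.15}
print(is_independent(independent_table))   # True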
10. ILLUSTRATIVE EXAMPLE 1
Problem:
Let (X,Y) be a bivariate random variable with the joint distribution given in the table on this slide.
Check whether X and Y are independent or not.
Solution:
Here we see that every entry in the main body of the table equals the product of the corresponding
entries in the last column and last row, e.g. 0.35 = 0.70 × 0.50, 0.06 = 0.30 × 0.20, etc.
That is
P ( X = 0.20, Y = 5 ) = P ( X = 0.20 ) P ( Y = 5),
P ( X = 9, Y = 7 ) = P ( X = 9 ) P ( Y = 7 ) etc.
So, X and Y are independent random variables.
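The two products quoted above can be verified directly; this tiny check simply restates the slide's arithmetic.

import math

# Each quoted joint probability equals the product of its marginal totals.
assert math.isclose(0.35, 0.70 * 0.50)
assert math.isclose(0.06, 0.30 * 0.20)
print("quoted cells satisfy pij = pXi * pYj")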
11. ILLUSTRATIVE EXAMPLE 2
Problem:
An urn contains 3 Red, 2 White and 5 Blue balls. Three balls are drawn from the urn. X and Y
denote the number of Red and White balls in a draw. Find the Joint Distribution of (X,Y).
Hence find P ( X ≤ 2, Y ≥ 1 ).
Find the Marginal Distribution of Y and hence find the probability of drawing more than
one White ball. Are X and Y independent random variables?
Solution:
Let X and Y denote the number of Red and White balls drawn, respectively.
Take the values of X as 0, 1, 2, 3 and Y as 0, 1, 2.
Create a table with values of the bivariate (X,Y).
Where, P { (xi,yj) } = P ( X = xi, Y = yj )
The corresponding probabilities are:
1) P(0,0) = Probability of “no Red”, “no White”, “3 Blue” = ⁵C₃ / ¹⁰C₃ = 0.083
12. ILLUSTRATIVE EXAMPLE 2 - CONTINUED
2) P(0,1) = Probability of “no Red”, “1 White”, “2 Blue” = (²C₁ × ⁵C₂) / ¹⁰C₃ = 0.16
3) P(0,2) = Probability of “no Red”, “2 White”, “1 Blue” = (²C₂ × ⁵C₁) / ¹⁰C₃ = 0.041
4) P(1,0) = Probability of “1 Red”, “no White”, “2 Blue” = (³C₁ × ⁵C₂) / ¹⁰C₃ = 0.25
5) P(1,1) = Probability of “1 Red”, “1 White”, “1 Blue” = (³C₁ × ²C₁ × ⁵C₁) / ¹⁰C₃ = 0.25
6) P(1,2) = Probability of “1 Red”, “2 White”, “no Blue” = (³C₁ × ²C₂) / ¹⁰C₃ = 0.025
7) P(2,0) = Probability of “2 Red”, “no White”, “1 Blue” = (³C₂ × ⁵C₁) / ¹⁰C₃ = 0.125
8) P(2,1) = Probability of “2 Red”, “1 White”, “no Blue” = (³C₂ × ²C₁) / ¹⁰C₃ = 0.05
9) P(2,2) = Probability of “2 Red”, “2 White”, “no Blue” = P(ϕ) = 0
10) P(3,0) = Probability of “3 Red”, “no White”, “no Blue” = ³C₃ / ¹⁰C₃ = 0.008
11) P(3,1) = Probability of “3 Red”, “1 White”, “no Blue” = P(ϕ) = 0
12) P(3,2) = Probability of “3 Red”, “2 White”, “no Blue” = P(ϕ) = 0
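A short sketch (not part of the original slides) that recomputes this joint table with exact combinatorics; the printed values agree with the rounded figures listed above.

from math import comb

# Urn: 3 Red, 2 White, 5 Blue; 3 balls drawn without replacement.
# X = number of Red balls drawn, Y = number of White balls; Blue fills the rest.
RED, WHITE, BLUE, DRAWS = 3, 2, 5, 3
total = comb(RED + WHITE + BLUE, DRAWS)            # 10C3 = 120

joint = {}
for x in range(RED + 1):
    for y in range(WHITE + 1):
        b = DRAWS - x - y                          # Blue balls needed to complete the draw
        if 0 <= b <= BLUE:
            joint[(x, y)] = comb(RED, x) * comb(WHITE, y) * comb(BLUE, b) / total
        else:
            joint[(x, y)] = 0.0                    # impossible cells such as P(2,2)

for (x, y), prob in sorted(joint.items()):
    print(f"P({x},{y}) = {prob:.3f}")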
13. ILLUSTRATIVE EXAMPLE 2 - CONTINUED
∴ The Joint Distribution of (X,Y) is given by the probabilities computed above.
Now, P (X ≤ 2, Y ≥ 1 )
= P ( 2,1 ) + P ( 2,2 ) + P ( 1,1 ) + P ( 1,2 ) + P ( 0,1 ) + P ( 0,2 )
= 0.05 + 0 + 0.25 + 0.025 + 0.16 + 0.041 = 0.526
The Marginal Distribution of Y is given by
Y : 0 1 2
pYj : 0.466 0.466 0.066
Probability of “more than 1 White ball”:
P ( Y > 1 ) = P ( Y = 2 ) = 0.066
From the marginal totals we see that P ( X = 0, Y = 0 ) = 0.083 ≠ pX0 × pY0 = 0.284 × 0.466
∴ X and Y are not independent.
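As a cross-check (not from the source), the requested probabilities and the marginal of Y can be read off the rounded table values quoted in the slides.

# Joint probabilities as rounded on the slides (X = Red count, Y = White count).
joint = {
    (0, 0): 0.083, (0, 1): 0.16,  (0, 2): 0.041,
    (1, 0): 0.25,  (1, 1): 0.25,  (1, 2): 0.025,
    (2, 0): 0.125, (2, 1): 0.05,  (2, 2): 0.0,
    (3, 0): 0.008, (3, 1): 0.0,   (3, 2): 0.0,
}

# P(X <= 2, Y >= 1): sum the cells that satisfy both conditions.
p_event = sum(p for (x, y), p in joint.items() if x <= 2 and y >= 1)
print(round(p_event, 3))                           # 0.526, as on the slide

# Marginal distribution of Y; the small drift from the slide's 0.466 for Y = 1
# comes from summing rounded cell values (the exact value is 56/120 ≈ 0.466).
marginal_y = {y: sum(p for (x, yy), p in joint.items() if yy == y) for y in (0, 1, 2)}
print({y: round(p, 3) for y, p in marginal_y.items()})   # {0: 0.466, 1: 0.46, 2: 0.066}
print(round(marginal_y[2], 3))                     # P(Y = 2) = 0.066

# Independence check on one cell: P(X=0, Y=0) vs pX0 * pY0.
p_x0 = sum(p for (x, y), p in joint.items() if x == 0)   # 0.284
print(joint[(0, 0)], round(p_x0 * marginal_y[0], 3))     # 0.083 vs 0.132, not equal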
14. APPLICATIONS
The bivariate distribution is useful in analyzing the relationship between two randomly
distributed variables, and so is widely applied in biology and economics, where the relationship
between approximately random variables is of great interest. For instance, one of the earliest
uses of the bivariate distribution was in analyzing the relationship between a father's height and
the height of his eldest son, resolving a question Darwin posed in his book “The Origin of Species”.
Bivariate distributions are also used in measuring systems, such as coordinate measuring
machines (CMMs), laser interferometers, and linear or rotary encoders.
15. CONCLUSION
In real life, we are often interested in several random variables
that are related to each other. For example, suppose that we
choose a random family, and we would like to study the number of
people in the family, the household income, the ages of the family
members, etc. Each of these is a random variable, and we suspect
that they are dependent. In this presentation, we developed the
tools to study joint distributions of random variables.
16. REFERENCES
• [1] https://online.stat.psu.edu/stat414/lesson/17/17.1
• [2] https://bookdown.org/compfinezbook/introcompfinr/Bivariate-Distributions.html
• [3] https://en.wikipedia.org/wiki/Random_variable#Discrete_random_variable
• [4] https://en.wikipedia.org/wiki/Joint_probability_distribution
• [5] https://www.probabilitycourse.com/chapter5/5_1_0_joint_distributions.php
• [6] https://www.stat.ncsu.edu/people/bloomfield/courses/st380/slides/Devore-ch05-sec1-2.pdf
• [7] https://brilliant.org/wiki/multivariate-normal-distribution/
• [8] B.K. Pal & K. Das, Engineering Mathematics Vol. 2A, p. 128, U.N. Dhur & Sons Private Ltd.