Jointly Distributed Random Variables
Multivariate distributions
Quite often there will be 2 or more random
variables (X, Y, etc) defined for the same
random experiment.
Example:
A bridge hand (13 cards) is selected from a deck
of 52 cards.
X = the number of spades in the hand.
Y = the number of hearts in the hand.
In this example we will define:
p(x,y) = P[X = x, Y = y]
The function
p(x,y) = P[X = x, Y = y]
is called the joint probability function of
X and Y.
Note:
The possible values of X are 0, 1, 2, …, 13.
The possible values of Y are also 0, 1, 2, …, 13, and X + Y ≤ 13.

$$p(x,y) = P[X = x,\ Y = y] = \frac{\binom{13}{x}\binom{13}{y}\binom{26}{13-x-y}}{\binom{52}{13}}$$

Here $\binom{52}{13}$ is the total number of ways of choosing the 13 cards for the hand, $\binom{13}{x}$ is the number of ways of choosing the x spades for the hand, $\binom{13}{y}$ is the number of ways of choosing the y hearts for the hand, and $\binom{26}{13-x-y}$ is the number of ways of completing the hand with diamonds and clubs.
Table: p(x,y)
x \ y 0 1 2 3 4 5 6 7 8 9 10 11 12 13
0 0.0000 0.0002 0.0009 0.0024 0.0035 0.0032 0.0018 0.0006 0.0001 0.0000 0.0000 0.0000 0.0000 0.0000
1 0.0002 0.0021 0.0085 0.0183 0.0229 0.0173 0.0081 0.0023 0.0004 0.0000 0.0000 0.0000 0.0000 -
2 0.0009 0.0085 0.0299 0.0549 0.0578 0.0364 0.0139 0.0032 0.0004 0.0000 0.0000 0.0000 - -
3 0.0024 0.0183 0.0549 0.0847 0.0741 0.0381 0.0116 0.0020 0.0002 0.0000 0.0000 - - -
4 0.0035 0.0229 0.0578 0.0741 0.0530 0.0217 0.0050 0.0006 0.0000 0.0000 - - - -
5 0.0032 0.0173 0.0364 0.0381 0.0217 0.0068 0.0011 0.0001 0.0000 - - - - -
6 0.0018 0.0081 0.0139 0.0116 0.0050 0.0011 0.0001 0.0000 - - - - - -
7 0.0006 0.0023 0.0032 0.0020 0.0006 0.0001 0.0000 - - - - - - -
8 0.0001 0.0004 0.0004 0.0002 0.0000 0.0000 - - - - - - - -
9 0.0000 0.0000 0.0000 0.0000 0.0000 - - - - - - - - -
10 0.0000 0.0000 0.0000 0.0000 - - - - - - - - - -
11 0.0000 0.0000 0.0000 - - - - - - - - - - -
12 0.0000 0.0000 - - - - - - - - - - - -
13 0.0000 - - - - - - - - - - - - -
$$p(x,y) = \frac{\binom{13}{x}\binom{13}{y}\binom{26}{13-x-y}}{\binom{52}{13}}$$
Bar graph: p(x,y) for x = 0, 1, …, 13 and y = 0, 1, …, 13 (figure)
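As a quick numerical check (an addition, not part of the original slides), the table entries can be reproduced with a few lines of Python; `p_bridge` is an illustrative helper name:

```python
from math import comb

def p_bridge(x, y):
    """Joint pmf of X = #spades and Y = #hearts in a 13-card bridge hand."""
    if x < 0 or y < 0 or x + y > 13:
        return 0.0
    return comb(13, x) * comb(13, y) * comb(26, 13 - x - y) / comb(52, 13)

print(round(p_bridge(3, 3), 4))   # 0.0847, as in the table
print(round(p_bridge(4, 2), 4))   # 0.0578
print(round(sum(p_bridge(x, y) for x in range(14) for y in range(14)), 6))  # 1.0
```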
Example:
A die is rolled n = 5 times
X = the number of times a “six” appears.
Y = the number of times a “five” appears.
Now
p(x,y) = P[X = x, Y = y]
The possible values of X are 0, 1, 2, 3, 4, 5.
The possible values of Y are 0, 1, 2, 3, 4, 5.
and X + Y ≤ 5
A typical outcome of rolling a die n = 5 times
will be a sequence F5FF6 where F denotes the
outcome {1,2,3,4}. The probability of any such
sequence will be:
$$\left(\tfrac{1}{6}\right)^{x}\left(\tfrac{1}{6}\right)^{y}\left(\tfrac{4}{6}\right)^{5-x-y}$$
where
x = the number of sixes in the sequence and
y = the number of fives in the sequence
Now
$$p(x,y) = P[X = x, Y = y] = K\left(\tfrac{1}{6}\right)^{x}\left(\tfrac{1}{6}\right)^{y}\left(\tfrac{4}{6}\right)^{5-x-y}$$
where K = the number of sequences of length 5 containing x sixes and y fives.
$$K = \binom{5}{x}\binom{5-x}{y} = \frac{5!}{x!\,(5-x)!}\cdot\frac{(5-x)!}{y!\,(5-x-y)!} = \frac{5!}{x!\,y!\,(5-x-y)!}$$
Thus
$$p(x,y) = P[X = x, Y = y] = \frac{5!}{x!\,y!\,(5-x-y)!}\left(\tfrac{1}{6}\right)^{x}\left(\tfrac{1}{6}\right)^{y}\left(\tfrac{4}{6}\right)^{5-x-y} \quad\text{if } x + y \le 5.$$
Table: p(x,y)
x \ y 0 1 2 3 4 5
0 0.1317 0.1646 0.0823 0.0206 0.0026 0.0001
1 0.1646 0.1646 0.0617 0.0103 0.0006 0
2 0.0823 0.0617 0.0154 0.0013 0 0
3 0.0206 0.0103 0.0013 0 0 0
4 0.0026 0.0006 0 0 0 0
5 0.0001 0 0 0 0 0

Bar graph: p(x,y) for x = 0, 1, …, 5 and y = 0, 1, …, 5 (figure)
General properties of the joint probability function
p(x,y) = P[X = x, Y = y]
1. $0 \le p(x, y) \le 1$
2. $\sum_{x}\sum_{y} p(x, y) = 1$
3. $P[(X, Y) \in A] = \sum_{(x, y) \in A} p(x, y)$
Example:
A die is rolled n = 5 times
X = the number of times a “six” appears.
Y = the number of times a “five” appears.
What is the probability that we roll more sixes than fives, i.e. what is P[X > Y]?
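Before reading the answer off the table on the next slides, here is a short Python cross-check (an addition, not part of the original deck); `p_dice` is an illustrative helper name:

```python
from math import factorial

def p_dice(x, y, n=5):
    """Joint pmf of X = #sixes and Y = #fives in n rolls of a fair die."""
    if x < 0 or y < 0 or x + y > n:
        return 0.0
    k = factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
    return k * (1 / 6)**x * (1 / 6)**y * (4 / 6)**(n - x - y)

print(round(sum(p_dice(x, y) for x in range(6) for y in range(6) if x > y), 4))  # 0.3441
```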
Table: p(x,y)
x \ y 0 1 2 3 4 5
0 0.1317 0.1646 0.0823 0.0206 0.0026 0.0001
1 0.1646 0.1646 0.0617 0.0103 0.0006 0
2 0.0823 0.0617 0.0154 0.0013 0 0
3 0.0206 0.0103 0.0013 0 0 0
4 0.0026 0.0006 0 0 0 0
5 0.0001 0 0 0 0 0

$$p(x,y) = \frac{5!}{x!\,y!\,(5-x-y)!}\left(\tfrac{1}{6}\right)^{x}\left(\tfrac{1}{6}\right)^{y}\left(\tfrac{4}{6}\right)^{5-x-y}$$

$$P[X > Y] = \sum_{x > y} p(x, y) = 0.3441$$
Marginal and conditional
distributions
Definition:
Let X and Y denote two discrete random variables
with joint probability function
p(x,y) = P[X = x, Y = y]
Then
pX(x) = P[X = x] is called the marginal probability function of X, and
pY(y) = P[Y = y] is called the marginal probability function of Y.

Note: Let y1, y2, y3, … denote the possible values of Y.
$$p_X(x) = P[X = x] = P\big[\{X = x, Y = y_1\} \cup \{X = x, Y = y_2\} \cup \cdots\big]$$
$$= P[X = x, Y = y_1] + P[X = x, Y = y_2] + \cdots$$
$$= p(x, y_1) + p(x, y_2) + \cdots = \sum_{j} p(x, y_j) = \sum_{y} p(x, y)$$
Thus the marginal probability function of X, pX(x) is
obtained from the joint probability function of X and Y by
summing p(x,y) over the possible values of Y.
Also
$$p_Y(y) = P[Y = y] = P\big[\{X = x_1, Y = y\} \cup \{X = x_2, Y = y\} \cup \cdots\big]$$
$$= P[X = x_1, Y = y] + P[X = x_2, Y = y] + \cdots$$
$$= p(x_1, y) + p(x_2, y) + \cdots = \sum_{i} p(x_i, y) = \sum_{x} p(x, y)$$
Example:
A die is rolled n = 5 times
X = the number of times a “six” appears.
Y = the number of times a “five” appears.
x \ y 0 1 2 3 4 5 p X (x )
0 0.1317 0.1646 0.0823 0.0206 0.0026 0.0001 0.4019
1 0.1646 0.1646 0.0617 0.0103 0.0006 0 0.4019
2 0.0823 0.0617 0.0154 0.0013 0 0 0.1608
3 0.0206 0.0103 0.0013 0 0 0 0.0322
4 0.0026 0.0006 0 0 0 0 0.0032
5 0.0001 0 0 0 0 0 0.0001
p Y (y ) 0.4019 0.4019 0.1608 0.0322 0.0032 0.0001
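The marginal row and column above can be checked numerically; in particular, pX(x) should agree with the Binomial(5, 1/6) pmf. The sketch below is an illustrative addition (with `p_dice` redefined so it runs on its own):

```python
from math import comb, factorial

def p_dice(x, y, n=5):
    if x < 0 or y < 0 or x + y > n:
        return 0.0
    k = factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
    return k * (1 / 6)**x * (1 / 6)**y * (4 / 6)**(n - x - y)

# Marginal of X by summing the joint pmf over y ...
pX = [round(sum(p_dice(x, y) for y in range(6)), 4) for x in range(6)]
# ... which should agree with the Binomial(5, 1/6) pmf
pX_binom = [round(comb(5, x) * (1 / 6)**x * (5 / 6)**(5 - x), 4) for x in range(6)]
print(pX)        # [0.4019, 0.4019, 0.1608, 0.0322, 0.0032, 0.0001]
print(pX_binom)  # same values
```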
Conditional Distributions
Definition:
Let X and Y denote two discrete random variables
with joint probability function
p(x,y) = P[X = x, Y = y]
Then
pX|Y(x|y) = P[X = x | Y = y] is called the conditional probability function of X given Y = y,
and
pY|X(y|x) = P[Y = y | X = x] is called the conditional probability function of Y given X = x.
Note
$$p_{X|Y}(x \mid y) = P[X = x \mid Y = y] = \frac{P[X = x, Y = y]}{P[Y = y]} = \frac{p(x, y)}{p_Y(y)}$$
and
$$p_{Y|X}(y \mid x) = P[Y = y \mid X = x] = \frac{P[X = x, Y = y]}{P[X = x]} = \frac{p(x, y)}{p_X(x)}$$
• Marginal distributions describe how one variable behaves ignoring the other variable.
• Conditional distributions describe how one variable behaves when the other variable is held fixed.
Example:
A die is rolled n = 5 times
X = the number of times a “six” appears.
Y = the number of times a “five” appears.
x \ y 0 1 2 3 4 5 p X (x )
0 0.1317 0.1646 0.0823 0.0206 0.0026 0.0001 0.4019
1 0.1646 0.1646 0.0617 0.0103 0.0006 0 0.4019
2 0.0823 0.0617 0.0154 0.0013 0 0 0.1608
3 0.0206 0.0103 0.0013 0 0 0 0.0322
4 0.0026 0.0006 0 0 0 0 0.0032
5 0.0001 0 0 0 0 0 0.0001
p Y (y ) 0.4019 0.4019 0.1608 0.0322 0.0032 0.0001
The conditional distribution of X given Y = y.
x \ y 0 1 2 3 4 5
0 0.3277 0.4096 0.5120 0.6400 0.8000 1.0000
1 0.4096 0.4096 0.3840 0.3200 0.2000 0.0000
2 0.2048 0.1536 0.0960 0.0400 0.0000 0.0000
3 0.0512 0.0256 0.0080 0.0000 0.0000 0.0000
4 0.0064 0.0016 0.0000 0.0000 0.0000 0.0000
5 0.0003 0.0000 0.0000 0.0000 0.0000 0.0000
pX |Y(x|y) = P[X = x|Y = y]
y \ x 0 1 2 3 4 5
0 0.3277 0.4096 0.2048 0.0512 0.0064 0.0003
1 0.4096 0.4096 0.1536 0.0256 0.0016 0.0000
2 0.5120 0.3840 0.0960 0.0080 0.0000 0.0000
3 0.6400 0.3200 0.0400 0.0000 0.0000 0.0000
4 0.8000 0.2000 0.0000 0.0000 0.0000 0.0000
5 1.0000 0.0000 0.0000 0.0000 0.0000 0.0000
The conditional distribution of Y given X = x.
pY |X(y|x) = P[Y = y|X = x]
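The conditional tables are obtained by dividing the joint table by the appropriate marginal. A small illustrative Python sketch (again redefining `p_dice` so the block is self-contained):

```python
from math import factorial

def p_dice(x, y, n=5):
    if x < 0 or y < 0 or x + y > n:
        return 0.0
    k = factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
    return k * (1 / 6)**x * (1 / 6)**y * (4 / 6)**(n - x - y)

pY = [sum(p_dice(x, y) for x in range(6)) for y in range(6)]

def p_x_given_y(x, y):
    """P[X = x | Y = y] = p(x, y) / pY(y)."""
    return p_dice(x, y) / pY[y]

# First column of the conditional table above (conditioning on Y = 0)
print([round(p_x_given_y(x, 0), 4) for x in range(6)])
# [0.3277, 0.4096, 0.2048, 0.0512, 0.0064, 0.0003]
```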
Example
A Bernoulli trial (success S with probability p, failure F with probability q = 1 – p) is repeated until two successes have occurred.
X = the trial on which the first success occurs
and
Y = the trial on which the 2nd success occurs.
Find the joint probability function of X, Y.
Find the marginal probability functions of X and Y.
Find the conditional probability functions of Y given X = x and X given Y = y.
Solution
A typical outcome would be:
FFF…FS FFF…FS
with x − 1 failures followed by a success on trial x, then y − x − 1 failures followed by a success on trial y. The probability of such an outcome is
$$q^{x-1}\, p\, q^{y-x-1}\, p = p^2 q^{y-2} \quad\text{if } y > x$$
Thus
$$p(x, y) = P[X = x, Y = y] = \begin{cases} p^2 q^{y-2} & \text{if } y > x \\ 0 & \text{otherwise} \end{cases}$$
p(x,y) – Table (rows: x, columns: y)
x \ y 1 2 3 4 5 6 7 8
1 0 p² p²q p²q² p²q³ p²q⁴ p²q⁵ p²q⁶
2 0 0 p²q p²q² p²q³ p²q⁴ p²q⁵ p²q⁶
3 0 0 0 p²q² p²q³ p²q⁴ p²q⁵ p²q⁶
4 0 0 0 0 p²q³ p²q⁴ p²q⁵ p²q⁶
5 0 0 0 0 0 p²q⁴ p²q⁵ p²q⁶
6 0 0 0 0 0 0 p²q⁵ p²q⁶
7 0 0 0 0 0 0 0 p²q⁶
8 0 0 0 0 0 0 0 0
The marginal distribution of X
$$p_X(x) = P[X = x] = \sum_{y} p(x, y) = \sum_{y = x+1}^{\infty} p^2 q^{y-2}$$
$$= p^2 q^{x-1} + p^2 q^{x} + p^2 q^{x+1} + \cdots = p^2 q^{x-1}\left(1 + q + q^2 + q^3 + \cdots\right)$$
$$= p^2 q^{x-1}\,\frac{1}{1 - q} = p\, q^{x-1} \quad\text{for } x = 1, 2, 3, \ldots$$
This is the geometric distribution.
The marginal distribution of Y
$$p_Y(y) = P[Y = y] = \sum_{x} p(x, y) = \begin{cases} (y - 1)\, p^2 q^{y-2} & y = 2, 3, 4, \ldots \\ 0 & \text{otherwise} \end{cases}$$
This is the negative binomial distribution with k = 2.
The conditional distribution of X given Y = y
$$p_{X|Y}(x \mid y) = P[X = x \mid Y = y] = \frac{P[X = x, Y = y]}{P[Y = y]} = \frac{p(x, y)}{p_Y(y)}$$
$$= \frac{p^2 q^{y-2}}{(y - 1)\, p^2 q^{y-2}} = \frac{1}{y - 1} \quad\text{for } x = 1, 2, 3, \ldots, y - 1$$
This is the uniform distribution on the values 1, 2, …, y − 1.
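A small simulation (an addition, not in the slides) can sanity-check these results; the value p = 0.3 and the sample size are arbitrary choices:

```python
import random
from collections import Counter

random.seed(0)
p, q = 0.3, 0.7            # arbitrary success probability for this check
N = 100_000

samples = []
for _ in range(N):
    trial, successes, first = 0, 0, None
    while successes < 2:
        trial += 1
        if random.random() < p:
            successes += 1
            if first is None:
                first = trial
    samples.append((first, trial))    # (X, Y)

# Marginal of X: P[X = x] should be close to the geometric pmf p * q**(x - 1)
countX = Counter(x for x, _ in samples)
print(countX[3] / N, p * q**2)        # both approximately 0.147

# Conditional of X given Y = 6: should be close to uniform (1/5 each) on 1..5
given6 = [x for x, y in samples if y == 6]
print({x: round(given6.count(x) / len(given6), 2) for x in range(1, 6)})
```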
The conditional distribution of Y given X = x
$$p_{Y|X}(y \mid x) = P[Y = y \mid X = x] = \frac{P[X = x, Y = y]}{P[X = x]} = \frac{p(x, y)}{p_X(x)}$$
$$= \frac{p^2 q^{y-2}}{p\, q^{x-1}} = p\, q^{y-x-1} \quad\text{for } y = x + 1, x + 2, x + 3, \ldots$$
This is the geometric distribution with time starting at x.
Summary
Discrete Random Variables
The joint probability function:
p(x,y) = P[X = x, Y = y]
1. $0 \le p(x, y) \le 1$
2. $\sum_{x}\sum_{y} p(x, y) = 1$
3. $P[(X, Y) \in A] = \sum_{(x, y) \in A} p(x, y)$
Continuous Random Variables
Definition: Two random variables are said to have joint probability density function f(x,y) if
1. $f(x, y) \ge 0$
2. $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(x, y)\,dx\,dy = 1$
3. $P[(X, Y) \in A] = \iint_A f(x, y)\,dx\,dy$
If $f(x, y) \ge 0$, then $z = f(x, y)$ defines a surface over the x–y plane, and $\iint_A f(x, y)\,dx\,dy$ is the volume under this surface above the region A.
Multiple Integration
Consider the double integral $\iint_A f(x, y)\,dx\,dy$ of a function f(x, y) over a region A.
If the region A = {(x, y) | a ≤ x ≤ b, c ≤ y ≤ d} is a rectangular region with sides parallel to the coordinate axes, then
$$\iint_A f(x, y)\,dx\,dy = \int_c^d\!\left[\int_a^b f(x, y)\,dx\right] dy = \int_a^b\!\left[\int_c^d f(x, y)\,dy\right] dx$$
To evaluate
$$\int_c^d\!\int_a^b f(x, y)\,dx\,dy$$
first evaluate the inner integral
$$G(y) = \int_a^b f(x, y)\,dx$$
then evaluate the outer integral
$$\int_c^d\!\int_a^b f(x, y)\,dx\,dy = \int_c^d G(y)\,dy$$
Here $G(y) = \int_a^b f(x, y)\,dx$ is the area under the surface above the line where y is constant, and $G(y)\,dy$ is the infinitesimal volume under the surface above that line, so $\int_c^d\!\int_a^b f(x, y)\,dx\,dy = \int_c^d G(y)\,dy$.

The same quantity can be calculated by integrating first with respect to y, then x:
$$\iint_A f(x, y)\,dx\,dy = \int_a^b\!\int_c^d f(x, y)\,dy\,dx$$
To evaluate
$$\int_a^b\!\left[\int_c^d f(x, y)\,dy\right] dx$$
first evaluate the inner integral
$$H(x) = \int_c^d f(x, y)\,dy$$
then evaluate the outer integral
$$\int_a^b\!\int_c^d f(x, y)\,dy\,dx = \int_a^b H(x)\,dx$$
Here $H(x) = \int_c^d f(x, y)\,dy$ is the area under the surface above the line where x is constant, and $H(x)\,dx$ is the infinitesimal volume under the surface above that line, so $\int_a^b\!\int_c^d f(x, y)\,dy\,dx = \int_a^b H(x)\,dx$.

Example: Compute
$$\int_0^1\!\int_0^1 \left(x^2 y + x y^3\right) dx\,dy = \int_0^1\!\int_0^1 \left(x^2 y + x y^3\right) dy\,dx$$
Now
$$\int_0^1\!\int_0^1 \left(x^2 y + x y^3\right) dx\,dy = \int_0^1\!\left[\int_0^1 \left(x^2 y + x y^3\right) dx\right] dy$$
$$= \int_0^1 \left[\frac{x^3}{3} y + \frac{x^2}{2} y^3\right]_{x=0}^{x=1} dy = \int_0^1 \left(\frac{y}{3} + \frac{y^3}{2}\right) dy$$
$$= \left[\frac{y^2}{6} + \frac{y^4}{8}\right]_{y=0}^{y=1} = \frac{1}{6} + \frac{1}{8} = \frac{7}{24}$$
The same quantity can be computed by reversing the order of integration:
$$\int_0^1\!\int_0^1 \left(x^2 y + x y^3\right) dy\,dx = \int_0^1\!\left[\int_0^1 \left(x^2 y + x y^3\right) dy\right] dx$$
$$= \int_0^1 \left[x^2 \frac{y^2}{2} + x \frac{y^4}{4}\right]_{y=0}^{y=1} dx = \int_0^1 \left(\frac{x^2}{2} + \frac{x}{4}\right) dx$$
$$= \left[\frac{x^3}{6} + \frac{x^2}{8}\right]_{x=0}^{x=1} = \frac{1}{6} + \frac{1}{8} = \frac{7}{24}$$
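For readers who want to verify the arithmetic, a symbolic check with SymPy (an addition to the slides) confirms that both orders of integration give 7/24:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 * y + x * y**3

# Integrate over the unit square in both orders
I_xy = sp.integrate(f, (x, 0, 1), (y, 0, 1))
I_yx = sp.integrate(f, (y, 0, 1), (x, 0, 1))
print(I_xy, I_yx)    # 7/24 7/24
```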
  
Integration over non-rectangular regions
Suppose the region A is defined as follows:
A = {(x, y) | a(y) ≤ x ≤ b(y), c ≤ y ≤ d}
Then
$$\iint_A f(x, y)\,dx\,dy = \int_c^d\!\left[\int_{a(y)}^{b(y)} f(x, y)\,dx\right] dy$$
If the region A is defined as follows:
A = {(x, y) | a ≤ x ≤ b, c(x) ≤ y ≤ d(x)}
then
$$\iint_A f(x, y)\,dx\,dy = \int_a^b\!\left[\int_{c(x)}^{d(x)} f(x, y)\,dy\right] dx$$
In general the region A can be partitioned into regions of either type.
(Figure: a region A partitioned into subregions A1, A2, A3, A4.)
Example:
Compute the volume under f(x,y) = x²y + xy³ over the region A = {(x, y) | x + y ≤ 1, 0 ≤ x, 0 ≤ y}, the triangle bounded by the coordinate axes and the line x + y = 1.

Integrating first with respect to x, then y (for fixed y, x runs from 0 to 1 − y):
$$\iint_A \left(x^2 y + x y^3\right) dx\,dy = \int_0^1\!\left[\int_0^{1-y} \left(x^2 y + x y^3\right) dx\right] dy$$
$$= \int_0^1 \left[\frac{x^3}{3} y + \frac{x^2}{2} y^3\right]_{x=0}^{x=1-y} dy = \int_0^1 \left[\frac{(1-y)^3}{3}\, y + \frac{(1-y)^2}{2}\, y^3\right] dy$$
$$= \int_0^1 \left[\frac{y - 3y^2 + 3y^3 - y^4}{3} + \frac{y^3 - 2y^4 + y^5}{2}\right] dy$$
$$= \frac{1}{3}\left(\frac{1}{2} - 1 + \frac{3}{4} - \frac{1}{5}\right) + \frac{1}{2}\left(\frac{1}{4} - \frac{2}{5} + \frac{1}{6}\right)$$
$$= \left(\frac{1}{6} - \frac{1}{3} + \frac{1}{4} - \frac{1}{15}\right) + \left(\frac{1}{8} - \frac{1}{5} + \frac{1}{12}\right) = \frac{20 - 40 + 30 - 8 + 15 - 24 + 10}{120} = \frac{3}{120} = \frac{1}{40}$$
Now integrating first with respect to y, then x (for fixed x, y runs from 0 to 1 − x):
$$\iint_A \left(x^2 y + x y^3\right) dy\,dx = \int_0^1\!\left[\int_0^{1-x} \left(x^2 y + x y^3\right) dy\right] dx$$
$$= \int_0^1 \left[x^2 \frac{y^2}{2} + x \frac{y^4}{4}\right]_{y=0}^{y=1-x} dx = \int_0^1 \left[\frac{x^2 (1-x)^2}{2} + \frac{x (1-x)^4}{4}\right] dx$$
$$= \int_0^1 \left[\frac{x^2 - 2x^3 + x^4}{2} + \frac{x - 4x^2 + 6x^3 - 4x^4 + x^5}{4}\right] dx = \int_0^1 \frac{x - 2x^2 + 2x^3 - 2x^4 + x^5}{4}\, dx$$
$$= \frac{1}{8} - \frac{1}{6} + \frac{1}{8} - \frac{1}{10} + \frac{1}{24} = \frac{15 - 20 + 15 - 12 + 5}{120} = \frac{3}{120} = \frac{1}{40}$$
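The same kind of symbolic check works for the non-rectangular region: with SymPy (an addition to the slides) the inner limit simply becomes 1 − y (or 1 − x), and both orders give 1/40:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 * y + x * y**3

# Triangular region x + y <= 1: the inner limit depends on the outer variable
I1 = sp.integrate(f, (x, 0, 1 - y), (y, 0, 1))   # x first, then y
I2 = sp.integrate(f, (y, 0, 1 - x), (x, 0, 1))   # y first, then x
print(I1, I2)    # 1/40 1/40
```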
   
      
Continuous Random Variables
Definition: Two random variables are said to have joint probability density function f(x,y) if
1. $f(x, y) \ge 0$
2. $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(x, y)\,dx\,dy = 1$
3. $P[(X, Y) \in A] = \iint_A f(x, y)\,dx\,dy$
Definition: Let X and Y denote two random variables with joint probability density function f(x,y). Then
the marginal density of X is
$$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy$$
and the marginal density of Y is
$$f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx$$
Definition: Let X and Y denote two random variables with joint probability density function f(x,y) and marginal densities fX(x), fY(y). Then
the conditional density of Y given X = x is
$$f_{Y|X}(y \mid x) = \frac{f(x, y)}{f_X(x)}$$
and the conditional density of X given Y = y is
$$f_{X|Y}(x \mid y) = \frac{f(x, y)}{f_Y(y)}$$
The bivariate Normal distribution
Let
$$Q(x_1, x_2) = \frac{1}{1 - \rho^2}\left[\left(\frac{x_1 - \mu_1}{\sigma_1}\right)^2 - 2\rho\left(\frac{x_1 - \mu_1}{\sigma_1}\right)\left(\frac{x_2 - \mu_2}{\sigma_2}\right) + \left(\frac{x_2 - \mu_2}{\sigma_2}\right)^2\right]$$
Then
$$f(x_1, x_2) = \frac{1}{2\pi \sigma_1 \sigma_2 \sqrt{1 - \rho^2}}\; e^{-\frac{1}{2} Q(x_1, x_2)}$$
This distribution is called the bivariate Normal distribution.
The parameters are μ1, μ2, σ1, σ2 and ρ.
Surface Plots of the bivariate
Normal distribution
Note:
$$f(x_1, x_2) = \frac{1}{2\pi \sigma_1 \sigma_2 \sqrt{1 - \rho^2}}\; e^{-\frac{1}{2} Q(x_1, x_2)}$$
is constant when
$$Q(x_1, x_2) = \frac{1}{1 - \rho^2}\left[\left(\frac{x_1 - \mu_1}{\sigma_1}\right)^2 - 2\rho\left(\frac{x_1 - \mu_1}{\sigma_1}\right)\left(\frac{x_2 - \mu_2}{\sigma_2}\right) + \left(\frac{x_2 - \mu_2}{\sigma_2}\right)^2\right]$$
is constant. This is true when (x1, x2) lie on an ellipse centered at (μ1, μ2).
Marginal and Conditional distributions

Marginal distributions for the Bivariate Normal distribution
Recall the definition of marginal distributions for continuous random variables:
$$f_1(x_1) = \int_{-\infty}^{\infty} f(x_1, x_2)\,dx_2 \qquad\text{and}\qquad f_2(x_2) = \int_{-\infty}^{\infty} f(x_1, x_2)\,dx_1$$
It can be shown that in the case of the bivariate normal distribution the marginal distribution of $x_i$ is Normal with mean $\mu_i$ and standard deviation $\sigma_i$.
The marginal distribution of x2 is
$$f_2(x_2) = \int_{-\infty}^{\infty} f(x_1, x_2)\,dx_1 = \int_{-\infty}^{\infty} \frac{1}{2\pi \sigma_1 \sigma_2 \sqrt{1 - \rho^2}}\; e^{-\frac{1}{2} Q(x_1, x_2)}\,dx_1$$
where
$$Q(x_1, x_2) = \frac{1}{1 - \rho^2}\left[\left(\frac{x_1 - \mu_1}{\sigma_1}\right)^2 - 2\rho\left(\frac{x_1 - \mu_1}{\sigma_1}\right)\left(\frac{x_2 - \mu_2}{\sigma_2}\right) + \left(\frac{x_2 - \mu_2}{\sigma_2}\right)^2\right]$$
Proof:
Write Q(x1, x2) as a quadratic in x1 and complete the square:
$$Q(x_1, x_2) = \frac{x_1^2}{b^2} - \frac{2 a x_1}{b^2} + \frac{a^2}{b^2} + c = \left(\frac{x_1 - a}{b}\right)^2 + c$$
Expanding Q in powers of x1,
$$Q(x_1, x_2) = \frac{x_1^2}{\sigma_1^2 (1 - \rho^2)} - 2 x_1 \frac{\dfrac{\mu_1}{\sigma_1^2} + \rho \dfrac{x_2 - \mu_2}{\sigma_1 \sigma_2}}{1 - \rho^2} + \frac{\dfrac{\mu_1^2}{\sigma_1^2} + 2\rho \dfrac{\mu_1 (x_2 - \mu_2)}{\sigma_1 \sigma_2} + \dfrac{(x_2 - \mu_2)^2}{\sigma_2^2}}{1 - \rho^2}$$
Hence, matching the coefficients of $x_1^2$,
$$\frac{1}{b^2} = \frac{1}{\sigma_1^2 (1 - \rho^2)} \qquad\text{or}\qquad b = \sigma_1 \sqrt{1 - \rho^2}$$
Also, matching the coefficients of $x_1$,
$$\frac{a}{b^2} = \frac{\dfrac{\mu_1}{\sigma_1^2} + \rho \dfrac{x_2 - \mu_2}{\sigma_1 \sigma_2}}{1 - \rho^2}$$
so that
$$a = b^2 \cdot \frac{\dfrac{\mu_1}{\sigma_1^2} + \rho \dfrac{x_2 - \mu_2}{\sigma_1 \sigma_2}}{1 - \rho^2} = \mu_1 + \rho \frac{\sigma_1}{\sigma_2}(x_2 - \mu_2)$$
Finally, matching the constant terms,
$$c = \frac{\dfrac{\mu_1^2}{\sigma_1^2} + 2\rho \dfrac{\mu_1 (x_2 - \mu_2)}{\sigma_1 \sigma_2} + \dfrac{(x_2 - \mu_2)^2}{\sigma_2^2}}{1 - \rho^2} - \frac{a^2}{b^2}$$
Substituting $a = \mu_1 + \rho \frac{\sigma_1}{\sigma_2}(x_2 - \mu_2)$ and $b^2 = \sigma_1^2 (1 - \rho^2)$, the terms involving $\mu_1$ cancel and
$$c = \frac{(1 - \rho^2)\, \dfrac{(x_2 - \mu_2)^2}{\sigma_2^2}}{1 - \rho^2} = \left(\frac{x_2 - \mu_2}{\sigma_2}\right)^2$$
Summarizing,
$$Q(x_1, x_2) = \left(\frac{x_1 - a}{b}\right)^2 + c$$
where
$$b = \sigma_1 \sqrt{1 - \rho^2}, \qquad a = \mu_1 + \rho \frac{\sigma_1}{\sigma_2}(x_2 - \mu_2) \qquad\text{and}\qquad c = \left(\frac{x_2 - \mu_2}{\sigma_2}\right)^2$$
Thus
$$f_2(x_2) = \int_{-\infty}^{\infty} f(x_1, x_2)\,dx_1 = \int_{-\infty}^{\infty} \frac{1}{2\pi \sigma_1 \sigma_2 \sqrt{1 - \rho^2}}\; e^{-\frac{1}{2} Q(x_1, x_2)}\,dx_1$$
$$= \int_{-\infty}^{\infty} \frac{1}{2\pi \sigma_1 \sigma_2 \sqrt{1 - \rho^2}}\; e^{-\frac{1}{2}\left[\left(\frac{x_1 - a}{b}\right)^2 + c\right]}\,dx_1$$
$$= \frac{b\, e^{-\frac{1}{2} c}}{\sqrt{2\pi}\, \sigma_1 \sigma_2 \sqrt{1 - \rho^2}} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\, b}\; e^{-\frac{(x_1 - a)^2}{2 b^2}}\,dx_1 = \frac{1}{\sqrt{2\pi}\, \sigma_2}\; e^{-\frac{1}{2}\left(\frac{x_2 - \mu_2}{\sigma_2}\right)^2}$$
since the remaining integrand is the Normal(a, b²) density (which integrates to 1), $b = \sigma_1 \sqrt{1 - \rho^2}$ and $c = \left(\frac{x_2 - \mu_2}{\sigma_2}\right)^2$.

Thus the marginal distribution of x2 is Normal with mean μ2 and standard deviation σ2. Similarly the marginal distribution of x1 is Normal with mean μ1 and standard deviation σ1.

Conditional distributions for the Bivariate Normal distribution
Recall the definition of conditional distributions for continuous random variables:
$$f_{1|2}(x_1 \mid x_2) = \frac{f(x_1, x_2)}{f_2(x_2)} \qquad\text{and}\qquad f_{2|1}(x_2 \mid x_1) = \frac{f(x_1, x_2)}{f_1(x_1)}$$
It can be shown that in the case of the bivariate normal distribution the conditional distribution of $x_i$ given $x_j$ is Normal with
mean $\ \mu_{i|j} = \mu_i + \rho \dfrac{\sigma_i}{\sigma_j}(x_j - \mu_j)\ $ and standard deviation $\ \sigma_{i|j} = \sigma_i \sqrt{1 - \rho^2}$.
Proof:
$$f_{1|2}(x_1 \mid x_2) = \frac{f(x_1, x_2)}{f_2(x_2)} = \frac{\dfrac{1}{2\pi \sigma_1 \sigma_2 \sqrt{1 - \rho^2}}\; e^{-\frac{1}{2} Q(x_1, x_2)}}{\dfrac{1}{\sqrt{2\pi}\, \sigma_2}\; e^{-\frac{1}{2}\left(\frac{x_2 - \mu_2}{\sigma_2}\right)^2}}$$
Writing $Q(x_1, x_2) = \left(\frac{x_1 - a}{b}\right)^2 + c$ with $c = \left(\frac{x_2 - \mu_2}{\sigma_2}\right)^2$, the factors involving x2 cancel and
$$f_{1|2}(x_1 \mid x_2) = \frac{1}{\sqrt{2\pi}\, b}\; e^{-\frac{(x_1 - a)^2}{2 b^2}}$$
where $b = \sigma_1 \sqrt{1 - \rho^2}$ and $a = \mu_1 + \rho \frac{\sigma_1}{\sigma_2}(x_2 - \mu_2)$.

Hence the conditional distribution of x1 given x2 is Normal with
mean $\ \mu_{1|2} = a = \mu_1 + \rho \dfrac{\sigma_1}{\sigma_2}(x_2 - \mu_2)\ $ and standard deviation $\ \sigma_{1|2} = b = \sigma_1 \sqrt{1 - \rho^2}$.
Bivariate Normal Distribution with marginal distributions (figure)
Bivariate Normal Distribution with conditional distribution (figure: the regression line
$$\mu_{2|1} = \mu_2 + \rho \frac{\sigma_2}{\sigma_1}(x_1 - \mu_1)$$
passes through (μ1, μ2) and illustrates regression to the mean; the major axis of the ellipses of constant density is also marked.)
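An empirical check of the conditional mean and standard deviation (an addition, not part of the slides; the parameter values below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
mu1, mu2, s1, s2, rho = 1.0, -2.0, 2.0, 0.5, 0.7    # arbitrary parameter values
cov = [[s1**2, rho * s1 * s2], [rho * s1 * s2, s2**2]]
x1, x2 = rng.multivariate_normal([mu1, mu2], cov, size=500_000).T

# Keep the draws with x2 close to a fixed value and compare with the formulas
x2_0 = -1.5
sel = np.abs(x2 - x2_0) < 0.01
print(x1[sel].mean(), mu1 + rho * s1 / s2 * (x2_0 - mu2))   # conditional mean, ~2.4
print(x1[sel].std(),  s1 * np.sqrt(1 - rho**2))             # conditional sd,  ~1.43
```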
Example:
Suppose that a rectangle is constructed by first choosing its length, X, and then choosing its width, Y. Its length X is selected from an exponential distribution with mean μ = 1/λ = 5. Once the length has been chosen, its width, Y, is selected from a uniform distribution from 0 to half its length.
Find the probability that its area A = XY is less than 4.
Solution:
$$f(x, y) = f_X(x)\, f_{Y|X}(y \mid x)$$
$$f_X(x) = \tfrac{1}{5}\, e^{-x/5} \quad\text{for } x \ge 0$$
$$f_{Y|X}(y \mid x) = \frac{1}{x/2} = \frac{2}{x} \quad\text{if } 0 \le y \le \tfrac{x}{2}$$
Hence
$$f(x, y) = \tfrac{1}{5}\, e^{-x/5}\, \frac{2}{x} = \frac{2}{5x}\, e^{-x/5} \quad\text{if } 0 \le y \le \tfrac{x}{2},\ 0 \le x$$
The boundary curve xy = 4 meets the line y = x/2 where $\frac{x}{2}\, x = 4$, i.e. $x^2 = 8$ or $x = \sqrt{8} = 2\sqrt{2}$, and then $y = x/2 = \sqrt{2}$; the curves intersect at the point $(2\sqrt{2}, \sqrt{2})$.

$$P[XY \le 4] = \int_0^{2\sqrt{2}}\!\int_0^{x/2} f(x, y)\,dy\,dx + \int_{2\sqrt{2}}^{\infty}\!\int_0^{4/x} f(x, y)\,dy\,dx$$
$$= \int_0^{2\sqrt{2}}\!\int_0^{x/2} \frac{2}{5x}\, e^{-x/5}\,dy\,dx + \int_{2\sqrt{2}}^{\infty}\!\int_0^{4/x} \frac{2}{5x}\, e^{-x/5}\,dy\,dx$$
$$= \int_0^{2\sqrt{2}} \frac{2}{5x}\, e^{-x/5}\, \frac{x}{2}\,dx + \int_{2\sqrt{2}}^{\infty} \frac{2}{5x}\, e^{-x/5}\, \frac{4}{x}\,dx$$
$$= \int_0^{2\sqrt{2}} \tfrac{1}{5}\, e^{-x/5}\,dx + \int_{2\sqrt{2}}^{\infty} \frac{8}{5 x^2}\, e^{-x/5}\,dx$$
The first integral can be evaluated in closed form; the second may require numerical evaluation.
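A numerical evaluation of these two pieces (an addition to the slides) using scipy.integrate.quad; the first piece is available in closed form as $1 - e^{-2\sqrt{2}/5}$:

```python
import numpy as np
from scipy.integrate import quad

a = 2 * np.sqrt(2)

# First piece: (1/5) e^{-x/5} from 0 to 2*sqrt(2); closed form 1 - e^{-a/5}
part1 = 1 - np.exp(-a / 5)

# Second piece: (8 / (5 x^2)) e^{-x/5} from 2*sqrt(2) to infinity, done numerically
part2, _ = quad(lambda x: 8 / (5 * x**2) * np.exp(-x / 5), a, np.inf)

print(part1 + part2)    # P[XY <= 4]
```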
Multivariate distributions (k ≥ 2)

Definition
Let X1, X2, …, Xk denote k discrete random variables. Then p(x1, x2, …, xk) is the joint probability function of X1, X2, …, Xk if
1. $0 \le p(x_1, \ldots, x_k) \le 1$
2. $\sum_{x_1} \cdots \sum_{x_k} p(x_1, \ldots, x_k) = 1$
3. $P[(X_1, \ldots, X_k) \in A] = \sum_{(x_1, \ldots, x_k) \in A} p(x_1, \ldots, x_k)$
Definition
Let X1, X2, …, Xk denote k continuous random variables. Then f(x1, x2, …, xk) is the joint density function of X1, X2, …, Xk if
1. $f(x_1, \ldots, x_k) \ge 0$
2. $\int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} f(x_1, \ldots, x_k)\,dx_1 \cdots dx_k = 1$
3. $P[(X_1, \ldots, X_k) \in A] = \int \cdots \int_A f(x_1, \ldots, x_k)\,dx_1 \cdots dx_k$
Example: The Multinomial distribution
Suppose that we observe an experiment that has k
possible outcomes {O1, O2, …, Ok } independently n
times.
Let p1, p2, …, pk denote probabilities of O1, O2, …,
Ok respectively.
Let Xi denote the number of times that outcome Oi
occurs in the n repetitions of the experiment.
Then the joint probability function of the random variables X1, X2, …, Xk is
$$p(x_1, x_2, \ldots, x_k) = \frac{n!}{x_1!\, x_2! \cdots x_k!}\; p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k}$$
Note:
$$p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k}$$
is the probability of a particular sequence of length n containing x1 outcomes O1, x2 outcomes O2, …, xk outcomes Ok, and
$$\frac{n!}{x_1!\, x_2! \cdots x_k!} = \binom{n}{x_1}\binom{n - x_1}{x_2}\binom{n - x_1 - x_2}{x_3} \cdots \binom{x_k}{x_k}$$
is the number of ways of choosing the positions for the x1 outcomes O1, x2 outcomes O2, …, xk outcomes Ok, since
$$\binom{n}{x_1}\binom{n - x_1}{x_2}\binom{n - x_1 - x_2}{x_3} \cdots = \frac{n!}{x_1!\,(n - x_1)!}\cdot\frac{(n - x_1)!}{x_2!\,(n - x_1 - x_2)!}\cdot\frac{(n - x_1 - x_2)!}{x_3!\,(n - x_1 - x_2 - x_3)!} \cdots = \frac{n!}{x_1!\, x_2! \cdots x_k!}$$
The joint probability function
$$p(x_1, x_2, \ldots, x_k) = \frac{n!}{x_1!\, x_2! \cdots x_k!}\; p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k} = \binom{n}{x_1\; x_2\; \cdots\; x_k}\; p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k}$$
is called the Multinomial distribution.
Example:
Suppose that a treatment for back pain has three possible
outcomes:
O1 - Complete cure (no pain) – (30% chance)
O2 - Reduced pain – (50% chance)
O3 - No change – (20% chance)
Hence p1 = 0.30, p2 = 0.50, p3 = 0.20.
Suppose the treatment is applied to n = 4 patients suffering back pain and let X = the number that result in a complete cure, Y = the number that result in just reduced pain, and Z = the number that result in no change.
Find the distribution of X, Y and Z. Compute P[X + Y ≥ Z].
$$p(x, y, z) = \frac{4!}{x!\, y!\, z!}\,(0.30)^x (0.50)^y (0.20)^z \quad\text{for } x + y + z = 4$$
Table: p(x,y,z)
z
x y 0 1 2 3 4
0 0 0 0 0 0 0.0016
0 1 0 0 0 0.0160 0
0 2 0 0 0.0600 0 0
0 3 0 0.1000 0 0 0
0 4 0.0625 0 0 0 0
1 0 0 0 0 0.0096 0
1 1 0 0 0.0720 0 0
1 2 0 0.1800 0 0 0
1 3 0.1500 0 0 0 0
1 4 0 0 0 0 0
2 0 0 0 0.0216 0 0
2 1 0 0.1080 0 0 0
2 2 0.1350 0 0 0 0
2 3 0 0 0 0 0
2 4 0 0 0 0 0
3 0 0 0.0216 0 0 0
3 1 0.0540 0 0 0 0
3 2 0 0 0 0 0
3 3 0 0 0 0 0
3 4 0 0 0 0 0
4 0 0.0081 0 0 0 0
4 1 0 0 0 0 0
4 2 0 0 0 0 0
4 3 0 0 0 0 0
4 4 0 0 0 0 0
P[X + Y ≥ Z]
Since x + y + z = 4, the event X + Y ≥ Z fails only for the cells (x, y, z) = (0, 0, 4), (0, 1, 3) and (1, 0, 3) in the table above, so
P[X + Y ≥ Z] = 1 − 0.0016 − 0.0160 − 0.0096 = 0.9728
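The whole table, and the probability just computed, can be reproduced with scipy.stats.multinomial (an illustrative addition, not part of the slides):

```python
from scipy.stats import multinomial

dist = multinomial(n=4, p=[0.30, 0.50, 0.20])

# A couple of entries from the table
print(round(dist.pmf([1, 2, 1]), 4))    # 0.1800
print(round(dist.pmf([0, 4, 0]), 4))    # 0.0625

# P[X + Y >= Z]: sum over all (x, y, z) with x + y + z = 4 and x + y >= z
prob = sum(dist.pmf([x, y, 4 - x - y])
           for x in range(5) for y in range(5 - x)
           if x + y >= 4 - x - y)
print(round(prob, 4))                   # 0.9728
```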
Example: The Multivariate Normal distribution
Recall the univariate normal distribution
$$f(x) = \frac{1}{\sqrt{2\pi}\, \sigma}\; e^{-\frac{1}{2}\left(\frac{x - \mu}{\sigma}\right)^2}$$
and the bivariate normal distribution
$$f(x, y) = \frac{1}{2\pi \sigma_x \sigma_y \sqrt{1 - \rho^2}}\; e^{-\frac{1}{2(1 - \rho^2)}\left[\left(\frac{x - \mu_x}{\sigma_x}\right)^2 - 2\rho\left(\frac{x - \mu_x}{\sigma_x}\right)\left(\frac{y - \mu_y}{\sigma_y}\right) + \left(\frac{y - \mu_y}{\sigma_y}\right)^2\right]}$$
The k-variate Normal distribution
$$f(x_1, \ldots, x_k) = f(\mathbf{x}) = \frac{1}{(2\pi)^{k/2}\, |\Sigma|^{1/2}}\; e^{-\frac{1}{2}(\mathbf{x} - \boldsymbol{\mu})' \Sigma^{-1} (\mathbf{x} - \boldsymbol{\mu})}$$
where
$$\mathbf{x} = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_k \end{pmatrix}, \qquad \boldsymbol{\mu} = \begin{pmatrix} \mu_1 \\ \mu_2 \\ \vdots \\ \mu_k \end{pmatrix}, \qquad \Sigma = \begin{pmatrix} \sigma_{11} & \sigma_{12} & \cdots & \sigma_{1k} \\ \sigma_{12} & \sigma_{22} & \cdots & \sigma_{2k} \\ \vdots & \vdots & & \vdots \\ \sigma_{1k} & \sigma_{2k} & \cdots & \sigma_{kk} \end{pmatrix}$$
Marginal distributions

Definition
Let X1, X2, …, Xq, Xq+1, …, Xk denote k discrete random variables with joint probability function p(x1, x2, …, xq, xq+1, …, xk). Then the marginal joint probability function of X1, X2, …, Xq is
$$p_{12 \cdots q}(x_1, \ldots, x_q) = \sum_{x_{q+1}} \cdots \sum_{x_k} p(x_1, \ldots, x_k)$$

Definition
Let X1, X2, …, Xq, Xq+1, …, Xk denote k continuous random variables with joint probability density function f(x1, x2, …, xq, xq+1, …, xk). Then the marginal joint density function of X1, X2, …, Xq is
$$f_{12 \cdots q}(x_1, \ldots, x_q) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} f(x_1, \ldots, x_k)\,dx_{q+1} \cdots dx_k$$
Conditional distributions

Definition
Let X1, X2, …, Xq, Xq+1, …, Xk denote k discrete random variables with joint probability function p(x1, x2, …, xq, xq+1, …, xk). Then the conditional joint probability function of X1, X2, …, Xq given Xq+1 = xq+1, …, Xk = xk is
$$p_{1 \cdots q \mid q+1 \cdots k}(x_1, \ldots, x_q \mid x_{q+1}, \ldots, x_k) = \frac{p(x_1, \ldots, x_k)}{p_{q+1 \cdots k}(x_{q+1}, \ldots, x_k)}$$

Definition
Let X1, X2, …, Xq, Xq+1, …, Xk denote k continuous random variables with joint probability density function f(x1, x2, …, xq, xq+1, …, xk). Then the conditional joint density function of X1, X2, …, Xq given Xq+1 = xq+1, …, Xk = xk is
$$f_{1 \cdots q \mid q+1 \cdots k}(x_1, \ldots, x_q \mid x_{q+1}, \ldots, x_k) = \frac{f(x_1, \ldots, x_k)}{f_{q+1 \cdots k}(x_{q+1}, \ldots, x_k)}$$
Definition – Independence of sets of vectors
Let X1, X2, …, Xq, Xq+1, …, Xk denote k continuous random variables with joint probability density function f(x1, x2, …, xq, xq+1, …, xk). Then the variables X1, X2, …, Xq are independent of Xq+1, …, Xk if
$$f(x_1, \ldots, x_k) = f_{1 \cdots q}(x_1, \ldots, x_q)\; f_{q+1 \cdots k}(x_{q+1}, \ldots, x_k)$$
A similar definition holds for discrete random variables.

Definition – Mutual Independence
Let X1, X2, …, Xk denote k continuous random variables with joint probability density function f(x1, x2, …, xk). Then the variables X1, X2, …, Xk are called mutually independent if
$$f(x_1, \ldots, x_k) = f_1(x_1)\, f_2(x_2) \cdots f_k(x_k)$$
A similar definition holds for discrete random variables.
Example
Let X, Y, Z denote 3 jointly distributed random variables with joint density function
$$f(x, y, z) = \begin{cases} K(x^2 + yz) & 0 \le x \le 1,\ 0 \le y \le 1,\ 0 \le z \le 1 \\ 0 & \text{otherwise} \end{cases}$$
Find the value of K.
Determine the marginal distributions of X, Y and Z.
Determine the joint marginal distributions of X, Y; X, Z; and Y, Z.
Solution
Determining the value of K:
$$1 = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} f(x, y, z)\,dx\,dy\,dz = K\int_0^1\!\int_0^1\!\int_0^1 \left(x^2 + yz\right) dx\,dy\,dz$$
$$= K\int_0^1\!\int_0^1 \left[\frac{x^3}{3} + xyz\right]_{x=0}^{x=1} dy\,dz = K\int_0^1\!\int_0^1 \left(\frac{1}{3} + yz\right) dy\,dz$$
$$= K\int_0^1 \left[\frac{y}{3} + \frac{y^2}{2} z\right]_{y=0}^{y=1} dz = K\int_0^1 \left(\frac{1}{3} + \frac{z}{2}\right) dz$$
$$= K\left[\frac{z}{3} + \frac{z^2}{4}\right]_{z=0}^{z=1} = K\left(\frac{1}{3} + \frac{1}{4}\right) = \frac{7}{12}\, K$$
Thus K = 12/7.

The marginal distribution of X:
$$f_1(x) = \int_0^1\!\int_0^1 f(x, y, z)\,dy\,dz = \frac{12}{7}\int_0^1\!\int_0^1 \left(x^2 + yz\right) dy\,dz$$
$$= \frac{12}{7}\int_0^1 \left[x^2 y + \frac{y^2}{2} z\right]_{y=0}^{y=1} dz = \frac{12}{7}\int_0^1 \left(x^2 + \frac{z}{2}\right) dz$$
$$= \frac{12}{7}\left[x^2 z + \frac{z^2}{4}\right]_{z=0}^{z=1} = \frac{12}{7}\left(x^2 + \frac{1}{4}\right) \quad\text{for } 0 \le x \le 1$$

The marginal distribution of X, Y:
$$f_{12}(x, y) = \int_0^1 f(x, y, z)\,dz = \frac{12}{7}\int_0^1 \left(x^2 + yz\right) dz$$
$$= \frac{12}{7}\left[x^2 z + y\frac{z^2}{2}\right]_{z=0}^{z=1} = \frac{12}{7}\left(x^2 + \frac{y}{2}\right) \quad\text{for } 0 \le x \le 1,\ 0 \le y \le 1$$
Find the conditional distribution of:
1. Z given X = x, Y = y,
2. Y given X = x, Z = z,
3. X given Y = y, Z = z,
4. Y , Z given X = x,
5. X , Z given Y = y
6. X , Y given Z = z
7. Y given X = x,
8. X given Y = y
9. X given Z = z
10. Z given X = x,
11. Z given Y = y
12. Y given Z = z
The marginal distribution of X, Y:
$$f_{12}(x, y) = \frac{12}{7}\left(x^2 + \frac{y}{2}\right) \quad\text{for } 0 \le x \le 1,\ 0 \le y \le 1$$
Thus the conditional distribution of Z given X = x, Y = y is
$$f_{3|12}(z \mid x, y) = \frac{f(x, y, z)}{f_{12}(x, y)} = \frac{\frac{12}{7}\left(x^2 + yz\right)}{\frac{12}{7}\left(x^2 + \frac{y}{2}\right)} = \frac{x^2 + yz}{x^2 + \frac{y}{2}} \quad\text{for } 0 \le z \le 1$$
The marginal distribution of X:
$$f_1(x) = \frac{12}{7}\left(x^2 + \frac{1}{4}\right) \quad\text{for } 0 \le x \le 1$$
Thus the conditional distribution of Y, Z given X = x is
$$f_{23|1}(y, z \mid x) = \frac{f(x, y, z)}{f_1(x)} = \frac{\frac{12}{7}\left(x^2 + yz\right)}{\frac{12}{7}\left(x^2 + \frac{1}{4}\right)} = \frac{x^2 + yz}{x^2 + \frac{1}{4}} \quad\text{for } 0 \le y \le 1,\ 0 \le z \le 1$$
More Related Content

What's hot

Methods of solving ODE
Methods of solving ODEMethods of solving ODE
Methods of solving ODEkishor pokar
 
laplace transform and inverse laplace, properties, Inverse Laplace Calculatio...
laplace transform and inverse laplace, properties, Inverse Laplace Calculatio...laplace transform and inverse laplace, properties, Inverse Laplace Calculatio...
laplace transform and inverse laplace, properties, Inverse Laplace Calculatio...Waqas Afzal
 
Lesson 1 derivative of trigonometric functions
Lesson 1 derivative of trigonometric functionsLesson 1 derivative of trigonometric functions
Lesson 1 derivative of trigonometric functionsLawrence De Vera
 
Beta & Gamma Functions
Beta & Gamma FunctionsBeta & Gamma Functions
Beta & Gamma FunctionsDrDeepaChauhan
 
Solved exercises line integral
Solved exercises line integralSolved exercises line integral
Solved exercises line integralKamel Attar
 
Solution of non-linear equations
Solution of non-linear equationsSolution of non-linear equations
Solution of non-linear equationsZunAib Ali
 
First order linear differential equation
First order linear differential equationFirst order linear differential equation
First order linear differential equationNofal Umair
 
Limits And Derivative
Limits And DerivativeLimits And Derivative
Limits And DerivativeAshams kurian
 
Lesson 7: Vector-valued functions
Lesson 7: Vector-valued functionsLesson 7: Vector-valued functions
Lesson 7: Vector-valued functionsMatthew Leingang
 
Complex analysis
Complex analysisComplex analysis
Complex analysissujathavvv
 
INTEGRATION BY PARTS PPT
INTEGRATION BY PARTS PPT INTEGRATION BY PARTS PPT
INTEGRATION BY PARTS PPT 03062679929
 
Cubic Spline Interpolation
Cubic Spline InterpolationCubic Spline Interpolation
Cubic Spline InterpolationVARUN KUMAR
 
Calculus of variations
Calculus of variationsCalculus of variations
Calculus of variationsSolo Hermelin
 
8 arc length and area of surfaces x
8 arc length and area of surfaces x8 arc length and area of surfaces x
8 arc length and area of surfaces xmath266
 
Differential equations
Differential equationsDifferential equations
Differential equationsCharan Kumar
 
Dobule and triple integral
Dobule and triple integralDobule and triple integral
Dobule and triple integralsonendra Gupta
 

What's hot (20)

Methods of solving ODE
Methods of solving ODEMethods of solving ODE
Methods of solving ODE
 
laplace transform and inverse laplace, properties, Inverse Laplace Calculatio...
laplace transform and inverse laplace, properties, Inverse Laplace Calculatio...laplace transform and inverse laplace, properties, Inverse Laplace Calculatio...
laplace transform and inverse laplace, properties, Inverse Laplace Calculatio...
 
Lesson 1 derivative of trigonometric functions
Lesson 1 derivative of trigonometric functionsLesson 1 derivative of trigonometric functions
Lesson 1 derivative of trigonometric functions
 
Analytic function
Analytic functionAnalytic function
Analytic function
 
Ordinary differential equation
Ordinary differential equationOrdinary differential equation
Ordinary differential equation
 
Beta & Gamma Functions
Beta & Gamma FunctionsBeta & Gamma Functions
Beta & Gamma Functions
 
Solved exercises line integral
Solved exercises line integralSolved exercises line integral
Solved exercises line integral
 
Solution of non-linear equations
Solution of non-linear equationsSolution of non-linear equations
Solution of non-linear equations
 
Introduction to differential equation
Introduction to differential equationIntroduction to differential equation
Introduction to differential equation
 
First order linear differential equation
First order linear differential equationFirst order linear differential equation
First order linear differential equation
 
Limits And Derivative
Limits And DerivativeLimits And Derivative
Limits And Derivative
 
Lesson 7: Vector-valued functions
Lesson 7: Vector-valued functionsLesson 7: Vector-valued functions
Lesson 7: Vector-valued functions
 
Second Order Derivative | Mathematics
Second Order Derivative | MathematicsSecond Order Derivative | Mathematics
Second Order Derivative | Mathematics
 
Complex analysis
Complex analysisComplex analysis
Complex analysis
 
INTEGRATION BY PARTS PPT
INTEGRATION BY PARTS PPT INTEGRATION BY PARTS PPT
INTEGRATION BY PARTS PPT
 
Cubic Spline Interpolation
Cubic Spline InterpolationCubic Spline Interpolation
Cubic Spline Interpolation
 
Calculus of variations
Calculus of variationsCalculus of variations
Calculus of variations
 
8 arc length and area of surfaces x
8 arc length and area of surfaces x8 arc length and area of surfaces x
8 arc length and area of surfaces x
 
Differential equations
Differential equationsDifferential equations
Differential equations
 
Dobule and triple integral
Dobule and triple integralDobule and triple integral
Dobule and triple integral
 

Similar to Jointly distributed random variables and their properties

Newton divided difference interpolation
Newton divided difference interpolationNewton divided difference interpolation
Newton divided difference interpolationVISHAL DONGA
 
Qt random variables notes
Qt random variables notesQt random variables notes
Qt random variables notesRohan Bhatkar
 
Conditional Expectations Liner algebra
Conditional Expectations Liner algebra Conditional Expectations Liner algebra
Conditional Expectations Liner algebra AZIZ ULLAH SURANI
 
Properties of bivariate and conditional Gaussian PDFs
Properties of bivariate and conditional Gaussian PDFsProperties of bivariate and conditional Gaussian PDFs
Properties of bivariate and conditional Gaussian PDFsAhmad Gomaa
 
Upload
UploadUpload
Uploadvokenn
 
15 Probability Distribution Practical (HSC).pdf
15 Probability Distribution Practical (HSC).pdf15 Probability Distribution Practical (HSC).pdf
15 Probability Distribution Practical (HSC).pdfvedantsk1
 
138191 rvsp lecture notes
138191 rvsp lecture notes138191 rvsp lecture notes
138191 rvsp lecture notesAhmed Tayeh
 
Quantitative Techniques random variables
Quantitative Techniques random variablesQuantitative Techniques random variables
Quantitative Techniques random variablesRohan Bhatkar
 
Bivariate Discrete Distribution
Bivariate Discrete DistributionBivariate Discrete Distribution
Bivariate Discrete DistributionArijitDhali
 
Partial differentiation B tech
Partial differentiation B techPartial differentiation B tech
Partial differentiation B techRaj verma
 
Expectation of Discrete Random Variable.ppt
Expectation of Discrete Random Variable.pptExpectation of Discrete Random Variable.ppt
Expectation of Discrete Random Variable.pptAlyasarJabbarli
 

Similar to Jointly distributed random variables and their properties (20)

L6.slides.pdf
L6.slides.pdfL6.slides.pdf
L6.slides.pdf
 
L6.slides.pdf
L6.slides.pdfL6.slides.pdf
L6.slides.pdf
 
Newton divided difference interpolation
Newton divided difference interpolationNewton divided difference interpolation
Newton divided difference interpolation
 
Qt random variables notes
Qt random variables notesQt random variables notes
Qt random variables notes
 
Estadistica U4
Estadistica U4Estadistica U4
Estadistica U4
 
Chapter 4-1 (1).pptx
Chapter 4-1 (1).pptxChapter 4-1 (1).pptx
Chapter 4-1 (1).pptx
 
Conditional Expectations Liner algebra
Conditional Expectations Liner algebra Conditional Expectations Liner algebra
Conditional Expectations Liner algebra
 
Properties of bivariate and conditional Gaussian PDFs
Properties of bivariate and conditional Gaussian PDFsProperties of bivariate and conditional Gaussian PDFs
Properties of bivariate and conditional Gaussian PDFs
 
Upload
UploadUpload
Upload
 
15 Probability Distribution Practical (HSC).pdf
15 Probability Distribution Practical (HSC).pdf15 Probability Distribution Practical (HSC).pdf
15 Probability Distribution Practical (HSC).pdf
 
138191 rvsp lecture notes
138191 rvsp lecture notes138191 rvsp lecture notes
138191 rvsp lecture notes
 
Quantitative Techniques random variables
Quantitative Techniques random variablesQuantitative Techniques random variables
Quantitative Techniques random variables
 
Bivariate Discrete Distribution
Bivariate Discrete DistributionBivariate Discrete Distribution
Bivariate Discrete Distribution
 
Prob review
Prob reviewProb review
Prob review
 
Linear equations 2-2 a graphing and x-y intercepts
Linear equations   2-2 a graphing and x-y interceptsLinear equations   2-2 a graphing and x-y intercepts
Linear equations 2-2 a graphing and x-y intercepts
 
Unit II PPT.pptx
Unit II PPT.pptxUnit II PPT.pptx
Unit II PPT.pptx
 
boolean.pdf
boolean.pdfboolean.pdf
boolean.pdf
 
Partial differentiation B tech
Partial differentiation B techPartial differentiation B tech
Partial differentiation B tech
 
Digital logic circuits
Digital  logic  circuitsDigital  logic  circuits
Digital logic circuits
 
Expectation of Discrete Random Variable.ppt
Expectation of Discrete Random Variable.pptExpectation of Discrete Random Variable.ppt
Expectation of Discrete Random Variable.ppt
 

More from RohitKumar639388

WHAT IS BUSINESS ANALYTICS um hj mnjh nit 1 ppt only kjjn
WHAT IS BUSINESS ANALYTICS um hj mnjh nit 1 ppt only kjjnWHAT IS BUSINESS ANALYTICS um hj mnjh nit 1 ppt only kjjn
WHAT IS BUSINESS ANALYTICS um hj mnjh nit 1 ppt only kjjnRohitKumar639388
 
Microstrip patch antenna in hfss Anyss presentation PPT for college final year
Microstrip patch antenna in hfss Anyss presentation PPT for college final yearMicrostrip patch antenna in hfss Anyss presentation PPT for college final year
Microstrip patch antenna in hfss Anyss presentation PPT for college final yearRohitKumar639388
 
Cost management forece md engineerrs .ppt
Cost management forece md  engineerrs .pptCost management forece md  engineerrs .ppt
Cost management forece md engineerrs .pptRohitKumar639388
 
pythontraining-201jn026043638.pptx
pythontraining-201jn026043638.pptxpythontraining-201jn026043638.pptx
pythontraining-201jn026043638.pptxRohitKumar639388
 

More from RohitKumar639388 (8)

WHAT IS BUSINESS ANALYTICS um hj mnjh nit 1 ppt only kjjn
WHAT IS BUSINESS ANALYTICS um hj mnjh nit 1 ppt only kjjnWHAT IS BUSINESS ANALYTICS um hj mnjh nit 1 ppt only kjjn
WHAT IS BUSINESS ANALYTICS um hj mnjh nit 1 ppt only kjjn
 
Microstrip patch antenna in hfss Anyss presentation PPT for college final year
Microstrip patch antenna in hfss Anyss presentation PPT for college final yearMicrostrip patch antenna in hfss Anyss presentation PPT for college final year
Microstrip patch antenna in hfss Anyss presentation PPT for college final year
 
Cost management forece md engineerrs .ppt
Cost management forece md  engineerrs .pptCost management forece md  engineerrs .ppt
Cost management forece md engineerrs .ppt
 
BITTU PPT.pptx
BITTU PPT.pptxBITTU PPT.pptx
BITTU PPT.pptx
 
Lec12-Probability.ppt
Lec12-Probability.pptLec12-Probability.ppt
Lec12-Probability.ppt
 
lectr10a.ppt
lectr10a.pptlectr10a.ppt
lectr10a.ppt
 
pythontraining-201jn026043638.pptx
pythontraining-201jn026043638.pptxpythontraining-201jn026043638.pptx
pythontraining-201jn026043638.pptx
 
internsala c-and-c.pptx
internsala c-and-c.pptxinternsala c-and-c.pptx
internsala c-and-c.pptx
 

Recently uploaded

Arihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfArihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfchloefrazer622
 
General AI for Medical Educators April 2024
General AI for Medical Educators April 2024General AI for Medical Educators April 2024
General AI for Medical Educators April 2024Janet Corral
 
Introduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The BasicsIntroduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The BasicsTechSoup
 
The basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxThe basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxheathfieldcps1
 
BAG TECHNIQUE Bag technique-a tool making use of public health bag through wh...
BAG TECHNIQUE Bag technique-a tool making use of public health bag through wh...BAG TECHNIQUE Bag technique-a tool making use of public health bag through wh...
BAG TECHNIQUE Bag technique-a tool making use of public health bag through wh...Sapna Thakur
 
Disha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdfDisha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdfchloefrazer622
 
Unit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptxUnit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptxVishalSingh1417
 
social pharmacy d-pharm 1st year by Pragati K. Mahajan
social pharmacy d-pharm 1st year by Pragati K. Mahajansocial pharmacy d-pharm 1st year by Pragati K. Mahajan
social pharmacy d-pharm 1st year by Pragati K. Mahajanpragatimahajan3
 
1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdfQucHHunhnh
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introductionMaksud Ahmed
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfsanyamsingh5019
 
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...fonyou31
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...EduSkills OECD
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactdawncurless
 
Web & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfWeb & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfJayanti Pande
 
fourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingfourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingTeacherCyreneCayanan
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfciinovamais
 
IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...
IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...
IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...PsychoTech Services
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdfQucHHunhnh
 

Recently uploaded (20)

Arihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfArihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdf
 
General AI for Medical Educators April 2024
General AI for Medical Educators April 2024General AI for Medical Educators April 2024
General AI for Medical Educators April 2024
 
Introduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The BasicsIntroduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The Basics
 
The basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxThe basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptx
 
Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"
 
BAG TECHNIQUE Bag technique-a tool making use of public health bag through wh...
BAG TECHNIQUE Bag technique-a tool making use of public health bag through wh...BAG TECHNIQUE Bag technique-a tool making use of public health bag through wh...
BAG TECHNIQUE Bag technique-a tool making use of public health bag through wh...
 
Disha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdfDisha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdf
 
Unit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptxUnit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptx
 
social pharmacy d-pharm 1st year by Pragati K. Mahajan
social pharmacy d-pharm 1st year by Pragati K. Mahajansocial pharmacy d-pharm 1st year by Pragati K. Mahajan
social pharmacy d-pharm 1st year by Pragati K. Mahajan
 
1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdf
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introduction
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdf
 
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...
Ecosystem Interactions Class Discussion Presentation in Blue Green Lined Styl...
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impact
 
Web & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfWeb & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdf
 
fourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingfourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writing
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdf
 
IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...
IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...
IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
 

Jointly distributed random variables and their properties

  • 2. Quite often there will be 2 or more random variables (X, Y, etc) defined for the same random experiment. Example: A bridge hand (13 cards) is selected from a deck of 52 cards. X = the number of spades in the hand. Y = the number of hearts in the hand. In this example we will define: p(x,y) = P[X = x, Y = y]
  • 3. The function p(x,y) = P[X = x, Y = y] is called the joint probability function of X and Y.
  • 4. Note: The possible values of X are 0, 1, 2, …, 13 The possible values of Y are also 0, 1, 2, …, 13 and X + Y ≤ 13.     , , p x y P X x Y y    13 13 26 13 52 13 x y x y                      The total number of ways of choosing the 13 cards for the hand The number of ways of choosing the x spades for the hand The number of ways of choosing the y hearts for the hand The number of ways of completing the hand with diamonds and clubs.
  • 5. Table: p(x,y) 0 1 2 3 4 5 6 7 8 9 10 11 12 13 0 0.0000 0.0002 0.0009 0.0024 0.0035 0.0032 0.0018 0.0006 0.0001 0.0000 0.0000 0.0000 0.0000 0.0000 1 0.0002 0.0021 0.0085 0.0183 0.0229 0.0173 0.0081 0.0023 0.0004 0.0000 0.0000 0.0000 0.0000 - 2 0.0009 0.0085 0.0299 0.0549 0.0578 0.0364 0.0139 0.0032 0.0004 0.0000 0.0000 0.0000 - - 3 0.0024 0.0183 0.0549 0.0847 0.0741 0.0381 0.0116 0.0020 0.0002 0.0000 0.0000 - - - 4 0.0035 0.0229 0.0578 0.0741 0.0530 0.0217 0.0050 0.0006 0.0000 0.0000 - - - - 5 0.0032 0.0173 0.0364 0.0381 0.0217 0.0068 0.0011 0.0001 0.0000 - - - - - 6 0.0018 0.0081 0.0139 0.0116 0.0050 0.0011 0.0001 0.0000 - - - - - - 7 0.0006 0.0023 0.0032 0.0020 0.0006 0.0001 0.0000 - - - - - - - 8 0.0001 0.0004 0.0004 0.0002 0.0000 0.0000 - - - - - - - - 9 0.0000 0.0000 0.0000 0.0000 0.0000 - - - - - - - - - 10 0.0000 0.0000 0.0000 0.0000 - - - - - - - - - - 11 0.0000 0.0000 0.0000 - - - - - - - - - - - 12 0.0000 0.0000 - - - - - - - - - - - - 13 0.0000 - - - - - - - - - - - - - 13 13 26 13 52 13 x y x y                     
  • 6. Bar graph: p(x,y) 0 1 2 3 4 5 6 7 8 9 10 11 12 13 0 3 6 9 12 - 0.0100 0.0200 0.0300 0.0400 0.0500 0.0600 0.0700 0.0800 0.0900 x y p(x,y)
  • 7. Example: A die is rolled n = 5 times X = the number of times a “six” appears. Y = the number of times a “five” appears. Now p(x,y) = P[X = x, Y = y] The possible values of X are 0, 1, 2, 3, 4, 5. The possible values of Y are 0, 1, 2, 3, 4, 5. and X + Y ≤ 5
  • 8. A typical outcome of rolling a die n = 5 times will be a sequence F5FF6 where F denotes the outcome {1,2,3,4}. The probability of any such sequence will be: 5 1 1 4 6 6 6 x y x y                     where x = the number of sixes in the sequence and y = the number of fives in the sequence
  • 9. Now p(x,y) = P[X = x, Y = y] 5 1 1 4 6 6 6 x y x y K                      Where K = the number of sequences of length 5 containing x sixes and y fives. 5 5 5 5 x x y x y x y                           5 ! 5! 5! ! 5 ! ! 5 ! ! ! 5 ! x x x y x y x y x y                    
  • 10. Thus p(x,y) = P[X = x, Y = y]   5 5! 1 1 4 ! ! 5 ! 6 6 6 x y x y x y x y                        if x + y ≤ 5 .
  • 11. Table: p(x,y) 0 1 2 3 4 5 0 0.1317 0.1646 0.0823 0.0206 0.0026 0.0001 1 0.1646 0.1646 0.0617 0.0103 0.0006 0 2 0.0823 0.0617 0.0154 0.0013 0 0 3 0.0206 0.0103 0.0013 0 0 0 4 0.0026 0.0006 0 0 0 0 5 0.0001 0 0 0 0 0   5 5! 1 1 4 ! ! 5 ! 6 6 6 x y x y x y x y                       
  • 13. General properties of the joint probability function; p(x,y) = P[X = x, Y = y]   1. 0 , 1 p x y     2. , 1 x y p x y       3. , , P X Y A p x y          , x y A 
  • 14. Example: A die is rolled n = 5 times X = the number of times a “six” appears. Y = the number of times a “five” appears. What is the probability that we roll more sixes than fives i.e. what is P[X > Y]?
  • 15. Table: p(x,y) 0 1 2 3 4 5 0 0.1317 0.1646 0.0823 0.0206 0.0026 0.0001 1 0.1646 0.1646 0.0617 0.0103 0.0006 0 2 0.0823 0.0617 0.0154 0.0013 0 0 3 0.0206 0.0103 0.0013 0 0 0 4 0.0026 0.0006 0 0 0 0 5 0.0001 0 0 0 0 0   5 5! 1 1 4 ! ! 5 ! 6 6 6 x y x y x y x y                            , 0.3441 P X Y p x y     x y 
  • 17. Definition: Let X and Y denote two discrete random variables with joint probability function p(x,y) = P[X = x, Y = y] Then pX(x) = P[X = x] is called the marginal probability function of X. pY(y) = P[Y = y] is called the marginal probability function of Y. and
  • 18. Note: Let y1, y2, y3, … denote the possible values of Y.     X p x P X x       1 2 , , P X x Y y X x Y y                1 2 , , P X x Y y P X x Y y            1 2 , , p x y p x y        , , j j y p x y p x y     Thus the marginal probability function of X, pX(x) is obtained from the joint probability function of X and Y by summing p(x,y) over the possible values of Y.
  • 19. Also     Y p y P Y y       1 2 , , P X x Y y X x Y y                1 2 , , P X x Y y P X x Y y            1 2 , , p x y p x y        , , i i x p x y p x y    
  • 20. Example: A die is rolled n = 5 times X = the number of times a “six” appears. Y = the number of times a “five” appears. 0 1 2 3 4 5 p X (x ) 0 0.1317 0.1646 0.0823 0.0206 0.0026 0.0001 0.4019 1 0.1646 0.1646 0.0617 0.0103 0.0006 0 0.4019 2 0.0823 0.0617 0.0154 0.0013 0 0 0.1608 3 0.0206 0.0103 0.0013 0 0 0 0.0322 4 0.0026 0.0006 0 0 0 0 0.0032 5 0.0001 0 0 0 0 0 0.0001 p Y (y ) 0.4019 0.4019 0.1608 0.0322 0.0032 0.0001
  • 22. Definition: Let X and Y denote two discrete random variables with joint probability function p(x,y) = P[X = x, Y = y] Then pX |Y(x|y) = P[X = x|Y = y] is called the conditional probability function of X given Y = y and pY |X(y|x) = P[Y = y|X = x] is called the conditional probability function of Y given X = x
  • 23. Note   X Y p x y P X x Y y                , , Y P X x Y y p x y P Y y p y        Y X p y x P Y y X x                , , X P X x Y y p x y P X x p x      and
  • 24. • Marginal distributions describe how one variable behaves ignoring the other variable. • Conditional distributions describe how one variable behaves when the other variable is held fixed
  • 25. Example: A die is rolled n = 5 times X = the number of times a “six” appears. Y = the number of times a “five” appears. 0 1 2 3 4 5 p X (x ) 0 0.1317 0.1646 0.0823 0.0206 0.0026 0.0001 0.4019 1 0.1646 0.1646 0.0617 0.0103 0.0006 0 0.4019 2 0.0823 0.0617 0.0154 0.0013 0 0 0.1608 3 0.0206 0.0103 0.0013 0 0 0 0.0322 4 0.0026 0.0006 0 0 0 0 0.0032 5 0.0001 0 0 0 0 0 0.0001 p Y (y ) 0.4019 0.4019 0.1608 0.0322 0.0032 0.0001 x y
  • 26. The conditional distribution of X given Y = y. 0 1 2 3 4 5 0 0.3277 0.4096 0.5120 0.6400 0.8000 1.0000 1 0.4096 0.4096 0.3840 0.3200 0.2000 0.0000 2 0.2048 0.1536 0.0960 0.0400 0.0000 0.0000 3 0.0512 0.0256 0.0080 0.0000 0.0000 0.0000 4 0.0064 0.0016 0.0000 0.0000 0.0000 0.0000 5 0.0003 0.0000 0.0000 0.0000 0.0000 0.0000 pX |Y(x|y) = P[X = x|Y = y] x y
  • 27. 0 1 2 3 4 5 0 0.3277 0.4096 0.2048 0.0512 0.0064 0.0003 1 0.4096 0.4096 0.1536 0.0256 0.0016 0.0000 2 0.5120 0.3840 0.0960 0.0080 0.0000 0.0000 3 0.6400 0.3200 0.0400 0.0000 0.0000 0.0000 4 0.8000 0.2000 0.0000 0.0000 0.0000 0.0000 5 1.0000 0.0000 0.0000 0.0000 0.0000 0.0000 The conditional distribution of Y given X = x. pY |X(y|x) = P[Y = y|X = x] x y
  • 28. Example A Bernoulli trial (S - p, F – q = 1 – p) is repeated until two successes have occurred. X = trial on which the first success occurs and Y = trial on which the 2nd success occurs. Find the joint probability function of X, Y. Find the marginal probability function of X and Y. Find the conditional probability functions of Y given X = x and X given Y = y,
  • 29. Solution A typical outcome would be: FFF…FSFFF…FS x - 1 x y y – x - 1 1 1 2 2 if x y x y q pq p q p y x            , , p x y P X x Y y      2 2 if , 0 otherwise y q p y x p x y      
  • 30. p(x,y) - Table y 1 2 3 4 5 6 7 8 x 1 0 p2 p2q p2q2 p2q3 p2q4 p2q5 p2q6 2 0 0 p2q p2q2 p2q3 p2q4 p2q5 p2q6 3 0 0 0 p2q2 p2q3 p2q4 p2q5 p2q6 4 0 0 0 0 p2q3 p2q4 p2q5 p2q6 5 0 0 0 0 0 p2q4 p2q5 p2q6 6 0 0 0 0 0 0 p2q5 p2q6 7 0 0 0 0 0 0 0 p2q6 8 0 0 0 0 0 0 0 0
  • 31. The marginal distribution of X     X p x P X x     , y p x y   2 1 2 2 1 2 2 x x x x p q p q p q p q         2 2 1 y y x p q         2 1 2 3 1 x p q q q q       2 1 1 1 1 x x p q pq q            This is the geometric distribution
  • 32. The marginal distribution of Y: p_Y(y) = P[Y = y] = Σ_x p(x, y) = Σ_{x=1}^{y−1} p² q^(y−2) = (y − 1) p² q^(y−2) for y = 2, 3, 4, …, and 0 otherwise. This is the negative binomial distribution with k = 2.
  • 33. The conditional distribution of X given Y = y: p_X|Y(x|y) = P[X = x | Y = y] = P[X = x, Y = y]/P[Y = y] = p(x, y)/p_Y(y) = p² q^(y−2) / [(y − 1) p² q^(y−2)] = 1/(y − 1) for x = 1, 2, …, y − 1. This is the uniform distribution on the values 1, 2, …, y − 1.
  • 34. The conditional distribution of Y given X = x: p_Y|X(y|x) = P[Y = y | X = x] = P[X = x, Y = y]/P[X = x] = p(x, y)/p_X(x) = p² q^(y−2) / (p q^(x−1)) = p q^(y−x−1) for y = x + 1, x + 2, x + 3, … This is the geometric distribution with time starting at x (the wait, after trial x, for one more success).
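These closed forms are easy to sanity-check by simulation. A minimal Monte Carlo sketch (an addition to the slides; the choice p = 0.3 is an arbitrary assumption for illustration):

import random

random.seed(0)
p, N = 0.3, 200_000
q = 1 - p
xs, ys = [], []
for _ in range(N):
    trial, successes, first = 0, 0, None
    while successes < 2:
        trial += 1
        if random.random() < p:
            successes += 1
            if successes == 1:
                first = trial
    xs.append(first)   # X = trial of the first success
    ys.append(trial)   # Y = trial of the second success

# Marginal of X should be geometric: P[X = 2] = p q = 0.21
print(sum(1 for v in xs if v == 2) / N)
# Marginal of Y should be negative binomial (k = 2): P[Y = 4] = 3 p^2 q^2 = 0.1323
print(sum(1 for v in ys if v == 4) / N)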
  • 36. The joint probability function p(x, y) = P[X = x, Y = y] satisfies: 1. 0 ≤ p(x, y) ≤ 1; 2. Σ_x Σ_y p(x, y) = 1; 3. P[(X, Y) ∈ A] = Σ_{(x, y) ∈ A} p(x, y).
  • 38. Definition: Two random variables X and Y are said to have joint probability density function f(x, y) if: 1. f(x, y) ≥ 0; 2. ∫∫ f(x, y) dx dy = 1 (integrating over the whole plane); 3. P[(X, Y) ∈ A] = ∫∫_A f(x, y) dx dy.
  • 39. If f(x, y) ≥ 0, then z = f(x, y) defines a surface over the x–y plane, and ∫∫_A f(x, y) dx dy is the volume under that surface above the region A.
  • 41. [Figure: the volume under the surface z = f(x, y) above the region A equals ∫∫_A f(x, y) dx dy.]
  • 42. If the region A = {(x, y) | a ≤ x ≤ b, c ≤ y ≤ d} is a rectangular region with sides parallel to the coordinate axes, then ∫∫_A f(x, y) dx dy = ∫_c^d [ ∫_a^b f(x, y) dx ] dy = ∫_a^b [ ∫_c^d f(x, y) dy ] dx.
  • 43. To evaluate ∫_c^d ∫_a^b f(x, y) dx dy, first evaluate the inner integral G(y) = ∫_a^b f(x, y) dx, then evaluate the outer integral: ∫_c^d ∫_a^b f(x, y) dx dy = ∫_c^d G(y) dy.
  • 44. [Figure] G(y) = ∫_a^b f(x, y) dx is the area under the surface above the line where y is constant; G(y) dy is the infinitesimal volume under the surface above the strip of width dy, and ∫_c^d ∫_a^b f(x, y) dx dy = ∫_c^d G(y) dy.
  • 45. The same quantity ∫∫_A f(x, y) dx dy = ∫_a^b ∫_c^d f(x, y) dy dx can be calculated by integrating first with respect to y, then x. First evaluate the inner integral H(x) = ∫_c^d f(x, y) dy, then evaluate the outer integral: ∫_a^b ∫_c^d f(x, y) dy dx = ∫_a^b H(x) dx.
  • 46. [Figure] H(x) = ∫_c^d f(x, y) dy is the area under the surface above the line where x is constant; H(x) dx is the infinitesimal volume under the surface above the strip of width dx, and ∫_a^b ∫_c^d f(x, y) dy dx = ∫_a^b H(x) dx.
  • 47. Example: Compute ∫_0^1 ∫_0^1 (x²y + xy³) dx dy. Integrating first with respect to x: ∫_0^1 (x²y + xy³) dx = [x³y/3 + x²y³/2]_{x=0}^{x=1} = y/3 + y³/2. Then ∫_0^1 (y/3 + y³/2) dy = [y²/6 + y⁴/8]_{y=0}^{y=1} = 1/6 + 1/8 = 7/24.
  • 48. The same quantity can be computed by reversing the order of integration: ∫_0^1 ∫_0^1 (x²y + xy³) dy dx. The inner integral is ∫_0^1 (x²y + xy³) dy = [x²y²/2 + xy⁴/4]_{y=0}^{y=1} = x²/2 + x/4, and ∫_0^1 (x²/2 + x/4) dx = [x³/6 + x²/8]_{x=0}^{x=1} = 1/6 + 1/8 = 7/24.
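As a quick numerical cross-check (an addition to the slides), the same value can be obtained with scipy.integrate.dblquad; note that dblquad passes the inner variable as the first argument of the integrand:

from scipy.integrate import dblquad

# integrand f(x, y) = x^2 y + x y^3; outer variable x in [0, 1], inner variable y in [0, 1]
val, err = dblquad(lambda y, x: x**2 * y + x * y**3,
                   0, 1,
                   lambda x: 0, lambda x: 1)
print(val, 7 / 24)   # both approximately 0.2917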
  • 49. Integration over non-rectangular regions
  • 50. Suppose the region A is defined as A = {(x, y) | a(y) ≤ x ≤ b(y), c ≤ y ≤ d}. Then ∫∫_A f(x, y) dx dy = ∫_c^d [ ∫_{a(y)}^{b(y)} f(x, y) dx ] dy.
  • 51. If the region A is defined as A = {(x, y) | a ≤ x ≤ b, c(x) ≤ y ≤ d(x)}, then ∫∫_A f(x, y) dx dy = ∫_a^b [ ∫_{c(x)}^{d(x)} f(x, y) dy ] dx.
  • 52. In general the region A can be partitioned into regions of either type. [Figure: a region A partitioned into subregions A1, A2, A3, A4.]
  • 53. Example: Compute the volume under f(x, y) = x²y + xy³ over the region A = {(x, y) | x + y ≤ 1, 0 ≤ x, 0 ≤ y}, the triangle with vertices (0, 0), (1, 0) and (0, 1).
  • 54. Integrating first with respect to x, then y: for fixed y the horizontal segment runs from (0, y) to (1 − y, y), so ∫∫_A (x²y + xy³) dx dy = ∫_0^1 [ ∫_0^{1−y} (x²y + xy³) dx ] dy.
  • 55. and ∫_0^1 [ ∫_0^{1−y} (x²y + xy³) dx ] dy = ∫_0^1 [x³y/3 + x²y³/2]_{x=0}^{x=1−y} dy = ∫_0^1 [ (1 − y)³ y/3 + (1 − y)² y³/2 ] dy = ∫_0^1 [ (y − 3y² + 3y³ − y⁴)/3 + (y³ − 2y⁴ + y⁵)/2 ] dy = (1/3)(1/2 − 1 + 3/4 − 1/5) + (1/2)(1/4 − 2/5 + 1/6) = 1/60 + 1/120 = 3/120 = 1/40.
  • 56. Now integrating first with respect to y, then x: for fixed x the vertical segment runs from (x, 0) to (x, 1 − x), so ∫∫_A (x²y + xy³) dy dx = ∫_0^1 [ ∫_0^{1−x} (x²y + xy³) dy ] dx.
  • 57. Hence ∫_0^1 [ ∫_0^{1−x} (x²y + xy³) dy ] dx = ∫_0^1 [x²y²/2 + xy⁴/4]_{y=0}^{y=1−x} dx = ∫_0^1 [ x²(1 − x)²/2 + x(1 − x)⁴/4 ] dx = 1/60 + 1/120 = 1/40, the same value as before.
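The same numerical check (an addition to the slides) works for the triangular region, letting y run from 0 to 1 − x for each x:

from scipy.integrate import dblquad

val, err = dblquad(lambda y, x: x**2 * y + x * y**3,   # integrand f(x, y)
                   0, 1,                               # x in [0, 1]
                   lambda x: 0, lambda x: 1 - x)       # y in [0, 1 - x]
print(val, 1 / 40)   # both 0.025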
  • 59. Definition: Two random variables X and Y are said to have joint probability density function f(x, y) if: 1. f(x, y) ≥ 0; 2. ∫∫ f(x, y) dx dy = 1; 3. P[(X, Y) ∈ A] = ∫∫_A f(x, y) dx dy.
  • 60. Definition: Let X and Y denote two random variables with joint probability density function f(x, y). Then the marginal density of X is f_X(x) = ∫_{−∞}^{∞} f(x, y) dy and the marginal density of Y is f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx.
  • 61. Definition: Let X and Y denote two random variables with joint probability density function f(x, y) and marginal densities f_X(x), f_Y(y). Then the conditional density of Y given X = x is f_Y|X(y|x) = f(x, y)/f_X(x), and the conditional density of X given Y = y is f_X|Y(x|y) = f(x, y)/f_Y(y).
  • 62. The bivariate Normal distribution
  • 63. Let Q(x1, x2) = [1/(1 − ρ²)] [ ((x1 − μ1)/σ1)² − 2ρ ((x1 − μ1)/σ1)((x2 − μ2)/σ2) + ((x2 − μ2)/σ2)² ] and f(x1, x2) = [1/(2π σ1 σ2 √(1 − ρ²))] e^{−Q(x1, x2)/2}. This distribution is called the bivariate Normal distribution. The parameters are μ1, μ2, σ1, σ2 and ρ.
  • 64. Surface Plots of the bivariate Normal distribution
  • 65. Note: f(x1, x2) is constant when Q(x1, x2) is constant. This is true when (x1, x2) lies on an ellipse centered at (μ1, μ2).
  • 66. [Figure: the contours of constant density are ellipses centered at (μ1, μ2).]
  • 68. Marginal distributions for the Bivariate Normal distribution. Recall the definition of marginal distributions for continuous random variables: f1(x1) = ∫ f(x1, x2) dx2 and f2(x2) = ∫ f(x1, x2) dx1. It can be shown that in the case of the bivariate normal distribution the marginal distribution of x_i is Normal with mean μ_i and standard deviation σ_i.
  • 69. Proof: The marginal distribution of x2 is f2(x2) = ∫ f(x1, x2) dx1 = ∫ [1/(2π σ1 σ2 √(1 − ρ²))] e^{−Q(x1, x2)/2} dx1, where Q(x1, x2) = [1/(1 − ρ²)] [ ((x1 − μ1)/σ1)² − 2ρ ((x1 − μ1)/σ1)((x2 − μ2)/σ2) + ((x2 − μ2)/σ2)² ].
  • 70. Now Q(x1, x2) is a quadratic in x1, so it can be written in the form Q(x1, x2) = ((x1 − a)/b)² + c = x1²/b² − 2a x1/b² + a²/b² + c, and the constants a, b and c are found by matching the coefficients of x1², x1 and the constant term.
  • 71. Hence, matching the coefficient of x1²: 1/b² = 1/[σ1²(1 − ρ²)], or b = σ1 √(1 − ρ²). Also, matching the coefficient of x1 gives a = μ1 + ρ (σ1/σ2)(x2 − μ2).
  • 72. Finally, c collects the terms of Q that do not involve x1: c = Q(x1, x2) − ((x1 − a)/b)².
  • 73. Substituting a = μ1 + ρ (σ1/σ2)(x2 − μ2) and b = σ1 √(1 − ρ²) and simplifying, all of the terms involving x1 cancel and c = ((x2 − μ2)/σ2)².
  • 74. Summarizing: Q(x1, x2) = ((x1 − a)/b)² + c, where b = σ1 √(1 − ρ²), a = μ1 + ρ (σ1/σ2)(x2 − μ2) and c = ((x2 − μ2)/σ2)².
  • 75. Thus f2(x2) = ∫ f(x1, x2) dx1 = ∫ [1/(2π σ1 σ2 √(1 − ρ²))] e^{−[((x1 − a)/b)² + c]/2} dx1 = [e^{−c/2}/(2π σ1 σ2 √(1 − ρ²))] ∫ e^{−½((x1 − a)/b)²} dx1 = [e^{−c/2}/(2π σ1 σ2 √(1 − ρ²))] · b √(2π) = [1/(√(2π) σ2)] e^{−½((x2 − μ2)/σ2)²}.
  • 76. Thus the marginal distribution of x2 is Normal with mean μ2 and standard deviation σ2. Similarly the marginal distribution of x1 is Normal with mean μ1 and standard deviation σ1.
  • 77. Conditional distributions for the Bivariate Normal distribution. Recall the definition of conditional distributions for continuous random variables: f_{1|2}(x1|x2) = f(x1, x2)/f2(x2) and f_{2|1}(x2|x1) = f(x1, x2)/f1(x1). It can be shown that in the case of the bivariate normal distribution the conditional distribution of x_i given x_j is Normal with mean μ_{i|j} = μ_i + ρ (σ_i/σ_j)(x_j − μ_j) and standard deviation σ_{i|j} = σ_i √(1 − ρ²).
  • 78. Proof: f_{1|2}(x1|x2) = f(x1, x2)/f2(x2) = [1/(2π σ1 σ2 √(1 − ρ²))] e^{−Q(x1, x2)/2} ÷ [1/(√(2π) σ2)] e^{−½((x2 − μ2)/σ2)²}. Writing Q(x1, x2) = ((x1 − a)/b)² + c with c = ((x2 − μ2)/σ2)², the factor e^{−c/2} cancels against the exponential of the marginal.
  • 79. where b = σ1 √(1 − ρ²) and a = μ1 + ρ (σ1/σ2)(x2 − μ2). Hence f_{1|2}(x1|x2) = [1/(√(2π) b)] e^{−½((x1 − a)/b)²}. Thus the conditional distribution of x1 given x2 is Normal with mean μ_{1|2} = a = μ1 + ρ (σ1/σ2)(x2 − μ2) and standard deviation σ_{1|2} = b = σ1 √(1 − ρ²).
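A small helper (an addition to the slides; the function name and the example numbers are only illustrative) packages the conditional-distribution facts just derived:

import math

def conditional_x1_given_x2(mu1, mu2, sigma1, sigma2, rho, x2):
    # Mean and sd of the Normal conditional distribution of x1 given x2
    mean = mu1 + rho * (sigma1 / sigma2) * (x2 - mu2)   # a in the derivation
    sd = sigma1 * math.sqrt(1 - rho**2)                 # b in the derivation
    return mean, sd

# e.g. mu1 = 0, mu2 = 0, sigma1 = 1, sigma2 = 2, rho = 0.8, observed x2 = 1
print(conditional_x1_given_x2(0, 0, 1, 2, 0.8, 1))      # (0.4, 0.6)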
  • 80. Bivariate Normal Distribution with marginal distributions
  • 81. Bivariate Normal Distribution with conditional distribution
  • 82. ( 1, 2) x2 x1 Regression Regression to the mean   2 2 1 2 21 1 x          Major axis of ellipses
  • 83. Example: Suppose that a rectangle is constructed by first choosing its length X and then choosing its width Y. Its length X is selected from an exponential distribution with mean μ = 1/λ = 5. Once the length has been chosen, its width Y is selected from a uniform distribution from 0 to half its length. Find the probability that its area A = XY is less than 4.
  • 84. Solution: f(x, y) = f_X(x) f_Y|X(y|x), where f_X(x) = (1/5) e^{−x/5} for x ≥ 0 and f_Y|X(y|x) = 1/(x/2) = 2/x for 0 ≤ y ≤ x/2. Hence f(x, y) = (1/5) e^{−x/5} · (2/x) = [2/(5x)] e^{−x/5} for 0 < x, 0 ≤ y ≤ x/2.
  • 85. The boundary curves xy = 4 and y = x/2 intersect where x(x/2) = 4, i.e. x² = 8, so x = √8 = 2√2 and y = x/2 = √2: the point (2√2, √2).
  • 86. Splitting the region {xy < 4, 0 ≤ y ≤ x/2} at x = 2√2: P[XY < 4] = ∫_0^{2√2} ∫_0^{x/2} f(x, y) dy dx + ∫_{2√2}^∞ ∫_0^{4/x} f(x, y) dy dx.
  • 87. P[XY < 4] = ∫_0^{2√2} ∫_0^{x/2} [2/(5x)] e^{−x/5} dy dx + ∫_{2√2}^∞ ∫_0^{4/x} [2/(5x)] e^{−x/5} dy dx = ∫_0^{2√2} (1/5) e^{−x/5} dx + ∫_{2√2}^∞ [8/(5x²)] e^{−x/5} dx. The first integral can be evaluated in closed form; the second may require numerical evaluation.
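A numerical evaluation of this probability (an addition to the slides), doing the first integral in closed form and the second with scipy.integrate.quad:

import math
from scipy.integrate import quad

a = 2 * math.sqrt(2)
part1 = 1 - math.exp(-a / 5)                     # integral of (1/5) e^(-x/5) over [0, 2*sqrt(2)]
part2, _ = quad(lambda x: 8 / (5 * x**2) * math.exp(-x / 5), a, math.inf)
print(part1 + part2)                             # P[XY < 4]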
  • 90. Definition: Let X1, X2, …, Xk denote k discrete random variables. Then p(x1, x2, …, xk) is the joint probability function of X1, X2, …, Xk if: 1. 0 ≤ p(x1, …, xk) ≤ 1; 2. Σ_{x1} ⋯ Σ_{xk} p(x1, …, xk) = 1; 3. P[(X1, …, Xk) ∈ A] = Σ_{(x1, …, xk) ∈ A} p(x1, …, xk).
  • 91. Definition: Let X1, X2, …, Xk denote k continuous random variables. Then f(x1, x2, …, xk) is the joint density function of X1, X2, …, Xk if: 1. f(x1, …, xk) ≥ 0; 2. ∫ ⋯ ∫ f(x1, …, xk) dx1 ⋯ dxk = 1; 3. P[(X1, …, Xk) ∈ A] = ∫ ⋯ ∫_A f(x1, …, xk) dx1 ⋯ dxk.
  • 92. Example: The Multinomial distribution. Suppose that we observe an experiment that has k possible outcomes {O1, O2, …, Ok} independently n times. Let p1, p2, …, pk denote the probabilities of O1, O2, …, Ok respectively. Let Xi denote the number of times that outcome Oi occurs in the n repetitions of the experiment. Then the joint probability function of the random variables X1, X2, …, Xk is p(x1, x2, …, xk) = [n!/(x1! x2! ⋯ xk!)] p1^{x1} p2^{x2} ⋯ pk^{xk}, where x1 + x2 + ⋯ + xk = n.
  • 93. Note: p1^{x1} p2^{x2} ⋯ pk^{xk} is the probability of a particular sequence of length n containing x1 outcomes O1, x2 outcomes O2, …, xk outcomes Ok.
  • 94. C(n, x1) C(n − x1, x2) ⋯ C(n − x1 − ⋯ − x_{k−1}, xk) = [n!/(x1!(n − x1)!)] · [(n − x1)!/(x2!(n − x1 − x2)!)] ⋯ = n!/(x1! x2! ⋯ xk!) is the number of ways of choosing the positions for the x1 outcomes O1, x2 outcomes O2, …, xk outcomes Ok.
  • 95. p(x1, x2, …, xk) = [n!/(x1! x2! ⋯ xk!)] p1^{x1} p2^{x2} ⋯ pk^{xk} = (n choose x1, x2, …, xk) p1^{x1} p2^{x2} ⋯ pk^{xk} is called the Multinomial distribution.
  • 96. Example: Suppose that a treatment for back pain has three possible outcomes: O1 – complete cure (no pain), with probability 30%; O2 – reduced pain, with probability 50%; O3 – no change, with probability 20%. Hence p1 = 0.30, p2 = 0.50, p3 = 0.20. Suppose the treatment is applied to n = 4 patients suffering back pain, and let X = the number that result in a complete cure, Y = the number that result in just reduced pain, and Z = the number that result in no change. Find the distribution of X, Y and Z, and compute P[X + Y ≥ Z]. Here p(x, y, z) = [4!/(x! y! z!)] (0.30)^x (0.50)^y (0.20)^z for x + y + z = 4.
  • 97. Table: p(x, y, z). Only the combinations with x + y + z = 4 have nonzero probability; the nonzero entries are
    p(0,0,4) = 0.0016, p(0,1,3) = 0.0160, p(0,2,2) = 0.0600, p(0,3,1) = 0.1000, p(0,4,0) = 0.0625,
    p(1,0,3) = 0.0096, p(1,1,2) = 0.0720, p(1,2,1) = 0.1800, p(1,3,0) = 0.1500,
    p(2,0,2) = 0.0216, p(2,1,1) = 0.1080, p(2,2,0) = 0.1350,
    p(3,0,1) = 0.0216, p(3,1,0) = 0.0540,
    p(4,0,0) = 0.0081.
  • 98. P[X + Y ≥ Z] is obtained by summing p(x, y, z) over the cells with x + y ≥ z, i.e. all nonzero cells except p(0,0,4), p(0,1,3) and p(1,0,3): P[X + Y ≥ Z] = 1 − (0.0016 + 0.0160 + 0.0096) = 0.9728.
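The same answer can be reproduced by brute-force enumeration of the multinomial probabilities (an addition to the slides):

from math import factorial

n, p1, p2, p3 = 4, 0.30, 0.50, 0.20

def p(x, y, z):
    if x + y + z != n:
        return 0.0
    return factorial(n) / (factorial(x) * factorial(y) * factorial(z)) * p1**x * p2**y * p3**z

prob = sum(p(x, y, z)
           for x in range(n + 1) for y in range(n + 1) for z in range(n + 1)
           if x + y >= z)
print(round(prob, 4))   # 0.9728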
  • 99. Example: The Multivariate Normal distribution. Recall the univariate normal distribution f(x) = [1/(√(2π) σ)] e^{−½((x − μ)/σ)²} and the bivariate normal distribution f(x, y) = [1/(2π σ_x σ_y √(1 − ρ²))] exp{ −[1/(2(1 − ρ²))] [ ((x − μ_x)/σ_x)² − 2ρ ((x − μ_x)/σ_x)((y − μ_y)/σ_y) + ((y − μ_y)/σ_y)² ] }.
  • 100. The k-variate Normal distribution: f(x1, …, xk) = f(x) = [1/((2π)^{k/2} |Σ|^{1/2})] exp{ −½ (x − μ)ᵀ Σ⁻¹ (x − μ) }, where x = (x1, …, xk)ᵀ, μ = (μ1, …, μk)ᵀ and Σ = [σ_ij] is the k × k symmetric matrix with entries σ11, σ12, …, σkk.
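The density can be evaluated directly from this formula; a sketch (an addition to the slides; the mean vector and covariance matrix below are arbitrary example values) that also cross-checks against scipy.stats.multivariate_normal:

import numpy as np
from scipy.stats import multivariate_normal

def mvn_pdf(x, mu, Sigma):
    # k-variate Normal density: (2 pi)^(-k/2) |Sigma|^(-1/2) exp(-(x-mu)' Sigma^(-1) (x-mu) / 2)
    k = len(mu)
    diff = x - mu
    quad_form = diff @ np.linalg.solve(Sigma, diff)
    return np.exp(-0.5 * quad_form) / ((2 * np.pi) ** (k / 2) * np.sqrt(np.linalg.det(Sigma)))

mu = np.array([0.0, 1.0, 2.0])
Sigma = np.array([[1.0, 0.5, 0.2],
                  [0.5, 2.0, 0.3],
                  [0.2, 0.3, 1.5]])
x = np.array([0.5, 1.0, 1.5])
print(mvn_pdf(x, mu, Sigma), multivariate_normal(mean=mu, cov=Sigma).pdf(x))   # the two values agree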
  • 102. Definition: Let X1, …, Xq, Xq+1, …, Xk denote k discrete random variables with joint probability function p(x1, …, xq, xq+1, …, xk). Then the marginal joint probability function of X1, …, Xq is p_{1…q}(x1, …, xq) = Σ_{xq+1} ⋯ Σ_{xk} p(x1, …, xk).
  • 103. Definition: Let X1, …, Xq, Xq+1, …, Xk denote k continuous random variables with joint probability density function f(x1, …, xq, xq+1, …, xk). Then the marginal joint density function of X1, …, Xq is f_{1…q}(x1, …, xq) = ∫ ⋯ ∫ f(x1, …, xk) dxq+1 ⋯ dxk.
  • 105. Definition: Let X1, …, Xq, Xq+1, …, Xk denote k discrete random variables with joint probability function p(x1, …, xq, xq+1, …, xk). Then the conditional joint probability function of X1, …, Xq given Xq+1 = xq+1, …, Xk = xk is p_{1…q | q+1…k}(x1, …, xq | xq+1, …, xk) = p(x1, …, xk) / p_{q+1…k}(xq+1, …, xk).
  • 106. Definition: Let X1, …, Xq, Xq+1, …, Xk denote k continuous random variables with joint probability density function f(x1, …, xq, xq+1, …, xk). Then the conditional joint density function of X1, …, Xq given Xq+1 = xq+1, …, Xk = xk is f_{1…q | q+1…k}(x1, …, xq | xq+1, …, xk) = f(x1, …, xk) / f_{q+1…k}(xq+1, …, xk).
  • 107. Definition – Independence of sets of variables: Let X1, …, Xq, Xq+1, …, Xk denote k continuous random variables with joint probability density function f(x1, …, xq, xq+1, …, xk). Then the variables X1, …, Xq are independent of Xq+1, …, Xk if f(x1, …, xk) = f_{1…q}(x1, …, xq) · f_{q+1…k}(xq+1, …, xk). A similar definition holds for discrete random variables.
  • 108. Definition – Mutual Independence: Let X1, X2, …, Xk denote k continuous random variables with joint probability density function f(x1, …, xk). Then the variables X1, X2, …, Xk are called mutually independent if f(x1, …, xk) = f1(x1) f2(x2) ⋯ fk(xk). A similar definition holds for discrete random variables.
  • 109. Example: Let X, Y, Z denote 3 jointly distributed random variables with joint density function f(x, y, z) = K(x² + yz) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, 0 ≤ z ≤ 1, and 0 otherwise. Find the value of K. Determine the marginal distributions of X, Y and Z. Determine the joint marginal distributions of (X, Y), (X, Z) and (Y, Z).
  • 110. Solution – determining the value of K: ∫∫∫ f(x, y, z) dx dy dz = K ∫_0^1 ∫_0^1 ∫_0^1 (x² + yz) dx dy dz = K ∫_0^1 ∫_0^1 [x³/3 + xyz]_{x=0}^{x=1} dy dz = K ∫_0^1 ∫_0^1 (1/3 + yz) dy dz = K ∫_0^1 [y/3 + y²z/2]_{y=0}^{y=1} dz = K ∫_0^1 (1/3 + z/2) dz = K (1/3 + 1/4) = 7K/12 = 1, so K = 12/7.
  • 111. The marginal distribution of X: f1(x) = ∫∫ f(x, y, z) dy dz = (12/7) ∫_0^1 ∫_0^1 (x² + yz) dy dz = (12/7) ∫_0^1 (x² + z/2) dz = (12/7)(x² + 1/4) for 0 ≤ x ≤ 1.
  • 112. The marginal distribution of X, Y: f12(x, y) = ∫ f(x, y, z) dz = (12/7) ∫_0^1 (x² + yz) dz = (12/7)(x² + y/2) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1.
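These integrations can be verified symbolically (an addition to the slides) with sympy:

import sympy as sp

x, y, z = sp.symbols('x y z', nonnegative=True)
K = sp.Rational(12, 7)
f = K * (x**2 + y*z)

print(sp.integrate(f, (x, 0, 1), (y, 0, 1), (z, 0, 1)))    # 1, confirming K = 12/7
print(sp.simplify(sp.integrate(f, (y, 0, 1), (z, 0, 1))))  # 12*x**2/7 + 3/7, i.e. (12/7)(x^2 + 1/4)
print(sp.simplify(sp.integrate(f, (z, 0, 1))))             # 12*x**2/7 + 6*y/7, i.e. (12/7)(x^2 + y/2)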
  • 113. Find the conditional distribution of: 1. Z given X = x, Y = y, 2. Y given X = x, Z = z, 3. X given Y = y, Z = z, 4. Y , Z given X = x, 5. X , Z given Y = y 6. X , Y given Z = z 7. Y given X = x, 8. X given Y = y 9. X given Z = z 10. Z given X = x, 11. Z given Y = y 12. Y given Z = z
  • 114. The marginal distribution of X, Y is f12(x, y) = (12/7)(x² + y/2) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1. Thus the conditional distribution of Z given X = x, Y = y is f(x, y, z)/f12(x, y) = [(12/7)(x² + yz)] / [(12/7)(x² + y/2)] = (x² + yz)/(x² + y/2) for 0 ≤ z ≤ 1.
  • 115. The marginal distribution of X is f1(x) = (12/7)(x² + 1/4) for 0 ≤ x ≤ 1. Thus the conditional distribution of Y, Z given X = x is f(x, y, z)/f1(x) = [(12/7)(x² + yz)] / [(12/7)(x² + 1/4)] = (x² + yz)/(x² + 1/4) for 0 ≤ y ≤ 1, 0 ≤ z ≤ 1.
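As a final check (an addition to the slides), each conditional density above integrates to 1 over its range:

import sympy as sp

x, y, z = sp.symbols('x y z', positive=True)
f_z_given_xy = (x**2 + y*z) / (x**2 + y/2)                 # Z given X = x, Y = y
f_yz_given_x = (x**2 + y*z) / (x**2 + sp.Rational(1, 4))   # (Y, Z) given X = x

print(sp.simplify(sp.integrate(f_z_given_xy, (z, 0, 1))))              # 1
print(sp.simplify(sp.integrate(f_yz_given_x, (y, 0, 1), (z, 0, 1))))   # 1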