The document discusses joint and marginal probability distributions of random variables. It defines the joint probability function p(x,y) of two random variables X and Y through a card-hand experiment and a die-rolling experiment, shows how marginal probabilities are obtained by summing the joint probabilities over all values of one variable, and defines conditional probabilities as the probability of one variable given a particular value of the other.
2. Quite often there will be 2 or more random
variables (X, Y, etc) defined for the same
random experiment.
Example:
A bridge hand (13 cards) is selected from a deck
of 52 cards.
X = the number of spades in the hand.
Y = the number of hearts in the hand.
In this example we will define:
p(x,y) = P[X = x, Y = y]
3. The function
p(x,y) = P[X = x, Y = y]
is called the joint probability function of
X and Y.
4. Note:
The possible values of X are 0, 1, 2, …, 13
The possible values of Y are also 0, 1, 2, …, 13
and X + Y ≤ 13.
p(x,y) = P[X = x, Y = y] = C(13,x) C(13,y) C(26, 13−x−y) / C(52,13)

where C(n,k) denotes the binomial coefficient "n choose k":
C(52,13) = the total number of ways of choosing the 13 cards for the hand,
C(13,x) = the number of ways of choosing the x spades for the hand,
C(13,y) = the number of ways of choosing the y hearts for the hand,
C(26, 13−x−y) = the number of ways of completing the hand with diamonds and clubs.
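As a quick sanity check (a sketch, not part of the original slides), this joint pmf can be evaluated with Python's math.comb and verified to sum to 1 over all possible (x, y):

```python
from math import comb

def p(x, y):
    """Joint pmf of X = # spades and Y = # hearts in a 13-card bridge hand."""
    if x < 0 or y < 0 or x + y > 13:
        return 0.0
    # C(13,x) ways to pick spades, C(13,y) hearts, C(26,13-x-y) diamonds/clubs
    return comb(13, x) * comb(13, y) * comb(26, 13 - x - y) / comb(52, 13)

total = sum(p(x, y) for x in range(14) for y in range(14))
print(round(total, 10))  # ≈ 1.0
```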
7. Example:
A die is rolled n = 5 times
X = the number of times a “six” appears.
Y = the number of times a “five” appears.
Now
p(x,y) = P[X = x, Y = y]
The possible values of X are 0, 1, 2, 3, 4, 5.
The possible values of Y are 0, 1, 2, 3, 4, 5.
and X + Y ≤ 5
8. A typical outcome of rolling a die n = 5 times will be a sequence such as F5FF6, where F denotes any outcome in {1, 2, 3, 4}. The probability of any such sequence will be:
(1/6)^x (1/6)^y (4/6)^(5−x−y)
where
x = the number of sixes in the sequence and
y = the number of fives in the sequence
9. Now
p(x,y) = P[X = x, Y = y] = K (1/6)^x (1/6)^y (4/6)^(5−x−y)
where K = the number of sequences of length 5 containing x sixes and y fives.
K = C(5,x) C(5−x, y)
  = [5! / (x!(5−x)!)] · [(5−x)! / (y!(5−x−y)!)]
  = 5! / (x! y! (5−x−y)!)
10. Thus
p(x,y) = P[X = x, Y = y] = [5! / (x! y! (5−x−y)!)] (1/6)^x (1/6)^y (4/6)^(5−x−y)
if x + y ≤ 5.
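The formula above translates directly into a short function (a sketch, not part of the original slides):

```python
from math import factorial

def p(x, y, n=5):
    """Joint pmf of X = # sixes and Y = # fives in n rolls of a fair die."""
    if x < 0 or y < 0 or x + y > n:
        return 0.0
    k = factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
    return k * (1/6)**x * (1/6)**y * (4/6)**(n - x - y)

print(round(p(0, 0), 4))  # (4/6)^5 ≈ 0.1317
print(round(p(1, 0), 4))  # ≈ 0.1646
```

These two values match the first entries of the table on slide 15.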
13. General properties of the joint probability function
p(x,y) = P[X = x, Y = y]:
1. 0 ≤ p(x,y) ≤ 1
2. Σ_{x,y} p(x,y) = 1
3. P[(X, Y) ∈ A] = Σ_{(x,y) ∈ A} p(x,y)
14. Example:
A die is rolled n = 5 times
X = the number of times a “six” appears.
Y = the number of times a “five” appears.
What is the probability that we roll more sixes than fives, i.e. what is P[X > Y]?
15. Table of p(x,y) (rows: x = number of sixes; columns: y = number of fives):

x\y      0       1       2       3       4       5
 0    0.1317  0.1646  0.0823  0.0206  0.0026  0.0001
 1    0.1646  0.1646  0.0617  0.0103  0.0006  0
 2    0.0823  0.0617  0.0154  0.0013  0       0
 3    0.0206  0.0103  0.0013  0       0       0
 4    0.0026  0.0006  0       0       0       0
 5    0.0001  0       0       0       0       0

computed from p(x,y) = [5! / (x! y! (5−x−y)!)] (1/6)^x (1/6)^y (4/6)^(5−x−y).
Hence
P[X > Y] = Σ_{x > y} p(x,y) = 0.3441
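Summing the table over the pairs with x > y can be done directly (a sketch, not part of the original slides):

```python
from math import factorial

def p(x, y, n=5):
    """Joint pmf of X = # sixes and Y = # fives in n = 5 rolls."""
    if x < 0 or y < 0 or x + y > n:
        return 0.0
    k = factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
    return k * (1/6)**x * (1/6)**y * (4/6)**(n - x - y)

# Sum p(x, y) over the pairs with more sixes than fives
prob = sum(p(x, y) for x in range(6) for y in range(6) if x > y)
print(round(prob, 4))  # ≈ 0.3441
```

By symmetry between fives and sixes, prob also equals (1 − P[X = Y]) / 2.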
17. Definition:
Let X and Y denote two discrete random variables
with joint probability function
p(x,y) = P[X = x, Y = y]
Then
pX(x) = P[X = x] is called the marginal probability function of X,
and
pY(y) = P[Y = y] is called the marginal probability function of Y.
18. Note: Let y1, y2, y3, … denote the possible values of Y. Then
pX(x) = P[X = x]
      = P[X = x, Y = y1] + P[X = x, Y = y2] + ⋯
      = p(x, y1) + p(x, y2) + ⋯
      = Σ_j p(x, yj) = Σ_y p(x, y)
Thus the marginal probability function of X, pX(x), is obtained from the joint probability function of X and Y by summing p(x,y) over the possible values of Y.
19. Also, letting x1, x2, x3, … denote the possible values of X,
pY(y) = P[Y = y]
      = P[X = x1, Y = y] + P[X = x2, Y = y] + ⋯
      = p(x1, y) + p(x2, y) + ⋯
      = Σ_i p(xi, y) = Σ_x p(x, y)
20. Example:
A die is rolled n = 5 times
X = the number of times a “six” appears.
Y = the number of times a “five” appears.
x\y      0       1       2       3       4       5     pX(x)
 0    0.1317  0.1646  0.0823  0.0206  0.0026  0.0001   0.4019
 1    0.1646  0.1646  0.0617  0.0103  0.0006  0        0.4019
 2    0.0823  0.0617  0.0154  0.0013  0       0        0.1608
 3    0.0206  0.0103  0.0013  0       0       0        0.0322
 4    0.0026  0.0006  0       0       0       0        0.0032
 5    0.0001  0       0       0       0       0        0.0001
pY(y) 0.4019  0.4019  0.1608  0.0322  0.0032  0.0001
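The margins of the table can be reproduced by summing the joint pmf over the other variable (a sketch, not part of the original slides). Note that each marginal is just a Binomial(5, 1/6) distribution:

```python
from math import factorial

def p(x, y, n=5):
    """Joint pmf of X = # sixes and Y = # fives in n = 5 rolls."""
    if x < 0 or y < 0 or x + y > n:
        return 0.0
    k = factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
    return k * (1/6)**x * (1/6)**y * (4/6)**(n - x - y)

# Marginals: sum the joint pmf over the other variable
p_X = [sum(p(x, y) for y in range(6)) for x in range(6)]
p_Y = [sum(p(x, y) for x in range(6)) for y in range(6)]
print([round(v, 4) for v in p_X])  # [0.4019, 0.4019, 0.1608, 0.0322, 0.0032, 0.0001]
```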
22. Definition:
Let X and Y denote two discrete random variables
with joint probability function
p(x,y) = P[X = x, Y = y]
Then
pX|Y(x|y) = P[X = x | Y = y] is called the conditional probability function of X given Y = y,
and
pY|X(y|x) = P[Y = y | X = x] is called the conditional probability function of Y given X = x.
23. Note:
pX|Y(x|y) = P[X = x | Y = y] = P[X = x, Y = y] / P[Y = y] = p(x,y) / pY(y)
and
pY|X(y|x) = P[Y = y | X = x] = P[X = x, Y = y] / P[X = x] = p(x,y) / pX(x)
24. • Marginal distributions describe how one variable behaves ignoring the other variable.
• Conditional distributions describe how one variable behaves when the other variable is held fixed.
25. Example:
A die is rolled n = 5 times
X = the number of times a “six” appears.
Y = the number of times a “five” appears.
x\y      0       1       2       3       4       5     pX(x)
 0    0.1317  0.1646  0.0823  0.0206  0.0026  0.0001   0.4019
 1    0.1646  0.1646  0.0617  0.0103  0.0006  0        0.4019
 2    0.0823  0.0617  0.0154  0.0013  0       0        0.1608
 3    0.0206  0.0103  0.0013  0       0       0        0.0322
 4    0.0026  0.0006  0       0       0       0        0.0032
 5    0.0001  0       0       0       0       0        0.0001
pY(y) 0.4019  0.4019  0.1608  0.0322  0.0032  0.0001
26. The conditional distribution of X given Y = y:
pX|Y(x|y) = P[X = x | Y = y]

x\y      0       1       2       3       4       5
 0    0.3277  0.4096  0.5120  0.6400  0.8000  1.0000
 1    0.4096  0.4096  0.3840  0.3200  0.2000  0.0000
 2    0.2048  0.1536  0.0960  0.0400  0.0000  0.0000
 3    0.0512  0.0256  0.0080  0.0000  0.0000  0.0000
 4    0.0064  0.0016  0.0000  0.0000  0.0000  0.0000
 5    0.0003  0.0000  0.0000  0.0000  0.0000  0.0000

Each column is the corresponding column of the joint table divided by pY(y), so each column sums to 1.
27. The conditional distribution of Y given X = x:
pY|X(y|x) = P[Y = y | X = x]

x\y      0       1       2       3       4       5
 0    0.3277  0.4096  0.2048  0.0512  0.0064  0.0003
 1    0.4096  0.4096  0.1536  0.0256  0.0016  0.0000
 2    0.5120  0.3840  0.0960  0.0080  0.0000  0.0000
 3    0.6400  0.3200  0.0400  0.0000  0.0000  0.0000
 4    0.8000  0.2000  0.0000  0.0000  0.0000  0.0000
 5    1.0000  0.0000  0.0000  0.0000  0.0000  0.0000

Each row is the corresponding row of the joint table divided by pX(x), so each row sums to 1.
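The conditional tables follow from the joint table by a single division (a sketch, not part of the original slides):

```python
from math import factorial

def p(x, y, n=5):
    """Joint pmf of X = # sixes and Y = # fives in n = 5 rolls."""
    if x < 0 or y < 0 or x + y > n:
        return 0.0
    k = factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
    return k * (1/6)**x * (1/6)**y * (4/6)**(n - x - y)

p_Y = [sum(p(x, y) for x in range(6)) for y in range(6)]

def p_x_given_y(x, y):
    """Conditional pmf of X given Y = y: p(x,y) / pY(y)."""
    return p(x, y) / p_Y[y]

col0 = [p_x_given_y(x, 0) for x in range(6)]
print(round(col0[0], 4))  # ≈ 0.3277, the top-left entry of the table
```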
28. Example
A Bernoulli trial (P[S] = p, P[F] = q = 1 − p) is repeated until two successes have occurred.
X = trial on which the first success occurs
and
Y = trial on which the 2nd success occurs.
Find the joint probability function of X, Y.
Find the marginal probability functions of X and Y.
Find the conditional probability functions of Y given X = x and X given Y = y.
29. Solution
A typical outcome would be:
F F F … F S F F F … F S
(x − 1 failures, a success on trial x, then y − x − 1 failures, and a success on trial y)
so
p(x,y) = P[X = x, Y = y] = q^(x−1) p q^(y−x−1) p = p^2 q^(y−2) if y > x
i.e.
p(x,y) = p^2 q^(y−2) if y > x ≥ 1,
       = 0 otherwise.
31. The marginal distribution of X
pX(x) = P[X = x] = Σ_y p(x,y)
      = Σ_{y = x+1}^∞ p^2 q^(y−2)
      = p^2 q^(x−1) (1 + q + q^2 + q^3 + ⋯)
      = p^2 q^(x−1) · 1/(1 − q)
      = p q^(x−1)
This is the geometric distribution.
32. The marginal distribution of Y
pY(y) = P[Y = y] = Σ_x p(x,y) = Σ_{x=1}^{y−1} p^2 q^(y−2)
      = (y − 1) p^2 q^(y−2) for y = 2, 3, 4, …, and 0 otherwise.
This is the negative binomial distribution with k = 2.
33. The conditional distribution of X given Y = y
pX|Y(x|y) = P[X = x | Y = y] = P[X = x, Y = y] / P[Y = y] = p(x,y) / pY(y)
= p^2 q^(y−2) / [(y − 1) p^2 q^(y−2)]
= 1/(y − 1) for x = 1, 2, …, y − 1
This is the uniform distribution on the values 1, 2, …, y − 1: given that the second success occurs on trial y, the first success is equally likely to have occurred on any earlier trial.
34. The conditional distribution of Y given X = x
pY|X(y|x) = P[Y = y | X = x] = P[X = x, Y = y] / P[X = x] = p(x,y) / pX(x)
= p^2 q^(y−2) / (p q^(x−1))
= p q^(y−x−1) for y = x + 1, x + 2, x + 3, …
This is the geometric distribution with time starting at x: after the first success the trials start afresh.
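These marginals and conditionals can be checked numerically (a sketch, not part of the original slides; the value p = 0.3 is an illustrative assumption and the infinite sums are truncated):

```python
# Illustrative parameter, any 0 < p < 1 works
p_, q = 0.3, 0.7
Y_MAX = 2000  # truncation point for the infinite sums

def joint(x, y):
    """p(x,y) = p^2 q^(y-2) for y > x >= 1, 0 otherwise."""
    return p_**2 * q**(y - 2) if 1 <= x < y else 0.0

# Marginal of X: should be geometric, p q^(x-1)
for x in range(1, 6):
    mx = sum(joint(x, y) for y in range(2, Y_MAX))
    assert abs(mx - p_ * q**(x - 1)) < 1e-9

# Conditional of X given Y = y: uniform on 1, ..., y-1
y = 7
py = sum(joint(x, y) for x in range(1, y))  # equals (y-1) p^2 q^(y-2)
assert abs(joint(3, y) / py - 1 / (y - 1)) < 1e-12
print("checks passed")
```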
36. The joint probability function
p(x,y) = P[X = x, Y = y]
satisfies:
1. 0 ≤ p(x,y) ≤ 1
2. Σ_{x,y} p(x,y) = 1
3. P[(X, Y) ∈ A] = Σ_{(x,y) ∈ A} p(x,y)
38. Definition: Two random variables X and Y are said to have joint probability density function f(x,y) if
1. f(x,y) ≥ 0
2. ∫∫ f(x,y) dx dy = 1
3. P[(X, Y) ∈ A] = ∫∫_A f(x,y) dx dy
39. If
f(x,y) ≥ 0 and ∫∫ f(x,y) dx dy = 1
then
z = f(x,y)
defines a surface over the x–y plane, and P[(X, Y) ∈ A] = ∫∫_A f(x,y) dx dy is the volume under this surface above the region A.
42. If the region A = {(x,y) | a ≤ x ≤ b, c ≤ y ≤ d} is a rectangular region with sides parallel to the coordinate axes, then
∫∫_A f(x,y) dx dy = ∫_c^d [ ∫_a^b f(x,y) dx ] dy = ∫_a^b [ ∫_c^d f(x,y) dy ] dx
[Figure: the rectangle a ≤ x ≤ b, c ≤ y ≤ d in the x–y plane]
43. To evaluate
∫∫_A f(x,y) dx dy = ∫_c^d ∫_a^b f(x,y) dx dy
first evaluate the inner integral
G(y) = ∫_a^b f(x,y) dx
then evaluate the outer integral
∫_c^d ∫_a^b f(x,y) dx dy = ∫_c^d G(y) dy
44. [Figure: the rectangle a ≤ x ≤ b, c ≤ y ≤ d, with a horizontal line drawn at a fixed value of y]
G(y) = ∫_a^b f(x,y) dx = the area under the surface above the line where y is constant.
G(y) dy = the infinitesimal volume under the surface above the strip of width dy where y is constant, so
∫_c^d ∫_a^b f(x,y) dx dy = ∫_c^d G(y) dy
45. The same quantity can be calculated by integrating first with respect to y, then x:
∫∫_A f(x,y) dx dy = ∫_a^b ∫_c^d f(x,y) dy dx
First evaluate the inner integral
H(x) = ∫_c^d f(x,y) dy
then evaluate the outer integral
∫_a^b ∫_c^d f(x,y) dy dx = ∫_a^b H(x) dx
46. [Figure: the rectangle a ≤ x ≤ b, c ≤ y ≤ d, with a vertical line drawn at a fixed value of x]
H(x) = ∫_c^d f(x,y) dy = the area under the surface above the line where x is constant.
H(x) dx = the infinitesimal volume under the surface above the strip of width dx where x is constant, so
∫_a^b ∫_c^d f(x,y) dy dx = ∫_a^b H(x) dx
47. Example: Compute
∫_0^1 ∫_0^1 (x^2 y + x y^3) dx dy
Now
∫_0^1 ∫_0^1 (x^2 y + x y^3) dx dy = ∫_0^1 [ ∫_0^1 (x^2 y + x y^3) dx ] dy
= ∫_0^1 [ x^3 y/3 + x^2 y^3/2 ]_(x=0)^(x=1) dy
= ∫_0^1 ( y/3 + y^3/2 ) dy
= [ y^2/6 + y^4/8 ]_(y=0)^(y=1)
= 1/6 + 1/8 = 7/24
48. The same quantity can be computed by reversing the order of integration:
∫_0^1 ∫_0^1 (x^2 y + x y^3) dy dx = ∫_0^1 [ ∫_0^1 (x^2 y + x y^3) dy ] dx
= ∫_0^1 [ x^2 y^2/2 + x y^4/4 ]_(y=0)^(y=1) dx
= ∫_0^1 ( x^2/2 + x/4 ) dx
= [ x^3/6 + x^2/8 ]_(x=0)^(x=1)
= 1/6 + 1/8 = 7/24
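A crude midpoint-rule sum over the unit square (a sketch, not part of the original slides) confirms the value 7/24 ≈ 0.2917:

```python
# Midpoint rule on an N x N grid over [0,1] x [0,1]
N = 400

def f(x, y):
    return x**2 * y + x * y**3

h = 1.0 / N
approx = sum(
    f((i + 0.5) * h, (j + 0.5) * h) * h * h
    for i in range(N) for j in range(N)
)
print(round(approx, 4))  # ≈ 0.2917, i.e. 7/24 to 4 decimal places
```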
50. Suppose the region A is defined as follows:
A = {(x,y) | a(y) ≤ x ≤ b(y), c ≤ y ≤ d}
[Figure: region bounded by the curves x = a(y) and x = b(y) between y = c and y = d]
Then
∫∫_A f(x,y) dx dy = ∫_c^d [ ∫_(a(y))^(b(y)) f(x,y) dx ] dy
51. If the region A is defined as follows:
A = {(x,y) | a ≤ x ≤ b, c(x) ≤ y ≤ d(x)}
[Figure: region bounded by the curves y = c(x) and y = d(x) between x = a and x = b]
Then
∫∫_A f(x,y) dx dy = ∫_a^b [ ∫_(c(x))^(d(x)) f(x,y) dy ] dx
52. In general the region A can be partitioned into regions of either type.
[Figure: a region A partitioned into subregions A1, A2, A3, A4]
53. Example:
Compute the volume under f(x,y) = x^2 y + x y^3 over the region A = {(x,y) | x + y ≤ 1, 0 ≤ x, 0 ≤ y}.
[Figure: the triangle with vertices (0,0), (1,0), (0,1), bounded by the line x + y = 1]
54. Integrating first with respect to x, then y: for fixed y the region runs from (0, y) to (1 − y, y), so
∫∫_A (x^2 y + x y^3) dx dy = ∫_0^1 [ ∫_0^(1−y) (x^2 y + x y^3) dx ] dy
55. and
∫_0^1 [ ∫_0^(1−y) (x^2 y + x y^3) dx ] dy
= ∫_0^1 [ x^3 y/3 + x^2 y^3/2 ]_(x=0)^(x=1−y) dy
= ∫_0^1 [ (1−y)^3 y/3 + (1−y)^2 y^3/2 ] dy
= ∫_0^1 [ (y − 3y^2 + 3y^3 − y^4)/3 + (y^3 − 2y^4 + y^5)/2 ] dy
= (1/3)(1/2 − 1 + 3/4 − 1/5) + (1/2)(1/4 − 2/5 + 1/6)
= (1/3)(1/20) + (1/2)(1/60)
= 1/60 + 1/120 = 3/120 = 1/40
56. Now integrating first with respect to y, then x: for fixed x the region runs from (x, 0) to (x, 1 − x), so
∫∫_A (x^2 y + x y^3) dy dx = ∫_0^1 [ ∫_0^(1−x) (x^2 y + x y^3) dy ] dx
57. Hence
∫_0^1 [ ∫_0^(1−x) (x^2 y + x y^3) dy ] dx
= ∫_0^1 [ x^2 y^2/2 + x y^4/4 ]_(y=0)^(y=1−x) dx
= ∫_0^1 [ x^2 (1−x)^2/2 + x (1−x)^4/4 ] dx
= ∫_0^1 [ x/4 − x^2/2 + x^3/2 − x^4/2 + x^5/4 ] dx
= 1/8 − 1/6 + 1/8 − 1/10 + 1/24
= (15 − 20 + 15 − 12 + 5)/120 = 3/120 = 1/40
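A midpoint-rule sum restricted to the triangle x + y ≤ 1 (a sketch, not part of the original slides) confirms the value 1/40 = 0.025 for either order of integration:

```python
# Midpoint rule over the unit square, keeping only cells whose
# center lies in the triangle x + y <= 1
N = 800

def f(x, y):
    return x**2 * y + x * y**3

h = 1.0 / N
approx = sum(
    f((i + 0.5) * h, (j + 0.5) * h) * h * h
    for i in range(N) for j in range(N)
    if (i + 0.5) * h + (j + 0.5) * h <= 1.0
)
print(round(approx, 3))  # ≈ 0.025
```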
59. Definition: Two random variables X and Y are said to have joint probability density function f(x,y) if
1. f(x,y) ≥ 0
2. ∫∫ f(x,y) dx dy = 1
3. P[(X, Y) ∈ A] = ∫∫_A f(x,y) dx dy
60. Definition: Let X and Y denote two random variables with joint probability density function f(x,y). Then
the marginal density of X is
fX(x) = ∫ f(x,y) dy
and the marginal density of Y is
fY(y) = ∫ f(x,y) dx
61. Definition: Let X and Y denote two random variables with joint probability density function f(x,y) and marginal densities fX(x), fY(y). Then
the conditional density of Y given X = x is
fY|X(y|x) = f(x,y) / fX(x)
and the conditional density of X given Y = y is
fX|Y(x|y) = f(x,y) / fY(y)
68. Marginal distributions for the Bivariate Normal distribution
Recall the definition of marginal distributions for continuous random variables:
f1(x1) = ∫ f(x1, x2) dx2 and f2(x2) = ∫ f(x1, x2) dx1
It can be shown that in the case of the bivariate normal distribution the marginal distribution of xi is Normal with mean μi and standard deviation σi.
69. Proof: The marginal distribution of x2 is
f2(x2) = ∫ f(x1, x2) dx1 = ∫ [1 / (2π σ1 σ2 √(1−ρ^2))] e^(−Q(x1,x2)/2) dx1
where
Q(x1,x2) = (1/(1−ρ^2)) [ ((x1−μ1)/σ1)^2 − 2ρ ((x1−μ1)/σ1)((x2−μ2)/σ2) + ((x2−μ2)/σ2)^2 ]
70. Now
Q(x1,x2) is a quadratic in x1, so it can be written in the form
Q(x1,x2) = ((x1 − a)/b)^2 + c = x1^2/b^2 − (2a/b^2) x1 + a^2/b^2 + c
Matching the coefficient of x1^2 in Q(x1,x2) gives
1/b^2 = 1 / (σ1^2 (1 − ρ^2))
and matching the coefficient of x1 gives
a/b^2 = (1/(1 − ρ^2)) [ μ1/σ1^2 + ρ (x2 − μ2)/(σ1 σ2) ]
71. Hence
b^2 = σ1^2 (1 − ρ^2), or b = σ1 √(1 − ρ^2)
Also
a = b^2 (1/(1 − ρ^2)) [ μ1/σ1^2 + ρ (x2 − μ2)/(σ1 σ2) ]
  = σ1^2 [ μ1/σ1^2 + ρ (x2 − μ2)/(σ1 σ2) ]
  = μ1 + ρ (σ1/σ2)(x2 − μ2)
74. Summarizing: completing the square in x1,
Q(x1,x2) = (1/(1−ρ^2)) [ ((x1−μ1)/σ1)^2 − 2ρ ((x1−μ1)/σ1)((x2−μ2)/σ2) + ((x2−μ2)/σ2)^2 ]
         = ((x1 − a)/b)^2 + c
where
b = σ1 √(1 − ρ^2),  a = μ1 + ρ (σ1/σ2)(x2 − μ2),  and  c = ((x2 − μ2)/σ2)^2
75. Thus
f2(x2) = ∫ f(x1, x2) dx1 = ∫ [1/(2π σ1 σ2 √(1−ρ^2))] e^(−Q(x1,x2)/2) dx1
= [1/(2π σ1 σ2 √(1−ρ^2))] e^(−c/2) ∫ e^(−((x1 − a)/b)^2 / 2) dx1
= [1/(2π σ1 σ2 √(1−ρ^2))] e^(−c/2) · b √(2π)    (Gaussian integral)
= [1/(√(2π) σ2)] e^(−((x2 − μ2)/σ2)^2 / 2)
76. Thus the marginal distribution of x2 is Normal with mean μ2 and standard deviation σ2.
Similarly the marginal distribution of x1 is Normal with mean μ1 and standard deviation σ1.
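This marginalization can be checked numerically (a sketch, not part of the original slides; the parameter values below are illustrative assumptions):

```python
from math import exp, pi, sqrt

# Illustrative bivariate normal parameters
mu1, mu2, s1, s2, rho = 1.0, -0.5, 2.0, 1.5, 0.6

def f(x1, x2):
    """Bivariate normal density."""
    z1, z2 = (x1 - mu1) / s1, (x2 - mu2) / s2
    q = (z1**2 - 2 * rho * z1 * z2 + z2**2) / (1 - rho**2)
    return exp(-q / 2) / (2 * pi * s1 * s2 * sqrt(1 - rho**2))

def marginal2(x2, n=4000, lo=-40.0, hi=40.0):
    """Integrate f over x1 with the midpoint rule."""
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h, x2) for i in range(n)) * h

def normal_pdf(x, mu, s):
    return exp(-((x - mu) / s)**2 / 2) / (sqrt(2 * pi) * s)

x2 = 0.7
print(marginal2(x2), normal_pdf(x2, mu2, s2))  # should agree closely
```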
77. Conditional distributions for the Bivariate Normal distribution
Recall the definition of conditional distributions for continuous random variables:
f1|2(x1|x2) = f(x1, x2) / f2(x2) and f2|1(x2|x1) = f(x1, x2) / f1(x1)
It can be shown that in the case of the bivariate normal distribution the conditional distribution of xi given xj is Normal with:
mean μ_(i|j) = μi + ρ (σi/σj)(xj − μj)
and standard deviation σ_(i|j) = σi √(1 − ρ^2)
78. Proof:
f1|2(x1|x2) = f(x1, x2) / f2(x2)
= [1/(2π σ1 σ2 √(1−ρ^2))] e^(−Q(x1,x2)/2) / { [1/(√(2π) σ2)] e^(−((x2−μ2)/σ2)^2 / 2) }
= [1/(√(2π) σ1 √(1−ρ^2))] e^(−[Q(x1,x2) − ((x2−μ2)/σ2)^2]/2)
= [1/(√(2π) b)] e^(−((x1 − a)/b)^2 / 2)
using Q(x1,x2) = ((x1 − a)/b)^2 + c with c = ((x2 − μ2)/σ2)^2.
79. where b = σ1 √(1 − ρ^2), a = μ1 + ρ (σ1/σ2)(x2 − μ2), and c = ((x2 − μ2)/σ2)^2.
Hence
f1|2(x1|x2) = [1/(√(2π) b)] e^(−((x1 − a)/b)^2 / 2)
Thus the conditional distribution of x1 given x2 is Normal with:
mean a = μ_(1|2) = μ1 + ρ (σ1/σ2)(x2 − μ2)
and standard deviation b = σ_(1|2) = σ1 √(1 − ρ^2).
The conditional distribution of x2 given x1 follows by the same argument with the roles of x1 and x2 interchanged.
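The conditional-normal result can be verified pointwise (a sketch, not part of the original slides; the parameter values are illustrative assumptions):

```python
from math import exp, pi, sqrt

# Illustrative bivariate normal parameters
mu1, mu2, s1, s2, rho = 1.0, -0.5, 2.0, 1.5, 0.6

def f(x1, x2):
    """Bivariate normal density."""
    z1, z2 = (x1 - mu1) / s1, (x2 - mu2) / s2
    q = (z1**2 - 2 * rho * z1 * z2 + z2**2) / (1 - rho**2)
    return exp(-q / 2) / (2 * pi * s1 * s2 * sqrt(1 - rho**2))

def normal_pdf(x, mu, s):
    return exp(-((x - mu) / s)**2 / 2) / (sqrt(2 * pi) * s)

x2 = 0.7
a = mu1 + rho * (s1 / s2) * (x2 - mu2)   # conditional mean
b = s1 * sqrt(1 - rho**2)                # conditional standard deviation
for x1 in (-2.0, 0.0, 1.5, 3.0):
    cond = f(x1, x2) / normal_pdf(x2, mu2, s2)
    assert abs(cond - normal_pdf(x1, a, b)) < 1e-12
print("conditional density matches N(a, b)")
```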
83. Example:
Suppose that a rectangle is constructed by first choosing its length, X, and then choosing its width, Y.
Its length X is selected from an exponential distribution with mean 1/λ = 5. Once the length has been chosen its width, Y, is selected from a uniform distribution from 0 to half its length.
Find the probability that its area A = XY is less than 4.
84. Solution:
f(x,y) = fX(x) fY|X(y|x)
where
fX(x) = (1/5) e^(−x/5) for x ≥ 0
and
fY|X(y|x) = 1/(x/2) = 2/x for 0 ≤ y ≤ x/2
Hence
f(x,y) = (1/5) e^(−x/5) · (2/x) = (2/(5x)) e^(−x/5) for 0 ≤ y ≤ x/2, x ≥ 0
85. The curves xy = 4 and y = x/2 intersect where
x (x/2) = 4, i.e. x^2 = 8, so x = √8 = 2√2 and y = x/2 = √2.
The point of intersection is (2√2, √2).
[Figure: the region {xy < 4} below the line y = x/2, split at x = 2√2]
86.
P[XY < 4] = ∫_0^(2√2) ∫_0^(x/2) f(x,y) dy dx + ∫_(2√2)^∞ ∫_0^(4/x) f(x,y) dy dx
(For x ≤ 2√2 the support bound y ≤ x/2 binds; for x > 2√2 the area bound y ≤ 4/x binds.)
87.
P[XY < 4] = ∫_0^(2√2) ∫_0^(x/2) f(x,y) dy dx + ∫_(2√2)^∞ ∫_0^(4/x) f(x,y) dy dx
= ∫_0^(2√2) ∫_0^(x/2) (2/(5x)) e^(−x/5) dy dx + ∫_(2√2)^∞ ∫_0^(4/x) (2/(5x)) e^(−x/5) dy dx
= ∫_0^(2√2) (2/(5x)) e^(−x/5) (x/2) dx + ∫_(2√2)^∞ (2/(5x)) e^(−x/5) (4/x) dx
= ∫_0^(2√2) (1/5) e^(−x/5) dx + ∫_(2√2)^∞ (8/(5x^2)) e^(−x/5) dx
The first part can be evaluated in closed form: ∫_0^(2√2) (1/5) e^(−x/5) dx = 1 − e^(−2√2/5). The second part may require numerical evaluation.
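The numerical evaluation can be sketched with Simpson's rule (not part of the original slides; the truncation point 200 for the infinite upper limit is an assumption that makes the tail negligible):

```python
from math import exp, sqrt

def simpson(g, a, b, n=2000):
    """Composite Simpson's rule on [a, b] (n must be even)."""
    h = (b - a) / n
    s = g(a) + g(b)
    s += 4 * sum(g(a + (2 * i - 1) * h) for i in range(1, n // 2 + 1))
    s += 2 * sum(g(a + 2 * i * h) for i in range(1, n // 2))
    return s * h / 3

c = 2 * sqrt(2)
part1 = 1 - exp(-c / 5)  # closed form for the first integral
part2 = simpson(lambda x: 8 / (5 * x**2) * exp(-x / 5), c, 200.0)
prob = part1 + part2
print(round(prob, 3))
```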
90. Definition
Let X1, X2, …, Xk denote k discrete random variables. Then
p(x1, x2, …, xk)
is the joint probability function of X1, X2, …, Xk if
1. 0 ≤ p(x1, …, xk) ≤ 1
2. Σ_{x1} ⋯ Σ_{xk} p(x1, …, xk) = 1
3. P[(X1, …, Xk) ∈ A] = Σ_{(x1, …, xk) ∈ A} p(x1, …, xk)
91. Definition
Let X1, X2, …, Xk denote k continuous random variables. Then
f(x1, x2, …, xk)
is the joint density function of X1, X2, …, Xk if
1. f(x1, …, xk) ≥ 0
2. ∫ ⋯ ∫ f(x1, …, xk) dx1 ⋯ dxk = 1
3. P[(X1, …, Xk) ∈ A] = ∫ ⋯ ∫_A f(x1, …, xk) dx1 ⋯ dxk
92. Example: The Multinomial distribution
Suppose that we observe an experiment that has k
possible outcomes {O1, O2, …, Ok } independently n
times.
Let p1, p2, …, pk denote probabilities of O1, O2, …,
Ok respectively.
Let Xi denote the number of times that outcome Oi
occurs in the n repetitions of the experiment.
Then the joint probability function of the random
variables X1, X2, …, Xk is
p(x1, x2, …, xk) = [n! / (x1! x2! ⋯ xk!)] p1^(x1) p2^(x2) ⋯ pk^(xk)
93. Note:
p1^(x1) p2^(x2) ⋯ pk^(xk)
is the probability of a sequence of length n containing
x1 outcomes O1
x2 outcomes O2
…
xk outcomes Ok
94. The coefficient
n! / (x1! x2! ⋯ xk!)
is the number of ways of choosing the positions for the x1 outcomes O1, x2 outcomes O2, …, xk outcomes Ok:
C(n, x1) C(n−x1, x2) C(n−x1−x2, x3) ⋯ C(n−x1−⋯−x_(k−1), xk)
= [n!/(x1!(n−x1)!)] · [(n−x1)!/(x2!(n−x1−x2)!)] · [(n−x1−x2)!/(x3!(n−x1−x2−x3)!)] ⋯
= n! / (x1! x2! ⋯ xk!)
95. The distribution
p(x1, x2, …, xk) = [n! / (x1! x2! ⋯ xk!)] p1^(x1) p2^(x2) ⋯ pk^(xk)
               = (n choose x1, x2, …, xk) p1^(x1) p2^(x2) ⋯ pk^(xk)
is called the Multinomial distribution.
96. Example:
Suppose that a treatment for back pain has three possible
outcomes:
O1 - Complete cure (no pain) – (30% chance)
O2 - Reduced pain – (50% chance)
O3 - No change – (20% chance)
Hence p1 = 0.30, p2 = 0.50, p3 = 0.20.
Suppose the treatment is applied to n = 4 patients suffering
back pain and let X = the number that result in a complete cure,
Y = the number that result in just reduced pain, and Z = the
number that result in no change.
Find the distribution of X, Y and Z. Compute P[X + Y ≥ Z].
p(x, y, z) = [4! / (x! y! z!)] (0.30)^x (0.50)^y (0.20)^z for x + y + z = 4
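Since n = 4 is small, the multinomial pmf can be enumerated directly to compute P[X + Y ≥ Z] (a sketch, not part of the original slides):

```python
from math import factorial

n = 4
px, py, pz = 0.30, 0.50, 0.20

def p(x, y, z):
    """Multinomial pmf for n = 4 patients."""
    if x + y + z != n:
        return 0.0
    coef = factorial(n) // (factorial(x) * factorial(y) * factorial(z))
    return coef * px**x * py**y * pz**z

triples = [(x, y, z) for x in range(n + 1) for y in range(n + 1)
           for z in range(n + 1) if x + y + z == n]
assert abs(sum(p(*t) for t in triples) - 1.0) < 1e-12

prob = sum(p(x, y, z) for (x, y, z) in triples if x + y >= z)
print(round(prob, 4))  # ≈ 0.9728
```

Note that X + Y ≥ Z is equivalent to Z ≤ 2 here (since X + Y = 4 − Z), so prob also equals 1 − P[Z = 3] − P[Z = 4].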
99. Example: The Multivariate Normal distribution
Recall the univariate normal distribution
f(x) = [1/(√(2π) σ)] e^(−(1/2)((x−μ)/σ)^2)
and the bivariate normal distribution
f(x,y) = [1/(2π σx σy √(1−ρ^2))] exp{ −(1/(2(1−ρ^2))) [ ((x−μx)/σx)^2 − 2ρ ((x−μx)/σx)((y−μy)/σy) + ((y−μy)/σy)^2 ] }
100. The k-variate Normal distribution has density
f(x1, …, xk) = f(x) = [1 / ((2π)^(k/2) |Σ|^(1/2))] e^(−(1/2)(x − μ)′ Σ^(−1) (x − μ))
where
x = (x1, x2, …, xk)′ is the vector of variables,
μ = (μ1, μ2, …, μk)′ is the vector of means, and
Σ is the k × k covariance matrix:
Σ = [ σ11 σ12 … σ1k ]
    [ σ12 σ22 … σ2k ]
    [  ⋮    ⋮        ⋮  ]
    [ σ1k σ2k … σkk ]
102. Definition
Let X1, X2, …, Xq, X_(q+1), …, Xk denote k discrete random variables with joint probability function
p(x1, x2, …, xq, x_(q+1), …, xk)
Then the marginal joint probability function of X1, X2, …, Xq is
p_(12⋯q)(x1, …, xq) = Σ_(x_(q+1)) ⋯ Σ_(xk) p(x1, …, xk)
103. Definition
Let X1, X2, …, Xq, X_(q+1), …, Xk denote k continuous random variables with joint probability density function
f(x1, x2, …, xq, x_(q+1), …, xk)
Then the marginal joint density function of X1, X2, …, Xq is
f_(12⋯q)(x1, …, xq) = ∫ ⋯ ∫ f(x1, …, xk) dx_(q+1) ⋯ dxk
105. Definition
Let X1, X2, …, Xq, X_(q+1), …, Xk denote k discrete random variables with joint probability function
p(x1, x2, …, xq, x_(q+1), …, xk)
Then the conditional joint probability function of X1, X2, …, Xq given X_(q+1) = x_(q+1), …, Xk = xk is
p_(1⋯q | q+1⋯k)(x1, …, xq | x_(q+1), …, xk) = p(x1, …, xk) / p_(q+1⋯k)(x_(q+1), …, xk)
106. Definition
Let X1, X2, …, Xq, X_(q+1), …, Xk denote k continuous random variables with joint probability density function
f(x1, x2, …, xq, x_(q+1), …, xk)
Then the conditional joint density function of X1, X2, …, Xq given X_(q+1) = x_(q+1), …, Xk = xk is
f_(1⋯q | q+1⋯k)(x1, …, xq | x_(q+1), …, xk) = f(x1, …, xk) / f_(q+1⋯k)(x_(q+1), …, xk)
107. Definition – Independence of sets of vectors
Let X1, X2, …, Xq, X_(q+1), …, Xk denote k continuous random variables with joint probability density function
f(x1, x2, …, xq, x_(q+1), …, xk)
Then the variables X1, X2, …, Xq are independent of X_(q+1), …, Xk if
f(x1, …, xk) = f_(1⋯q)(x1, …, xq) f_(q+1⋯k)(x_(q+1), …, xk)
A similar definition applies for discrete random variables.
108. Definition – Mutual Independence
Let X1, X2, …, Xk denote k continuous random variables with joint probability density function
f(x1, x2, …, xk)
Then the variables X1, X2, …, Xk are called mutually independent if
f(x1, …, xk) = f1(x1) f2(x2) ⋯ fk(xk)
A similar definition applies for discrete random variables.
109. Example
Let X, Y, Z denote 3 jointly distributed random variables with joint density function
f(x,y,z) = K(x^2 + yz) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, 0 ≤ z ≤ 1
         = 0 otherwise
Find the value of K.
Determine the marginal distributions of X, Y and Z.
Determine the joint marginal distributions of
X, Y
X, Z
Y, Z
110. Solution
Determining the value of K:
1 = ∫∫∫ f(x,y,z) dx dy dz = ∫_0^1 ∫_0^1 ∫_0^1 K(x^2 + yz) dx dy dz
= K ∫_0^1 ∫_0^1 [ x^3/3 + xyz ]_(x=0)^(x=1) dy dz = K ∫_0^1 ∫_0^1 (1/3 + yz) dy dz
= K ∫_0^1 [ y/3 + y^2 z/2 ]_(y=0)^(y=1) dz = K ∫_0^1 (1/3 + z/2) dz
= K [ z/3 + z^2/4 ]_(z=0)^(z=1) = K (1/3 + 1/4) = (7/12) K
Hence K = 12/7.
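Because x^2 + yz integrates over the unit cube as ∫x^2 dx + (∫y dy)(∫z dz), the value K = 12/7 can be verified exactly with rational arithmetic (a sketch, not part of the original slides):

```python
from fractions import Fraction

# Triple integral of x^2 + yz over [0,1]^3: 1/3 + (1/2)(1/2) = 7/12
integral = Fraction(1, 3) + Fraction(1, 2) * Fraction(1, 2)
K = 1 / integral
print(K)  # 12/7
```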
111. The marginal distribution of X:
f1(x) = ∫_0^1 ∫_0^1 f(x,y,z) dy dz = ∫_0^1 ∫_0^1 (12/7)(x^2 + yz) dy dz
= (12/7) ∫_0^1 [ x^2 y + y^2 z/2 ]_(y=0)^(y=1) dz = (12/7) ∫_0^1 (x^2 + z/2) dz
= (12/7) [ x^2 z + z^2/4 ]_(z=0)^(z=1) = (12/7)(x^2 + 1/4) for 0 ≤ x ≤ 1
112. The marginal distribution of X, Y:
f12(x,y) = ∫_0^1 f(x,y,z) dz = ∫_0^1 (12/7)(x^2 + yz) dz
= (12/7) [ x^2 z + y z^2/2 ]_(z=0)^(z=1)
= (12/7)(x^2 + y/2) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1
113. Find the conditional distribution of:
1. Z given X = x, Y = y
2. Y given X = x, Z = z
3. X given Y = y, Z = z
4. Y, Z given X = x
5. X, Z given Y = y
6. X, Y given Z = z
7. Y given X = x
8. X given Y = y
9. X given Z = z
10. Z given X = x
11. Z given Y = y
12. Y given Z = z
114. The marginal distribution of X, Y is
f12(x,y) = (12/7)(x^2 + y/2) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1
Thus the conditional distribution of Z given X = x, Y = y is
f(x,y,z) / f12(x,y) = [(12/7)(x^2 + yz)] / [(12/7)(x^2 + y/2)]
= (x^2 + yz) / (x^2 + y/2) for 0 ≤ z ≤ 1
115. The marginal distribution of X is
f1(x) = (12/7)(x^2 + 1/4) for 0 ≤ x ≤ 1
Thus the conditional distribution of Y, Z given X = x is
f(x,y,z) / f1(x) = [(12/7)(x^2 + yz)] / [(12/7)(x^2 + 1/4)]
= (x^2 + yz) / (x^2 + 1/4) for 0 ≤ y ≤ 1, 0 ≤ z ≤ 1
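As a final sanity check (a sketch, not part of the original slides; the values x = 0.4, y = 0.8 are illustrative), both conditional densities should integrate to 1 over their support:

```python
def f_z_given_xy(z, x, y):
    # conditional density of Z given X = x, Y = y (slide 114)
    return (x**2 + y * z) / (x**2 + y / 2)

def f_yz_given_x(y, z, x):
    # conditional density of (Y, Z) given X = x (slide 115)
    return (x**2 + y * z) / (x**2 + 0.25)

N = 500
h = 1.0 / N
x, y = 0.4, 0.8
total_z = sum(f_z_given_xy((k + 0.5) * h, x, y) for k in range(N)) * h
total_yz = sum(
    f_yz_given_x((i + 0.5) * h, (j + 0.5) * h, x) * h * h
    for i in range(N) for j in range(N)
)
print(round(total_z, 6), round(total_yz, 6))  # both ≈ 1.0
```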