10. Joint Moments and Joint Characteristic Functions

Following section 6, in this section we shall introduce various parameters to compactly represent the information contained in the joint p.d.f of two r.vs. Given two r.vs X and Y and a function g(x, y), define the r.v

$$Z = g(X, Y).$$  (10-1)

Using (6-2), we can define the mean of Z to be

$$\mu_Z = E(Z) = \int_{-\infty}^{+\infty} z\, f_Z(z)\, dz.$$  (10-2)
However, the situation here is similar to that in (6-13), and it is possible to express the mean of Z = g(X, Y) in terms of f_XY(x, y) without computing f_Z(z). To see this, recall from (5-26) and (7-10) that

$$P(z < Z \le z + \Delta z) = f_Z(z)\,\Delta z = P(z < g(X, Y) \le z + \Delta z) = \sum_{(x, y) \in D_{\Delta z}} f_{XY}(x, y)\,\Delta x\,\Delta y,$$  (10-3)

where D_{Δz} is the region in the xy plane satisfying the above inequality. From (10-3), we get

$$z\, f_Z(z)\,\Delta z = \sum_{(x, y) \in D_{\Delta z}} g(x, y)\, f_{XY}(x, y)\,\Delta x\,\Delta y.$$  (10-4)

As Δz covers the entire z axis, the corresponding regions D_{Δz} are nonoverlapping, and they cover the entire xy plane.
By integrating (10-4), we obtain the useful formula

$$E(Z) = \int_{-\infty}^{+\infty} z\, f_Z(z)\, dz = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} g(x, y)\, f_{XY}(x, y)\, dx\, dy,$$  (10-5)

or

$$E[g(X, Y)] = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} g(x, y)\, f_{XY}(x, y)\, dx\, dy.$$  (10-6)

If X and Y are discrete-type r.vs, then

$$E[g(X, Y)] = \sum_i \sum_j g(x_i, y_j)\, P(X = x_i,\, Y = y_j).$$  (10-7)

Since expectation is a linear operator, we also get

$$E\left[\sum_k a_k\, g_k(X, Y)\right] = \sum_k a_k\, E[g_k(X, Y)].$$  (10-8)
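As a quick numerical illustration of (10-6), the following sketch (the distributions and the function g are chosen arbitrarily for the demo) estimates E[g(X, Y)] by averaging g over joint samples and compares it with the closed-form value.

```python
# Minimal Monte Carlo check of (10-6): for independent X, Y ~ U(0,1) and
# g(x, y) = x + y**2, the exact value is E[X] + E[Y^2] = 1/2 + 1/3.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(0.0, 1.0, n)
y = rng.uniform(0.0, 1.0, n)

estimate = np.mean(x + y**2)      # sample average of g(X, Y)
exact = 0.5 + 1.0 / 3.0           # closed form from the marginals

print(f"Monte Carlo E[g(X,Y)] = {estimate:.4f}, exact = {exact:.4f}")
```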
If X and Y are independent r.vs, it is easy to see that Z = g(X) and W = h(Y) are always independent of each other. In that case, using (10-6), we get the interesting result

$$E[g(X)h(Y)] = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} g(x)h(y)\, f_X(x) f_Y(y)\, dx\, dy$$
$$= \int_{-\infty}^{+\infty} g(x) f_X(x)\, dx \int_{-\infty}^{+\infty} h(y) f_Y(y)\, dy = E[g(X)]\, E[h(Y)].$$  (10-9)

However, (10-9) is in general not true (if X and Y are not independent).

In the case of one random variable (see section 6), we defined the parameters mean and variance to represent its average behavior. How does one parametrically represent similar cross-behavior between two random variables? Towards this, we can generalize the variance definition given in (6-16) as shown below.
Covariance: Given any two r.vs X and Y, define

$$\operatorname{Cov}(X, Y) = E\left[(X - \mu_X)(Y - \mu_Y)\right].$$  (10-10)

By expanding and simplifying the right side of (10-10), we also get

$$\operatorname{Cov}(X, Y) = E(XY) - \mu_X \mu_Y = E(XY) - E(X)E(Y) = \overline{XY} - \overline{X}\,\overline{Y}.$$  (10-11)

It is easy to see that

$$|\operatorname{Cov}(X, Y)| \le \sqrt{\operatorname{Var}(X)\operatorname{Var}(Y)}.$$  (10-12)

To see (10-12), let U = aX + Y, so that

$$\operatorname{Var}(U) = E\left[\left\{a(X - \mu_X) + (Y - \mu_Y)\right\}^2\right] = a^2 \operatorname{Var}(X) + 2a\operatorname{Cov}(X, Y) + \operatorname{Var}(Y) \ge 0.$$  (10-13)
The right side of (10-13) represents a quadratic in the variable a that has no distinct real roots (Fig. 10.1). Thus the roots are imaginary (or double), and hence the discriminant

$$[\operatorname{Cov}(X, Y)]^2 - \operatorname{Var}(X)\operatorname{Var}(Y)$$

must be non-positive, and that gives (10-12). Using (10-12), we may define the normalized parameter

$$\rho_{XY} = \frac{\operatorname{Cov}(X, Y)}{\sqrt{\operatorname{Var}(X)\operatorname{Var}(Y)}} = \frac{\operatorname{Cov}(X, Y)}{\sigma_X \sigma_Y}, \qquad -1 \le \rho_{XY} \le 1,$$  (10-14)

or

$$\operatorname{Cov}(X, Y) = \rho_{XY}\, \sigma_X \sigma_Y,$$  (10-15)

and it represents the correlation coefficient between X and Y.

[Fig. 10.1: Var(U) plotted as a quadratic in a, lying on or above the a axis.]
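The following short sketch (with an arbitrarily constructed pair of correlated samples) illustrates (10-10)-(10-15) numerically: the sample covariance, the sample correlation coefficient, and the identity Cov(X, Y) = ρ_XY σ_X σ_Y.

```python
# Numerical illustration of covariance and correlation coefficient.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
x = rng.normal(size=n)
y = 0.6 * x + 0.8 * rng.normal(size=n)   # Corr(X, Y) = 0.6 by construction

cov_xy = np.cov(x, y)[0, 1]              # sample covariance, cf. (10-10)
rho = np.corrcoef(x, y)[0, 1]            # sample correlation coefficient, cf. (10-14)
print(cov_xy, rho * x.std() * y.std())   # the two agree, as in (10-15)
```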
Uncorrelated r.vs: If ρ_XY = 0, then X and Y are said to be uncorrelated r.vs. From (10-11), if X and Y are uncorrelated, then

$$E(XY) = E(X)E(Y).$$  (10-16)

Orthogonality: X and Y are said to be orthogonal if

$$E(XY) = 0.$$  (10-17)

From (10-16)-(10-17), if either X or Y has zero mean, then orthogonality implies uncorrelatedness and vice-versa.

Suppose X and Y are independent r.vs. Then from (10-9) with g(X) = X, h(Y) = Y, we get

$$E(XY) = E(X)E(Y),$$

and together with (10-16), we conclude that the random variables are uncorrelated, thus justifying the original definition in (10-10). Thus independence implies uncorrelatedness.
Naturally, if two random variables are statistically independent, then there cannot be any correlation between them (ρ_XY = 0). However, the converse is in general not true. As the next example shows, random variables can be uncorrelated without being independent.

Example 10.1: Let X ∼ U(0, 1), Y ∼ U(0, 1). Suppose X and Y are independent. Define Z = X + Y, W = X − Y. Show that Z and W are dependent, but uncorrelated r.vs.

Solution: z = x + y, w = x − y gives the only solution set to be

$$x = \frac{z + w}{2}, \qquad y = \frac{z - w}{2}.$$

Moreover 0 < z < 2, −1 < w < 1, z + w ≤ 2, z − w ≤ 2, z > |w|, and |J(z, w)| = 1/2.
Thus (see the shaded region in Fig. 10.2)

$$f_{ZW}(z, w) = \begin{cases} 1/2, & 0 < z < 2,\ -1 < w < 1,\ z + w \le 2,\ z - w \le 2,\ |w| < z, \\ 0, & \text{otherwise}, \end{cases}$$  (10-18)

[Fig. 10.2: the shaded support region of f_ZW(z, w) in the (z, w) plane, with vertices (0, 0), (1, 1), (2, 0), (1, −1).]

and hence

$$f_Z(z) = \int_{-1}^{1} f_{ZW}(z, w)\, dw = \begin{cases} \int_{-z}^{z} \tfrac{1}{2}\, dw = z, & 0 < z < 1, \\ \int_{z-2}^{2-z} \tfrac{1}{2}\, dw = 2 - z, & 1 < z < 2, \end{cases}$$

or by direct computation (Z = X + Y)
$$f_Z(z) = f_X(z) \otimes f_Y(z) = \begin{cases} z, & 0 < z < 1, \\ 2 - z, & 1 < z < 2, \\ 0, & \text{otherwise}, \end{cases}$$  (10-19)

and

$$f_W(w) = \int f_{ZW}(z, w)\, dz = \int_{|w|}^{2 - |w|} \tfrac{1}{2}\, dz = \begin{cases} 1 - |w|, & -1 < w < 1, \\ 0, & \text{otherwise}. \end{cases}$$  (10-20)

Clearly f_ZW(z, w) ≠ f_Z(z) f_W(w). Thus Z and W are not independent. However

$$E(ZW) = E[(X + Y)(X - Y)] = E(X^2) - E(Y^2) = 0,$$  (10-21)

and

$$E(W) = E(X - Y) = 0,$$

and hence

$$\operatorname{Cov}(Z, W) = E(ZW) - E(Z)E(W) = 0,$$  (10-22)

implying that Z and W are uncorrelated random variables.
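A small simulation of Example 10.1 (a sketch, not part of the derivation) makes the same point numerically: the sample covariance of Z and W is essentially zero, while a statistic such as Cov(Z², W²) is clearly nonzero, exposing the dependence.

```python
# Z = X + Y and W = X - Y are uncorrelated yet dependent.
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
x = rng.uniform(size=n)
y = rng.uniform(size=n)
z, w = x + y, x - y

print("Cov(Z, W)    =", np.cov(z, w)[0, 1])        # ~ 0, as in (10-22)
print("Cov(Z^2, W^2)=", np.cov(z**2, w**2)[0, 1])  # clearly nonzero, so Z and W are dependent
```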
Example 10.2: Let Z = aX + bY. Determine the variance of Z in terms of σ_X, σ_Y and ρ_XY.

Solution:

$$\mu_Z = E(Z) = E(aX + bY) = a\mu_X + b\mu_Y,$$

and using (10-15),

$$\sigma_Z^2 = \operatorname{Var}(Z) = E\left[(Z - \mu_Z)^2\right] = E\left[\left(a(X - \mu_X) + b(Y - \mu_Y)\right)^2\right]$$
$$= a^2 E\left[(X - \mu_X)^2\right] + 2ab\, E\left[(X - \mu_X)(Y - \mu_Y)\right] + b^2 E\left[(Y - \mu_Y)^2\right]$$
$$= a^2 \sigma_X^2 + 2ab\, \rho_{XY}\, \sigma_X \sigma_Y + b^2 \sigma_Y^2.$$  (10-23)

In particular, if X and Y are independent, then ρ_XY = 0, and (10-23) reduces to

$$\sigma_Z^2 = a^2 \sigma_X^2 + b^2 \sigma_Y^2.$$  (10-24)

Thus the variance of the sum of independent r.vs is the sum of their variances (a = b = 1).
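A brief numerical check of (10-23) follows (a sketch; the joint distribution and the coefficients a, b are assumed only for illustration).

```python
# Compare the sample variance of Z = aX + bY with (10-23) built from sample moments.
import numpy as np

rng = np.random.default_rng(3)
n = 500_000
x = rng.normal(1.0, 2.0, n)
y = 0.5 * x + rng.normal(0.0, 1.0, n)    # correlated with x
a, b = 2.0, -3.0
z = a * x + b * y

sx, sy = x.std(), y.std()
rho = np.corrcoef(x, y)[0, 1]
formula = a**2 * sx**2 + 2 * a * b * rho * sx * sy + b**2 * sy**2
print(z.var(), formula)                  # agree up to sampling error
```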
Moments:

$$E[X^k Y^m] = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} x^k y^m\, f_{XY}(x, y)\, dx\, dy$$  (10-25)

represents the joint moment of order (k, m) for X and Y.

Following the one random variable case, we can define the joint characteristic function between two random variables, which will turn out to be useful for moment calculations.

Joint characteristic functions: The joint characteristic function between X and Y is defined as

$$\Phi_{XY}(u, v) = E\left(e^{j(Xu + Yv)}\right) = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} e^{j(xu + yv)}\, f_{XY}(x, y)\, dx\, dy.$$  (10-26)

Note that

$$|\Phi_{XY}(u, v)| \le \Phi_{XY}(0, 0) = 1.$$
It is easy to show that

$$E(XY) = \frac{1}{j^2}\left.\frac{\partial^2 \Phi_{XY}(u, v)}{\partial u\, \partial v}\right|_{u=0,\, v=0}.$$  (10-27)

If X and Y are independent r.vs, then from (10-26), we obtain

$$\Phi_{XY}(u, v) = E\left(e^{juX}\right) E\left(e^{jvY}\right) = \Phi_X(u)\, \Phi_Y(v).$$  (10-28)

Also

$$\Phi_X(u) = \Phi_{XY}(u, 0), \qquad \Phi_Y(v) = \Phi_{XY}(0, v).$$  (10-29)

More on Gaussian r.vs: From Lecture 7, X and Y are said to be jointly Gaussian as N(μ_X, μ_Y, σ_X², σ_Y², ρ) if their joint p.d.f has the form in (7-23). In that case, by direct substitution and simplification, we obtain the joint characteristic function of two jointly Gaussian r.vs to be
$$\Phi_{XY}(u, v) = E\left(e^{j(Xu + Yv)}\right) = e^{\,j(\mu_X u + \mu_Y v) - \frac{1}{2}\left(\sigma_X^2 u^2 + 2\rho\sigma_X\sigma_Y uv + \sigma_Y^2 v^2\right)}.$$  (10-30)

Equation (10-30) can be used to make various conclusions. Letting v = 0 in (10-30), we get

$$\Phi_X(u) = \Phi_{XY}(u, 0) = e^{\,j\mu_X u - \frac{1}{2}\sigma_X^2 u^2},$$  (10-31)

and it agrees with (6-47).

From (7-23), by direct computation using (10-11), it is easy to show that for two jointly Gaussian random variables

$$\operatorname{Cov}(X, Y) = \rho\, \sigma_X \sigma_Y.$$

Hence from (10-14), ρ in N(μ_X, μ_Y, σ_X², σ_Y², ρ) represents the actual correlation coefficient of the two jointly Gaussian r.vs in (7-23). Notice that ρ = 0 implies
$$f_{XY}(x, y) = f_X(x)\, f_Y(y).$$

Thus if X and Y are jointly Gaussian, uncorrelatedness does imply independence between the two random variables. The Gaussian case is the only exception where the two concepts imply each other.
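As a sanity check, one can apply (10-27) to the jointly Gaussian characteristic function (10-30) and recover E(XY) = μ_X μ_Y + ρ σ_X σ_Y, which together with (10-11) gives Cov(X, Y) = ρ σ_X σ_Y. Below is a symbolic sketch using sympy; the symbol names are chosen here only for convenience.

```python
# Symbolic check that (10-27) applied to (10-30) gives E(XY) = mu_X*mu_Y + rho*sx*sy.
import sympy as sp

u, v = sp.symbols('u v', real=True)
mx, my, sx, sy = sp.symbols('mu_X mu_Y sigma_X sigma_Y', positive=True)
rho = sp.symbols('rho', real=True)

Phi = sp.exp(sp.I*(mx*u + my*v)
             - sp.Rational(1, 2)*(sx**2*u**2 + 2*rho*sx*sy*u*v + sy**2*v**2))

EXY = (sp.diff(Phi, u, v) / sp.I**2).subs({u: 0, v: 0})
print(sp.simplify(EXY))     # -> mu_X*mu_Y + rho*sigma_X*sigma_Y
```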
Example 10.3: Let X and Y be jointly Gaussian r.vs with parameters N(μ_X, μ_Y, σ_X², σ_Y², ρ). Define Z = aX + bY. Determine f_Z(z).

Solution: In this case we can make use of the characteristic function to solve this problem:

$$\Phi_Z(u) = E\left(e^{jZu}\right) = E\left(e^{j(aX + bY)u}\right) = E\left(e^{jauX + jbuY}\right) = \Phi_{XY}(au, bu).$$  (10-32)
From (10-30) with u and v replaced by au and bu respectively, we get

$$\Phi_Z(u) = e^{\,j(a\mu_X + b\mu_Y)u - \frac{1}{2}\left(a^2\sigma_X^2 + 2\rho ab\,\sigma_X\sigma_Y + b^2\sigma_Y^2\right)u^2} = e^{\,j\mu_Z u - \frac{1}{2}\sigma_Z^2 u^2},$$  (10-33)

where

$$\mu_Z \triangleq a\mu_X + b\mu_Y,$$  (10-34)
$$\sigma_Z^2 \triangleq a^2\sigma_X^2 + 2\rho ab\,\sigma_X\sigma_Y + b^2\sigma_Y^2.$$  (10-35)

Notice that (10-33) has the same form as (10-31), and hence we conclude that Z = aX + bY is also Gaussian with mean and variance as in (10-34)-(10-35), which also agrees with (10-23).

From the previous example, we conclude that any linear combination of jointly Gaussian r.vs generates a Gaussian r.v.
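The following Monte Carlo sketch (parameter values assumed only for the demo) checks Example 10.3: the sample mean and variance of Z = aX + bY match (10-34)-(10-35), and the empirical characteristic function matches (10-33) at a test frequency.

```python
# Monte Carlo check of (10-33)-(10-35) for Z = aX + bY with jointly Gaussian (X, Y).
import numpy as np

rng = np.random.default_rng(4)
mx, my, sx, sy, rho = 1.0, -2.0, 1.5, 0.5, 0.3
a, b = 2.0, 1.0

cov = [[sx**2, rho*sx*sy], [rho*sx*sy, sy**2]]
xy = rng.multivariate_normal([mx, my], cov, size=400_000)
z = a * xy[:, 0] + b * xy[:, 1]

mu_z = a*mx + b*my                                    # (10-34)
var_z = a**2*sx**2 + 2*rho*a*b*sx*sy + b**2*sy**2     # (10-35)
print(z.mean(), mu_z, z.var(), var_z)

u = 0.7                                               # an arbitrary test frequency
phi_emp = np.mean(np.exp(1j * u * z))                 # empirical characteristic function
phi_th = np.exp(1j*mu_z*u - 0.5*var_z*u**2)           # (10-33)
print(abs(phi_emp - phi_th))                          # small
```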
In other words, linearity preserves Gaussianity. We can use the characteristic function relation to conclude an even more general result.

Example 10.4: Suppose X and Y are jointly Gaussian r.vs as in the previous example. Define two linear combinations

$$Z = aX + bY, \qquad W = cX + dY.$$  (10-36)

What can we say about their joint distribution?

Solution: The characteristic function of Z and W is given by

$$\Phi_{ZW}(u, v) = E\left(e^{j(Zu + Wv)}\right) = E\left(e^{j(aX + bY)u + j(cX + dY)v}\right) = E\left(e^{jX(au + cv) + jY(bu + dv)}\right) = \Phi_{XY}(au + cv,\ bu + dv).$$  (10-37)

As before, substituting (10-30) into (10-37) with u and v replaced by au + cv and bu + dv respectively, we get
$$\Phi_{ZW}(u, v) = e^{\,j(\mu_Z u + \mu_W v) - \frac{1}{2}\left(\sigma_Z^2 u^2 + 2\rho_{ZW}\sigma_Z\sigma_W uv + \sigma_W^2 v^2\right)},$$  (10-38)

where

$$\mu_Z = a\mu_X + b\mu_Y,$$  (10-39)
$$\mu_W = c\mu_X + d\mu_Y,$$  (10-40)
$$\sigma_Z^2 = a^2\sigma_X^2 + 2ab\,\rho\,\sigma_X\sigma_Y + b^2\sigma_Y^2,$$  (10-41)
$$\sigma_W^2 = c^2\sigma_X^2 + 2cd\,\rho\,\sigma_X\sigma_Y + d^2\sigma_Y^2,$$  (10-42)

and

$$\rho_{ZW} = \frac{ac\,\sigma_X^2 + (ad + bc)\,\rho\,\sigma_X\sigma_Y + bd\,\sigma_Y^2}{\sigma_Z \sigma_W}.$$  (10-43)

From (10-38), we conclude that Z and W are also jointly distributed Gaussian r.vs with means, variances and correlation coefficient as in (10-39)-(10-43).
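A short numerical check of (10-43) follows (a sketch; all coefficients below are arbitrary): the sample correlation coefficient of Z and W agrees with the closed-form ρ_ZW.

```python
# Compare the sample correlation of Z = aX + bY and W = cX + dY with (10-43).
import numpy as np

rng = np.random.default_rng(5)
sx, sy, rho = 1.2, 0.8, -0.4
a, b, c, d = 1.0, 2.0, 3.0, -1.0

cov = [[sx**2, rho*sx*sy], [rho*sx*sy, sy**2]]
xy = rng.multivariate_normal([0.0, 0.0], cov, size=400_000)
z = a*xy[:, 0] + b*xy[:, 1]
w = c*xy[:, 0] + d*xy[:, 1]

sz = np.sqrt(a**2*sx**2 + 2*a*b*rho*sx*sy + b**2*sy**2)               # (10-41)
sw = np.sqrt(c**2*sx**2 + 2*c*d*rho*sx*sy + d**2*sy**2)               # (10-42)
rho_zw = (a*c*sx**2 + (a*d + b*c)*rho*sx*sy + b*d*sy**2) / (sz * sw)  # (10-43)
print(np.corrcoef(z, w)[0, 1], rho_zw)                                # agree up to sampling error
```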
To summarize, any two linear combinations of jointly Gaussian random variables (independent or dependent) are also jointly Gaussian r.vs.

[Fig. 10.3: Gaussian input → Linear operator → Gaussian output.]

Of course, we could have reached the same conclusion by deriving the joint p.d.f f_ZW(z, w) using the technique developed in section 9 (refer (7-29)).

Gaussian random variables are also interesting because of the following result:
Central Limit Theorem: Suppose X_1, X_2, ..., X_n are a set of zero-mean independent, identically distributed (i.i.d.) random variables with some common distribution. Consider their scaled sum

$$Y = \frac{X_1 + X_2 + \cdots + X_n}{\sqrt{n}}.$$  (10-44)

Then asymptotically (as n → ∞)

$$Y \to N(0, \sigma^2).$$  (10-45)

Proof: Although the theorem is true under even more general conditions, we shall prove it here under the independence assumption. Let σ² represent their common variance. Since

$$E(X_i) = 0,$$  (10-46)

we have

$$\operatorname{Var}(X_i) = E(X_i^2) = \sigma^2.$$  (10-47)
Consider

$$\Phi_Y(u) = E\left(e^{jYu}\right) = E\left(e^{j(X_1 + X_2 + \cdots + X_n)u/\sqrt{n}}\right) = \prod_{i=1}^{n} E\left(e^{jX_i u/\sqrt{n}}\right) = \prod_{i=1}^{n} \Phi_{X_i}\!\left(u/\sqrt{n}\right),$$  (10-48)

where we have made use of the independence of the r.vs X_1, X_2, ..., X_n. But

$$E\left(e^{jX_i u/\sqrt{n}}\right) = E\left(1 + \frac{jX_i u}{\sqrt{n}} + \frac{j^2 X_i^2 u^2}{2!\, n} + \frac{j^3 X_i^3 u^3}{3!\, n^{3/2}} + \cdots\right) = 1 - \frac{\sigma^2 u^2}{2n} + o\!\left(\frac{1}{n^{3/2}}\right),$$  (10-49)

where we have made use of (10-46)-(10-47). Substituting (10-49) into (10-48), we obtain

$$\Phi_Y(u) = \left(1 - \frac{\sigma^2 u^2}{2n} + o\!\left(\frac{1}{n^{3/2}}\right)\right)^n,$$  (10-50)

and as n → ∞,

$$\lim_{n \to \infty} \Phi_Y(u) \to e^{-\sigma^2 u^2 / 2},$$  (10-51)
since

$$\lim_{n \to \infty}\left(1 - \frac{x}{n}\right)^n \to e^{-x}.$$  (10-52)

[Note that the o(1/n^{3/2}) terms in (10-50) decay faster than 1/n^{3/2}.] But (10-51) represents the characteristic function of a zero-mean normal r.v with variance σ², and (10-45) follows.

The central limit theorem states that a large sum of independent random variables, each with finite variance, tends to behave like a normal random variable. Thus the individual p.d.fs become unimportant in analyzing the collective sum behavior. If we model the noise phenomenon as the sum of a large number of independent random variables (e.g., electron motion in resistor components), then this theorem allows us to conclude that noise behaves like a Gaussian r.v.
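A minimal simulation sketch of the theorem follows (the underlying uniform distribution and the sample sizes are arbitrary choices): scaled sums as in (10-44) have variance close to σ² and already show the Gaussian ±σ probability.

```python
# CLT illustration with scaled sums of i.i.d. zero-mean uniforms.
import numpy as np

rng = np.random.default_rng(6)
n, trials = 500, 20_000
x = rng.uniform(-1.0, 1.0, size=(trials, n))     # sigma^2 = 1/3 for U(-1, 1)
y = x.sum(axis=1) / np.sqrt(n)                   # scaled sums, cf. (10-44)

print("sample var     :", y.var(), " (sigma^2 = 1/3)")
print("P(|Y| < sigma) :", np.mean(np.abs(y) < np.sqrt(1/3)), " (normal: ~0.683)")
```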
It may be remarked that the finite variance assumption is necessary for the theorem to hold. To see its importance, consider the r.vs to be Cauchy distributed, and let

$$Y = \frac{X_1 + X_2 + \cdots + X_n}{\sqrt{n}},$$  (10-53)

where each X_i ∼ C(α). Then since

$$\Phi_{X_i}(u) = e^{-\alpha |u|},$$  (10-54)

substituting this into (10-48), we get

$$\Phi_Y(u) = \prod_{i=1}^{n} \Phi_{X_i}\!\left(u/\sqrt{n}\right) = \left(e^{-\alpha|u|/\sqrt{n}}\right)^n = e^{-\alpha\sqrt{n}\,|u|} \sim C(\alpha\sqrt{n}),$$  (10-55)

which shows that Y is still Cauchy with parameter α√n. In other words, the central limit theorem doesn't hold for a set of Cauchy r.vs, as their variances are undefined.
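A companion sketch for the Cauchy case (again with arbitrary simulation sizes): the scaled sum stays Cauchy and its spread grows like √n, in line with (10-55). Since the variance does not exist, the interquartile range is used as the scale measure.

```python
# Scaled sums of standard Cauchy samples do not settle into a Gaussian.
import numpy as np

rng = np.random.default_rng(7)
trials = 10_000
for n in (10, 100, 1000):
    y = rng.standard_cauchy(size=(trials, n)).sum(axis=1) / np.sqrt(n)
    iqr = np.percentile(y, 75) - np.percentile(y, 25)
    print(n, iqr, 2 * np.sqrt(n))   # for C(alpha*sqrt(n)) with alpha = 1, IQR = 2*sqrt(n)
```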
Joint characteristic functions are useful in determining the p.d.f of linear combinations of r.vs. For example, with X and Y as independent Poisson r.vs with parameters λ_1 and λ_2 respectively, let

$$Z = X + Y.$$  (10-56)

Then

$$\Phi_Z(u) = \Phi_X(u)\, \Phi_Y(u).$$  (10-57)

But from (6-33),

$$\Phi_X(u) = e^{\lambda_1\left(e^{ju} - 1\right)}, \qquad \Phi_Y(u) = e^{\lambda_2\left(e^{ju} - 1\right)},$$  (10-58)

so that

$$\Phi_Z(u) = e^{(\lambda_1 + \lambda_2)\left(e^{ju} - 1\right)} \sim P(\lambda_1 + \lambda_2),$$  (10-59)

i.e., the sum of independent Poisson r.vs is also a Poisson random variable.
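A final numerical sketch (the parameter values are assumed for the demo) confirming (10-59): the empirical pmf of the sum of two independent Poisson samples matches the Poisson(λ_1 + λ_2) pmf.

```python
# Empirical check that Poisson(lam1) + Poisson(lam2) ~ Poisson(lam1 + lam2).
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(8)
lam1, lam2 = 2.0, 3.5
z = rng.poisson(lam1, 500_000) + rng.poisson(lam2, 500_000)

lam = lam1 + lam2
for k in range(5):
    empirical = np.mean(z == k)
    theoretical = exp(-lam) * lam**k / factorial(k)
    print(k, round(empirical, 4), round(theoretical, 4))
```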
