Overview of Random Variables and Random Processes
Table of Contents
◊ 1 Introduction
◊ 2 Probability
◊ 3 Random Variables
◊ 4 Statistical Averages
◊ 5 Random Processes
◊ 6 Mean, Correlation and Covariance Functions
◊ 7 Transmission of a Random Process through a Linear Filter
◊ 8 Power Spectral Density
◊ 9 Gaussian Process
◊ 10 Noise
◊ 11 Narrowband Noise
Introduction
◊ The Fourier transform is a mathematical tool for the representation of deterministic signals.
◊ Deterministic signals: the class of signals that may be modeled as
completely specified functions of time.
◊ A signal is “random” if it is not possible to predict its precise value in advance.
◊ A random process consists of an ensemble (family) of sample
functions, each of which varies randomly with time.
◊ A random variable is obtained by observing a random process at a
fixed instant of time.
Probability
◊ Probability theory is rooted in phenomena that, explicitly or implicitly, can be modeled by an experiment with an outcome that is subject to chance.
◊ Example: Experiment may be the observation of the result of tossing a fair
coin. In this experiment, the possible outcomes of a trial are “heads” or
“tails”.
◊ If an experiment has K possible outcomes, then for the kth possible outcome
we have a point called the sample point, which we denote by sk. With this
basic framework, we make the following definitions:
◊ The set of all possible outcomes of the experiment is called the sample space, which we denote by S.
◊ An event corresponds to either a single sample point or a set of sample
points in the space S.
Probability
◊ A single sample point is called an elementary event.
◊ The entire sample space S is called the sure event; the null set ∅ is called the null or impossible event.
◊ Two events are mutually exclusive if the occurrence of one event
precludes the occurrence of the other event.
◊ A probability measure P is a function that assigns a non-negative
number to an event A in the sample space S and satisfies the
following three properties (axioms):
5.1
5.2
1. 0  PA1
2. PS 1
3. If A and B are two mutually exclusive events, then
PA B PA PB 5.3
Conditional Probability
◊ The conditional probability of an event A given that event B has occurred is P[A|B] = P[A ∩ B] / P[B], provided P[B] > 0.
◊ Example 5.1 Binary Symmetric Channel
◊ This channel is said to be discrete in that it is designed to handle
discrete messages.
◊ The channel is memoryless in the sense that the channel output at
any time depends only on the channel input at that time.
◊ The channel is symmetric, which means that the probability of
receiving symbol 1 when 0 is sent is the same as the probability
of receiving symbol 0 when symbol 1 is sent.
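To make the binary symmetric channel concrete, here is a minimal Python sketch (an addition, not from the original slides) that passes bits through the channel and estimates the crossover probability; the function name bsc and the value p = 0.1 are illustrative assumptions.

```python
import random

def bsc(bit: int, p: float) -> int:
    """Pass one bit through a binary symmetric channel:
    the bit is flipped with probability p, independently of all others."""
    return bit ^ 1 if random.random() < p else bit

random.seed(0)
p, trials = 0.1, 100_000
flips = sum(bsc(0, p) for _ in range(trials))
# By symmetry, P[receive 1 | send 0] = P[receive 0 | send 1] = p.
print(f"P[1 received | 0 sent] ~ {flips / trials:.3f} (true value {p})")
```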
Random Variables
◊ We denote the random variable as X(s) or just X.
◊ X is a function.
◊ Random variable may be discrete or continuous.
◊ Consider the random variable X and the probability of the event
X ≤ x. We denote this probability by P[X ≤ x].
◊ To simplify our notation, we write
F_X(x) = P[X ≤ x] (5.15)
◊ The function FX(x) is called the cumulative distribution
function (cdf) or simply the distribution function of the random
variable X.
◊ The distribution function FX(x) has the following properties:
1. 0 ≤ F_X(x) ≤ 1
2. F_X(x₁) ≤ F_X(x₂) if x₁ < x₂
Random Variables
There may be more than one
random variable associated with
the same random experiment.
Random Variables
◊ If the distribution function is continuously differentiable, then
f_X(x) = dF_X(x)/dx (5.17)
◊ f_X(x) is called the probability density function (pdf) of the random variable X.
◊ The probability of the event x₁ < X ≤ x₂ equals
P[x₁ < X ≤ x₂] = P[X ≤ x₂] − P[X ≤ x₁]
= F_X(x₂) − F_X(x₁) (5.19)
= ∫_{x₁}^{x₂} f_X(x) dx
◊ Conversely, the distribution function is
F_X(x) = ∫_{−∞}^{x} f_X(ξ) dξ
◊ A probability density function is always nonnegative and has total area one.
Random Variables
◊ Example 5.2 Uniform Distribution
f_X(x) = 0 for x ≤ a
f_X(x) = 1/(b − a) for a < x ≤ b
f_X(x) = 0 for x > b

F_X(x) = 0 for x ≤ a
F_X(x) = (x − a)/(b − a) for a < x ≤ b
F_X(x) = 1 for x > b
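As a quick numerical sanity check of Example 5.2 (an added sketch, not part of the original slides), the following Python snippet samples X uniformly on (a, b) and compares the empirical distribution function with (x − a)/(b − a); the values of a, b, and the test point x0 are arbitrary.

```python
import random

random.seed(1)
a, b, x0 = 2.0, 5.0, 3.2
samples = [random.uniform(a, b) for _ in range(100_000)]
# Empirical F_X(x0): fraction of samples at or below x0.
empirical_cdf = sum(s <= x0 for s in samples) / len(samples)
print(f"empirical F_X({x0}) = {empirical_cdf:.3f}")
print(f"analytic  F_X({x0}) = {(x0 - a) / (b - a):.3f}")
```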
Random Variables
◊ Several Random Variables
◊ Consider two random variables X and Y. We define the joint
distribution function FX,Y(x,y) as the probability that the random
variable X is less than or equal to a specified value x and that the
random variable Y is less than or equal to a specified value y.
F_{X,Y}(x, y) = P[X ≤ x, Y ≤ y] (5.23)
◊ Suppose that the joint distribution function F_{X,Y}(x,y) is continuous everywhere, and that the partial derivative
f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y) / (∂x ∂y) (5.24)
exists and is continuous everywhere. We call the function f_{X,Y}(x,y) the joint probability density function of the random variables X and Y.
Random Variables
◊ Several Random Variables
◊ The joint distribution function F_{X,Y}(x,y) is a monotone-nondecreasing function of both x and y, and the joint pdf integrates to one:
∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X,Y}(ξ, η) dξ dη = 1
◊ Marginal density f_X(x):
F_X(x) = ∫_{−∞}^{x} ∫_{−∞}^{∞} f_{X,Y}(ξ, η) dη dξ
f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, η) dη (5.27)
◊ Suppose that X and Y are two continuous random variables with joint probability density function f_{X,Y}(x,y). The conditional probability density function of Y given that X = x is defined by
f_Y(y|x) = f_{X,Y}(x, y) / f_X(x) (5.28)
Random Variables
◊ Several Random Variables
◊ If the random variable X and Y are statistically independent, then
knowledge of the outcome of X can in no way affect the
distribution of Y.
fY y x fY y b
y5.
28
 f x, y f x f
y5.32
X ,Y X Y
Px  A, Y B  P X  A PY B
21
5.33
Random Variables
◊ Example 5.3 Binomial Random Variable
◊ Consider a sequence of coin-tossing experiments where the
probability of a head is p and let Xn be the Bernoulli random
variable representing the outcome of the nth toss.
◊ Let Y be the number of heads that occur on N tosses of the coins:
Y = Σ_{n=1}^{N} X_n
P[Y = y] = C(N, y) p^y (1 − p)^{N−y},  where C(N, y) = N! / (y!(N − y)!)
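A short simulation sketch of Example 5.3 (an addition, not from the slides): summing N Bernoulli variables and comparing the empirical frequencies with the binomial PMF; N = 10 and p = 0.3 are arbitrary choices.

```python
import math
import random

random.seed(2)
N, p, trials = 10, 0.3, 200_000
counts = [0] * (N + 1)
for _ in range(trials):
    y = sum(random.random() < p for _ in range(N))  # sum of Bernoulli X_n
    counts[y] += 1
for y in (0, 3, 6):
    analytic = math.comb(N, y) * p**y * (1 - p) ** (N - y)
    print(f"P[Y={y}]: empirical {counts[y] / trials:.4f}, analytic {analytic:.4f}")
```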
Statistical Averages
◊ The expected value or mean of a random variable X is defined by
xdx 5.36
x X


  EX 

xf
◊ Function of a Random Variable
◊ Let X denote a random variable, and let g(X) denote a real-
valued function defined on the real line. We denote as
Y = g(X) (5.37)
◊ To find the expected value of the random variable Y:
E[Y] = ∫_{−∞}^{∞} y f_Y(y) dy = ∫_{−∞}^{∞} g(x) f_X(x) dx = E[g(X)] (5.38)
Statistical Averages
◊ Example 5.4 Cosinusoidal Random Variable
◊ Let Y=g(X)=cos(X)
◊ X is a random variable uniformly distributed in the interval (−π, π):
f_X(x) = 1/(2π) for −π < x ≤ π, and 0 otherwise
E[Y] = ∫_{−π}^{π} (1/(2π)) cos x dx = (1/(2π)) [sin x]_{−π}^{π} = 0
Statistical Averages
◊ Moments
◊ For the special case of g(X) = Xⁿ, we obtain the nth moment of the probability distribution of the random variable X; that is
E[Xⁿ] = ∫_{−∞}^{∞} xⁿ f_X(x) dx (5.39)
◊ Mean-square value of X:
E[X²] = ∫_{−∞}^{∞} x² f_X(x) dx (5.40)
◊ The nth central moment is
E[(X − μ_X)ⁿ] = ∫_{−∞}^{∞} (x − μ_X)ⁿ f_X(x) dx (5.41)
Statistical Averages
◊ For n = 2 the second central moment is referred to as the variance of the random variable X, written as
var[X] = E[(X − μ_X)²] = ∫_{−∞}^{∞} (x − μ_X)² f_X(x) dx (5.42)
◊ The variance of a random variable X is commonly denoted as σ_X².
◊ The square root of the variance is called the standard deviation of the random variable X.
◊ Expanding the square gives a useful identity:
σ_X² = E[(X − μ_X)²]
= E[X² − 2μ_X X + μ_X²]
= E[X²] − 2μ_X E[X] + μ_X²
= E[X²] − μ_X² (5.44)
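The identity (5.44) is easy to verify numerically; the following sketch (an illustrative addition) checks E[X²] − E[X]² against a direct variance computation on an arbitrary exponential sample.

```python
import random
import statistics

random.seed(3)
xs = [random.expovariate(1.5) for _ in range(100_000)]
mean = statistics.fmean(xs)
mean_sq = statistics.fmean(x * x for x in xs)
# Eq. (5.44): variance as mean-square minus squared mean.
print(f"E[X^2] - E[X]^2 = {mean_sq - mean**2:.4f}")
print(f"direct variance = {statistics.pvariance(xs):.4f}")
```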
Statistical Averages
◊ Chebyshev inequality
◊ Suppose X is an arbitrary random variable with finite mean m_x and finite variance σ_x². For any positive number δ:
P(|X − m_x| ≥ δ) ≤ σ_x² / δ²
◊ Proof:
σ_x² = ∫_{−∞}^{∞} (x − m_x)² p(x) dx ≥ ∫_{|x−m_x|≥δ} (x − m_x)² p(x) dx
≥ δ² ∫_{|x−m_x|≥δ} p(x) dx = δ² P(|X − m_x| ≥ δ)
Statistical Averages
◊ Chebyshev inequality
◊ Another way to view the Chebyshev bound is working with
the zero mean random variable Y=X-mx.
◊ Define a function g(Y) as:
g(Y) = 1 for |Y| ≥ δ, and g(Y) = 0 for |Y| < δ, so that E[g(Y)] = P(|Y| ≥ δ)
◊ Upper-bound g(Y) by the quadratic (Y/δ)², i.e., g(Y) ≤ (Y/δ)².
◊ The tail probability is then
E[g(Y)] ≤ E[Y²/δ²] = E[Y²]/δ² = σ_y²/δ² = σ_x²/δ²
Statistical Averages
◊ Chebyshev inequality
◊ A quadratic upper bound on g(Y) is used in obtaining the tail probability (Chebyshev bound).
◊ For many practical applications, the Chebyshev bound is extremely loose.
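To see how loose the bound can be, this added Python sketch compares the Chebyshev bound σ²/δ² with the empirical tail probability of a Gaussian; for δ = 3σ the true tail is roughly 0.0027 against a bound of about 0.111. The parameters are arbitrary.

```python
import random

random.seed(4)
m, sigma, delta = 0.0, 1.0, 3.0
xs = [random.gauss(m, sigma) for _ in range(1_000_000)]
# Empirical two-sided tail probability P(|X - m| >= delta).
tail = sum(abs(x - m) >= delta for x in xs) / len(xs)
print(f"empirical tail  P(|X-m| >= {delta}) = {tail:.5f}")
print(f"Chebyshev bound sigma^2/delta^2     = {sigma**2 / delta**2:.5f}")
```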
Statistical Averages
◊ The characteristic function ψ_X(υ) is defined as the expectation of the complex exponential exp(jυX), as shown by
ψ_X(υ) = E[exp(jυX)] = ∫_{−∞}^{∞} f_X(x) exp(jυx) dx (5.45)
◊ In other words, the characteristic function ψ_X(υ) is the Fourier transform of the probability density function f_X(x).
◊ Analogous with the inverse Fourier transform:
f_X(x) = (1/2π) ∫_{−∞}^{∞} ψ_X(υ) exp(−jυx) dυ (5.46)
Statistical Averages
◊ Characteristic functions
◊ The first moment (mean) can be obtained by:
E(X) = m_x = −j dψ(jv)/dv |_{v=0}
◊ Since the differentiation process can be repeated, the n-th moment can be calculated by:
E(Xⁿ) = (−j)ⁿ dⁿψ(jv)/dvⁿ |_{v=0}
Statistical Averages
◊ Characteristic functions
◊ Determining the PDF of a sum of statistically independent
random variables:
Y = Σ_{i=1}^{n} X_i
ψ_Y(jv) = E(e^{jvY}) = E[exp(jv Σ_{i=1}^{n} X_i)] = E[Π_{i=1}^{n} e^{jvX_i}]
= ∫_{−∞}^{∞} ⋯ ∫_{−∞}^{∞} (Π_{i=1}^{n} e^{jvx_i}) p(x₁, x₂, …, x_n) dx₁ dx₂ … dx_n
Since the random variables are statistically independent, p(x₁, x₂, …, x_n) = p(x₁) p(x₂) … p(x_n), and therefore
ψ_Y(jv) = Π_{i=1}^{n} ψ_{X_i}(jv)
If the X_i are iid (independent and identically distributed),
ψ_Y(jv) = (ψ_X(jv))ⁿ
Statistical Averages
◊ Characteristic functions
◊ The PDF of Y is determined from the inverse Fourier transform of ΨY(jv).
◊ Since the characteristic function of the sum of n statistically independent random variables is equal to the product of the characteristic functions of the individual random variables, it follows that, back in the original domain, the PDF of Y is the n-fold convolution of the PDFs of the X_i.
◊ Usually, the n-fold convolution is more difficult to perform than the characteristic-function method in determining the PDF of Y. A numerical sketch of the two-fold case follows.
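The sketch below (an illustrative addition) demonstrates the two-fold case numerically: convolving two uniform(0, 1) densities yields the triangular density of their sum; the grid spacing dx is arbitrary.

```python
import numpy as np

dx = 0.001
x = np.arange(0.0, 1.0, dx)
pdf = np.ones_like(x)                 # uniform(0,1) density on a grid
pdf_sum = np.convolve(pdf, pdf) * dx  # numerical two-fold convolution
print(f"density of Y near 1.0: {pdf_sum[int(1.0 / dx)]:.3f} (triangular peak = 1)")
print(f"total area: {pdf_sum.sum() * dx:.3f} (should be ~1)")
```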
Statistical Averages
◊ Example 5.5 Gaussian Random Variable
◊ The probability density function of such a Gaussian random
variable is defined by:
f_X(x) = (1 / (√(2π) σ_X)) exp(−(x − μ_X)² / (2σ_X²)),  −∞ < x < ∞
◊ The characteristic function of a Gaussian random variable with mean m_x and variance σ² is (Problem 5.1):
ψ(jv) = ∫_{−∞}^{∞} e^{jvx} (1/√(2πσ²)) e^{−(x−m_x)²/(2σ²)} dx = e^{jvm_x − v²σ²/2}
◊ It can be shown that the central moments of a Gaussian random variable are given by:
E[(X − m_x)^k] = 1·3·5⋯(k − 1)·σ^k for even k, and 0 for odd k
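A numerical spot-check of the central-moment formula (an added sketch, with arbitrary m and σ): the third central moment should vanish and the fourth should equal 1·3·σ⁴ = 3σ⁴.

```python
import random
import statistics

random.seed(5)
m, sigma = 1.0, 2.0
xs = [random.gauss(m, sigma) for _ in range(500_000)]
m3 = statistics.fmean((x - m) ** 3 for x in xs)  # odd central moment
m4 = statistics.fmean((x - m) ** 4 for x in xs)  # even central moment
print(f"E[(X-m)^3] = {m3:.3f} (theory 0)")
print(f"E[(X-m)^4] = {m4:.2f} (theory {3 * sigma**4:.2f})")
```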
Statistical Averages
◊ Example 5.5 Gaussian Random Variable (cont.)
◊ The sum of n statistically independent Gaussian random
variables is also a Gaussian random variable.
◊ Proof:
Y = Σ_{i=1}^{n} X_i
ψ_Y(jv) = Π_{i=1}^{n} ψ_{X_i}(jv) = Π_{i=1}^{n} e^{jvm_i − v²σ_i²/2} = e^{jvm_y − v²σ_y²/2}
where m_y = Σ_{i=1}^{n} m_i and σ_y² = Σ_{i=1}^{n} σ_i².
Therefore, Y is Gaussian-distributed with mean m_y and variance σ_y².
Statistical Averages
◊ Joint Moments
◊ Consider next a pair of random variables X and Y. A set of
statistical averages of importance in this case are the joint
moments, namely, the expected value of Xi Y k, where i and k
may assume any positive integer values. We may thus write
E[X^i Y^k] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x^i y^k f_{X,Y}(x, y) dx dy (5.51)
◊ A joint moment of particular importance is the correlation defined by E[XY], which corresponds to i = k = 1.
◊ Covariance of X and Y:
cov[XY] = E[(X − E[X])(Y − E[Y])] = E[XY] − μ_X μ_Y (5.53)
Statistical Averages
◊ Correlation coefficient of X and Y:
ρ = cov[XY] / (σ_X σ_Y) (5.54)
◊ σX and σY denote the variances of X and Y.
◊ We say X and Y are uncorrelated if and only if cov[XY] = 0.
◊ Note that if X and Y are statistically independent, then they are
uncorrelated.
◊ The converse of the above statement is not necessarily true.
◊ We say X and Y are orthogonal if and only if E[XY] = 0.
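The following added sketch illustrates why the converse fails: with X uniform on (−1, 1) and Y = X², the pair is clearly dependent, yet E[XY] = E[X³] = 0, so cov[XY] = 0 and the variables are both uncorrelated and orthogonal.

```python
import random
import statistics

random.seed(6)
xs = [random.uniform(-1.0, 1.0) for _ in range(200_000)]
ys = [x * x for x in xs]                     # Y is a function of X: dependent
exy = statistics.fmean(x * y for x, y in zip(xs, ys))
cov = exy - statistics.fmean(xs) * statistics.fmean(ys)
print(f"E[XY] ~ {exy:.4f} (orthogonal), cov[XY] ~ {cov:.4f} (uncorrelated)")
```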
Statistical Averages
◊ Example 5.6 Moments of a Bernoulli Random Variable
◊ Consider the coin-tossing experiment where the probability of a head is p. Let X be a random variable that takes the value 0 if the result is a tail and 1 if it is a head. We say that X is a Bernoulli random variable.
P[X = x] = 1 − p for x = 0;  p for x = 1;  0 otherwise
E[X] = Σ_{k=0}^{1} k P[X = k] = 0·(1 − p) + 1·p = p
σ_X² = Σ_{k=0}^{1} (k − p)² P[X = k] = p²(1 − p) + (1 − p)²·p = p(1 − p)
◊ For the outcomes of two tosses j and k,
E[X_j X_k] = p² for j ≠ k, and E[X_j X_k] = E[X_k²] = Σ_{k=0}^{1} k² P[X = k] = p for j = k
Random Processes
◊ A random process is an ensemble (family) of sample functions.
Random Processes
◊ At any given time instant, the value of a stochastic process
is a random variable indexed by the parameter t. We
denote such a process by X(t).
◊ In general, the parameter t is continuous, whereas X may
be either continuous or discrete, depending on the
characteristics of the source that generates the stochastic
process.
◊ The noise voltage generated by a single resistor or a single
information source represents a single realization of the
stochastic process. It is called a sample function.
Random Processes
◊ The set of all possible sample functions constitutes an
ensemble of sample functions or, equivalently, the
stochastic process X(t).
◊ In general, the number of sample functions in the
ensemble is assumed to be extremely large; often it is
infinite.
◊ Having defined a stochastic process X(t) as an ensemble of
sample functions, we may consider the values of the
process at any set of time instants t1>t2>t3>…>tn, where n
is any positive integer.
◊ In general, the random variables X_{t_i} ≡ X(t_i), i = 1, 2, …, n, are characterized statistically by their joint PDF p(x_{t_1}, x_{t_2}, …, x_{t_n}).
Random Processes
◊ Stationary stochastic processes
◊ Consider another set of n random variables X_{t_i + t} ≡ X(t_i + t), i = 1, 2, …, n, where t is an arbitrary time shift. These random variables are characterized by the joint PDF p(x_{t_1 + t}, x_{t_2 + t}, …, x_{t_n + t}).
◊ The joint PDFs of the random variables X_{t_i} and X_{t_i + t}, i = 1, 2, …, n, may or may not be identical. When they are identical, i.e., when
p(x_{t_1}, x_{t_2}, …, x_{t_n}) = p(x_{t_1 + t}, x_{t_2 + t}, …, x_{t_n + t})
for all t and all n, the process is said to be stationary in the strict sense (SSS).
◊ When the joint PDFs are different, the stochastic process is non-stationary.
Random Processes
◊ Averages for a stochastic process are called ensemble averages.
◊ The nth moment of the random variable X_{t_i} is defined as:
E[X_{t_i}ⁿ] = ∫_{−∞}^{∞} x_{t_i}ⁿ p(x_{t_i}) dx_{t_i}
◊ In general, the value of the nth moment will depend on the time instant t_i if the PDF of X_{t_i} depends on t_i.
◊ When the process is stationary, p(x_{t_i + t}) = p(x_{t_i}) for all t. Therefore, the PDF is independent of time, and, as a consequence, the nth moment is independent of time.
Random Processes
◊ Consider two random variables X_{t_i} ≡ X(t_i), i = 1, 2.
◊ The correlation is measured by the joint moment:
E[X_{t_1} X_{t_2}] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x_{t_1} x_{t_2} p(x_{t_1}, x_{t_2}) dx_{t_1} dx_{t_2}
◊ Since this joint moment depends on the time instants t_1 and t_2, it is denoted by R_X(t_1, t_2).
◊ R_X(t_1, t_2) is called the autocorrelation function of the stochastic process.
◊ For a stationary stochastic process, the joint moment is:
E(X_{t_1} X_{t_2}) = R_X(t_1, t_2) = R_X(t_1 − t_2) = R_X(τ)
◊ Moreover, with t_1' = t_1 + τ,
R_X(−τ) = E(X_{t_1} X_{t_1 + τ}) = E(X_{t_1'} X_{t_1' − τ}) = R_X(τ)
so the autocorrelation function is an even function of τ.
◊ Average power in the process X(t): R_X(0) = E(X_t²).
Random Processes
◊ Wide-sense stationary (WSS)
◊ A wide-sense stationary process has the property that the
mean value of the process is independent of time (a
constant) and where the autocorrelation function satisfies
the condition that RX(t1,t2)=RX(t1-t2).
◊ Wide-sense stationarity is a less stringent condition than
strict-sense stationarity.
Random Processes
◊ Auto-covariance function
◊ The auto-covariance function of a stochastic process is
defined as:
μ(t_1, t_2) = E{[X_{t_1} − m(t_1)][X_{t_2} − m(t_2)]} = R_X(t_1, t_2) − m(t_1) m(t_2)
◊ When the process is stationary, the auto-covariance function simplifies to:
μ(t_1, t_2) = μ(t_1 − t_2) = μ(τ) = R_X(τ) − m²
◊ For a Gaussian random process, higher-order moments can
be expressed in terms of first and second moments.
Consequently, a Gaussian random process is completely
characterized by its first two moments.
Mean, Correlation and Covariance Functions
◊ Consider a random process X(t). We define the mean of the process X(t) as the expectation of the random variable obtained by observing the process at some time t, as shown by
μ_X(t) = E[X(t)] = ∫_{−∞}^{∞} x f_{X(t)}(x) dx (5.57)
◊ A random process is said to be stationary to first order if the distribution function (and therefore density function) of X(t) does not vary with time:
f_{X(t_1)}(x) = f_{X(t_2)}(x) for all t_1 and t_2, and μ_X(t) = μ_X for all t (5.59)
◊ The mean of the random process is a constant.
◊ The variance of such a process is also constant.
Mean, Correlation and Covariance Functions
◊ We define the autocorrelation function of the process X(t) as the expectation of the product of two random variables X(t_1) and X(t_2):
R_X(t_1, t_2) = E[X(t_1) X(t_2)]
= ∫_{−∞}^{∞} ∫_{−∞}^{∞} x_1 x_2 f_{X(t_1),X(t_2)}(x_1, x_2) dx_1 dx_2 (5.60)
◊ We say a random process X(t) is stationary to second order if the joint distribution f_{X(t_1),X(t_2)}(x_1, x_2) depends only on the difference between the observation times t_1 and t_2:
R_X(t_1, t_2) = R_X(t_2 − t_1) for all t_1 and t_2 (5.61)
◊ The autocovariance function of a stationary random process X(t) is written as
C_X(t_1, t_2) = E[(X(t_1) − μ_X)(X(t_2) − μ_X)] = R_X(t_2 − t_1) − μ_X² (5.62)
Mean, Correlation and Covariance Functions
◊ For convenience of notation, we redefine the autocorrelation function of a stationary process X(t) as
R_X(τ) = E[X(t + τ) X(t)] for all t (5.63)
◊ This autocorrelation function has several important properties:
1. R_X(0) = E[X²(t)] (5.64)
2. R_X(τ) = R_X(−τ) (5.65)
3. |R_X(τ)| ≤ R_X(0) (5.67)
◊ Proof of (5.64) can be obtained from (5.63) by putting τ = 0.
5.6 Mean, Correlation and Covariance Functions
◊ Proof of (5.65):
R_X(−τ) = E[X(t − τ) X(t)] = E[X(t) X(t + τ)] = R_X(τ)
◊ Proof of (5.67):
E[(X(t + τ) ± X(t))²] ≥ 0
E[X²(t + τ)] ± 2E[X(t + τ) X(t)] + E[X²(t)] ≥ 0
2R_X(0) ± 2R_X(τ) ≥ 0
−R_X(0) ≤ R_X(τ) ≤ R_X(0)
|R_X(τ)| ≤ R_X(0)
5.6 Mean, Correlation and Covariance Functions
◊ The physical significance of the autocorrelation function RX(τ) is that
it provides a means of describing the “interdependence” of two
random variables obtained by observing a random process
X(t) at times τ seconds apart.
5.6 Mean, Correlation and Covariance Functions
◊ Example 5.7 Sinusoidal Signal with Random Phase
◊ Consider a sinusoidal signal with random phase:
X(t) = A cos(2πf_c t + Θ)
f_Θ(θ) = 1/(2π) for −π ≤ θ ≤ π, and 0 elsewhere
R_X(τ) = E[X(t + τ) X(t)]
= E[A² cos(2πf_c t + 2πf_c τ + Θ) cos(2πf_c t + Θ)]
= (A²/2) E[cos(4πf_c t + 2πf_c τ + 2Θ)] + (A²/2) cos(2πf_c τ)
= (A²/2) ∫_{−π}^{π} (1/(2π)) cos(4πf_c t + 2πf_c τ + 2θ) dθ + (A²/2) cos(2πf_c τ)
= (A²/2) cos(2πf_c τ)
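Example 5.7 can be checked by an ensemble average over many realizations of Θ; in this added sketch the amplitude A, frequency fc, and the instants t and tau are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(7)
A, fc, t, tau = 2.0, 5.0, 0.013, 0.040
theta = rng.uniform(-np.pi, np.pi, size=1_000_000)   # ensemble of phases
x_t = A * np.cos(2 * np.pi * fc * t + theta)
x_t_tau = A * np.cos(2 * np.pi * fc * (t + tau) + theta)
print(f"ensemble estimate of R_X(tau): {np.mean(x_t * x_t_tau):.4f}")
print(f"(A^2/2)cos(2*pi*fc*tau)      = {A**2 / 2 * np.cos(2 * np.pi * fc * tau):.4f}")
```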
5.6 Mean, Correlation and Covariance Functions
◊ Averages for joint stochastic processes
◊ Let X(t) and Y(t) denote two stochastic processes and let X_{t_i} ≡ X(t_i), i = 1, 2, …, n, and Y_{t_j'} ≡ Y(t_j'), j = 1, 2, …, m, represent the random variables at times t_1 > t_2 > … > t_n and t_1' > t_2' > … > t_m', respectively. The two processes are characterized statistically by their joint PDF:
p(x_{t_1}, x_{t_2}, …, x_{t_n}, y_{t_1'}, y_{t_2'}, …, y_{t_m'})
◊ The cross-correlation function of X(t) and Y(t), denoted by R_{xy}(t_1, t_2), is defined as the joint moment:
R_{xy}(t_1, t_2) = E(X_{t_1} Y_{t_2}) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x_{t_1} y_{t_2} p(x_{t_1}, y_{t_2}) dx_{t_1} dy_{t_2}
◊ The cross-covariance is:
μ_{xy}(t_1, t_2) = R_{xy}(t_1, t_2) − m_x(t_1) m_y(t_2)
5.6 Mean, Correlation and Covariance Functions
◊ Averages for joint stochastic processes
◊ When the processes are jointly and individually stationary, we have R_{xy}(t_1, t_2) = R_{xy}(t_1 − t_2) and μ_{xy}(t_1, t_2) = μ_{xy}(t_1 − t_2). In this case, with t_1' = t_1 + τ,
R_{xy}(−τ) = E(X_{t_1} Y_{t_1 + τ}) = E(Y_{t_1'} X_{t_1' − τ}) = R_{yx}(τ)
◊ The stochastic processes X(t) and Y(t) are said to be statistically independent if and only if
p(x_{t_1}, …, x_{t_n}, y_{t_1'}, …, y_{t_m'}) = p(x_{t_1}, …, x_{t_n}) p(y_{t_1'}, …, y_{t_m'})
for all choices of t_i and t_i' and for all positive integers n and m.
◊ The processes are said to be uncorrelated if
R_{xy}(t_1, t_2) = E(X_{t_1}) E(Y_{t_2}),  i.e.,  μ_{xy}(t_1, t_2) = 0
5.6 Mean, Correlation and Covariance Functions
◊ Example 5.9 Quadrature-Modulated Processes
◊ Consider a pair of quadrature-modulated processes X1(t) and X2(t):
X1 t X tcos2 fct  
X2 t X tsin2 fct  
R12  t X2 t  
 E

X1
 E

X tX t  cos2 fct  sin2 fct  2 fc  

 
 E

X tX t  

 E
c
o
s2 fct  sin2 fct  2 fc  


2
X c c c
 

1
R Esin4 f t  2 f t  2sin2 f  

2
X c
 
1
R sin 2 f  12 1 2
 
R 0 E 
X tX t  0
55
5.6 Mean, Correlation and Covariance Functions
◊ Ergodic Processes
◊ In many instances, it is difficult or impossible to observe all sample
functions of a random process at a given time.
◊ It is often more convenient to observe a single sample function for a
long period of time.
◊ For a sample function x(t), the time average of the mean value over an observation period 2T is
μ_{x,T} = (1/2T) ∫_{−T}^{T} x(t) dt (5.84)
◊ For many stochastic processes of interest in communications, the
time averages and ensemble averages are equal, a property known as
ergodicity.
◊ This property implies that whenever an ensemble average is required,
we may estimate it by using a time average.
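As an added illustration of ergodicity in the mean, the sketch below time-averages a single realization of a random-phase sinusoid with a dc offset m; over a long window the time average should approach the ensemble mean m. All parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(8)
m, A, fc = 1.5, 2.0, 5.0
T, dt = 200.0, 0.001
t = np.arange(-T, T, dt)
theta = rng.uniform(-np.pi, np.pi)                 # a single realization
x = m + A * np.cos(2 * np.pi * fc * t + theta)
time_avg = np.mean(x)  # approximates (1/2T) * integral of x(t) dt
print(f"time average {time_avg:.4f} vs ensemble mean {m}")
```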
5.6 Mean, Correlation and Covariance Functions
◊ Cyclostationary Processes (in the wide sense)
◊ There is another important class of random processes commonly
encountered in practice, the mean and autocorrelation function of
which exhibit periodicity:
X t1 T  X t1 
RX t1  T ,t2 T  RX t1,t2
for all t1 and t2.
◊ Modeling the process X(t) as cyclostationary adds a new
dimension, namely, period T to the partial description of the
process.
5.7 Transmission of a Random Process Through
a Linear Filter
◊ Suppose that a random process X(t) is applied as input to linear
time-invariant filter of impulse response h(t), producing a new
random process Y(t) at the filter output.
◊ Assume that X(t) is a wide-sense stationary random process.
◊ The mean of the output random process Y(t) is given by
μ_Y(t) = E[Y(t)] = E[∫_{−∞}^{∞} h(τ_1) X(t − τ_1) dτ_1]
= ∫_{−∞}^{∞} h(τ_1) E[X(t − τ_1)] dτ_1
= ∫_{−∞}^{∞} h(τ_1) μ_X(t − τ_1) dτ_1 (5.86)
5.7 Transmission of a Random Process Through
a Linear Filter
◊ When the input random process X(t) is wide-sense stationary, the mean μ_X(t) is a constant μ_X, so the mean μ_Y(t) is also a constant μ_Y:
μ_Y(t) = μ_X ∫_{−∞}^{∞} h(τ_1) dτ_1 = μ_X H(0) (5.87)
where H(0) is the zero-frequency (dc) response of the system.
◊ The autocorrelation function of the output random process Y(t) is given by:
R_Y(t, u) = E[Y(t) Y(u)]
= E[∫_{−∞}^{∞} h(τ_1) X(t − τ_1) dτ_1 ∫_{−∞}^{∞} h(τ_2) X(u − τ_2) dτ_2]
= ∫_{−∞}^{∞} dτ_1 h(τ_1) ∫_{−∞}^{∞} dτ_2 h(τ_2) E[X(t − τ_1) X(u − τ_2)]
= ∫_{−∞}^{∞} dτ_1 h(τ_1) ∫_{−∞}^{∞} dτ_2 h(τ_2) R_X(t − τ_1, u − τ_2)
5.7 Transmission of a Random Process Through
a Linear Filter
◊ When the input X(t) is a wide-sense stationary random process, the autocorrelation function of X(t) is only a function of the difference between the observation times:
R_Y(τ) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(τ_1) h(τ_2) R_X(τ − τ_1 + τ_2) dτ_1 dτ_2 (5.90)
◊ If the input to a stable linear time-invariant filter is a wide-sense
stationary random process, then the output of the filter is also a
wide-sense stationary random process.
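Equation (5.87) can be illustrated numerically; in this added sketch a WSS process (white Gaussian noise plus a constant mean) is passed through an arbitrary FIR impulse response h, and the output mean is compared with μ_X H(0) = μ_X Σ h[k].

```python
import numpy as np

rng = np.random.default_rng(9)
mu_X = 2.0
x = mu_X + rng.normal(0.0, 1.0, size=500_000)  # WSS input: white noise + mean
h = np.array([0.5, 0.3, 0.2, 0.1])             # arbitrary impulse response
y = np.convolve(x, h, mode="valid")            # filter output
print(f"output mean {np.mean(y):.4f} vs mu_X * H(0) = {mu_X * h.sum():.4f}")
```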
5.8 Power Spectral Density
◊ The Fourier transform of the autocorrelation function R_X(τ) is called the power spectral density S_X(f) of the random process X(t):
S_X(f) = ∫_{−∞}^{∞} R_X(τ) exp(−j2πfτ) dτ (5.91)
R_X(τ) = ∫_{−∞}^{∞} S_X(f) exp(j2πfτ) df (5.92)
◊ Equations (5.91) and (5.92) are basic relations in the theory of
spectral analysis of random processes, and together they constitute
what are usually called the Einstein-Wiener-Khintchine relations.
5.8 Power Spectral Density
◊ Properties of the Power Spectral Density
◊ Property 1:
S_X(0) = ∫_{−∞}^{∞} R_X(τ) dτ (5.93)
◊ Proof: Let f = 0 in Eq. (5.91).
◊ Property 2:
E[X²(t)] = ∫_{−∞}^{∞} S_X(f) df (5.94)
◊ Proof: Let τ = 0 in Eq. (5.92) and note that R_X(0) = E[X²(t)].
◊ Property 3:
S_X(f) ≥ 0 for all f (5.95)
◊ Property 4:
S_X(−f) = S_X(f) (5.96)
◊ Proof of (5.96): From (5.91),
S_X(−f) = ∫_{−∞}^{∞} R_X(τ) exp(j2πfτ) dτ
Substituting τ → −τ and using R_X(−τ) = R_X(τ),
S_X(−f) = ∫_{−∞}^{∞} R_X(τ) exp(−j2πfτ) dτ = S_X(f)
◊ Proof of Eq. (5.95):
◊ It can be shown (see Eq. 5.106) that S_Y(f) = S_X(f) |H(f)|², so
R_Y(τ) = ∫_{−∞}^{∞} S_Y(f) exp(j2πfτ) df = ∫_{−∞}^{∞} S_X(f) |H(f)|² exp(j2πfτ) df
R_Y(0) = E[Y²(t)] = ∫_{−∞}^{∞} S_X(f) |H(f)|² df ≥ 0 for any H(f)
◊ Suppose we let |H(f)|² = 1 for an arbitrarily small interval f_1 ≤ f ≤ f_2, and H(f) = 0 outside this interval. Then we have:
∫_{f_1}^{f_2} S_X(f) df ≥ 0
This is possible if and only if S_X(f) ≥ 0 for all f.
◊ Conclusion: S_X(f) ≥ 0 for all f.
5.8 Power Spectral Density
◊ Example 5.10 Sinusoidal Signal with Random Phase
◊ Consider the random process X(t) = A cos(2πf_c t + Θ), where Θ is a uniformly distributed random variable over the interval (−π, π).
◊ The autocorrelation function of this random process is given in Example 5.7:
R_X(τ) = (A²/2) cos(2πf_c τ) (5.74)
◊ Taking the Fourier transform of both sides of this relation:
S_X(f) = (A²/4) [δ(f − f_c) + δ(f + f_c)] (5.97)
5.8 Power Spectral Density
◊ Example 5.12 Mixing of a Random Process with a Sinusoidal Process
◊ A situation that often arises in practice is that of mixing (i.e., multiplication) of a WSS random process X(t) with a sinusoidal signal cos(2πf_c t + Θ), where the phase Θ is a random variable that is uniformly distributed over the interval (0, 2π).
◊ We determine the power spectral density of the random process Y(t) defined by:
Y(t) = X(t) cos(2πf_c t + Θ) (5.101)
◊ We note that the random variable Θ is independent of X(t).
5.8 Power Spectral Density
◊ Example 5.12 Mixing of a Random Process with a Sinusoidal Process (continued)
◊ The autocorrelation function of Y(t) is given by:
R_Y(τ) = E[Y(t + τ) Y(t)]
= E[X(t + τ) cos(2πf_c t + 2πf_c τ + Θ) X(t) cos(2πf_c t + Θ)]
= E[X(t + τ) X(t)] E[cos(2πf_c t + 2πf_c τ + Θ) cos(2πf_c t + Θ)]
= (1/2) R_X(τ) E[cos(2πf_c τ) + cos(4πf_c t + 2πf_c τ + 2Θ)]
= (1/2) R_X(τ) cos(2πf_c τ)
◊ Taking the Fourier transform:
S_Y(f) = (1/4) [S_X(f − f_c) + S_X(f + f_c)] (5.103)
5.8 Power Spectral Density
◊ Relation among the Power Spectral Densities of the Input and Output Random Processes
◊ Let S_Y(f) denote the power spectral density of the output random process Y(t) obtained by passing the random process through a linear filter of transfer function H(f). Using Eq. (5.90),
S_Y(f) = ∫_{−∞}^{∞} R_Y(τ) e^{−j2πfτ} dτ
= ∫_{−∞}^{∞} ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(τ_1) h(τ_2) R_X(τ − τ_1 + τ_2) e^{−j2πfτ} dτ_1 dτ_2 dτ
Let τ_0 = τ − τ_1 + τ_2:
S_Y(f) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(τ_1) h(τ_2) R_X(τ_0) e^{−j2πf(τ_0 + τ_1 − τ_2)} dτ_1 dτ_2 dτ_0
= ∫_{−∞}^{∞} h(τ_1) e^{−j2πfτ_1} dτ_1 ∫_{−∞}^{∞} h(τ_2) e^{j2πfτ_2} dτ_2 ∫_{−∞}^{∞} R_X(τ_0) e^{−j2πfτ_0} dτ_0
= H(f) H*(f) S_X(f) = |H(f)|² S_X(f) (5.106)
5.8 Power Spectral Density
◊ Example 5.13 Comb Filter
◊ Consider the filter of Figure (a) consisting of a delay line and a
summing device. We wish to evaluate the power spectral density
of the filter output Y(t).
5.8 Power Spectral Density
◊ Example 5.13 Comb Filter (continued)
◊ The transfer function of this filter is
H(f) = 1 − exp(−j2πfT) = 1 − cos(2πfT) + j sin(2πfT)
|H(f)|² = [1 − cos(2πfT)]² + sin²(2πfT)
= 2[1 − cos(2πfT)]
= 4 sin²(πfT)
◊ Because of the periodic form of this frequency response (Fig. (b)), the filter is sometimes referred to as a comb filter.
◊ The power spectral density of the filter output is:
S_Y(f) = |H(f)|² S_X(f) = 4 sin²(πfT) S_X(f)
◊ If fT is very small,
S_Y(f) ≈ 4π² f² T² S_X(f) (5.107)
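A brief numerical check of Example 5.13 (an added sketch, with an arbitrary delay T): the exact squared magnitude |1 − e^{−j2πfT}|² matches 4 sin²(πfT), and for fT ≪ 1 it is close to the approximation 4π²f²T².

```python
import numpy as np

T = 1e-3
f = np.linspace(0.0, 50.0, 6)                      # fT << 1 over this range
h_sq_exact = np.abs(1 - np.exp(-2j * np.pi * f * T)) ** 2
h_sq_formula = 4 * np.sin(np.pi * f * T) ** 2
h_sq_approx = 4 * np.pi**2 * f**2 * T**2
print(np.allclose(h_sq_exact, h_sq_formula))       # True
print(np.max(np.abs(h_sq_formula - h_sq_approx)))  # small for fT << 1
```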
5.9 Gaussian Process
◊ A random variable Y is defined by:
Y = ∫_0^T g(t) X(t) dt
◊ We refer to Y as a linear functional of X(t). By contrast, for
Y = Σ_{i=1}^{N} a_i X_i
where the a_i are constants and the X_i are random variables, Y is a linear function of the X_i.
◊ If the weighting function g(t) is such that the mean-square value of the random variable Y is finite, and if the random variable Y is a Gaussian-distributed random variable for every g(t) in this class of functions, then the process X(t) is said to be a Gaussian process.
◊ In other words, the process X(t) is a Gaussian process if every linear functional of X(t) is a Gaussian random variable.
◊ The Gaussian process has many properties that make analytic results possible.
◊ The random processes produced by physical phenomena are often such that a Gaussian model is appropriate.
More Related Content

What's hot

Chapter4 - The Continuous-Time Fourier Transform
Chapter4 - The Continuous-Time Fourier TransformChapter4 - The Continuous-Time Fourier Transform
Chapter4 - The Continuous-Time Fourier TransformAttaporn Ninsuwan
 
Numerical Methods - Power Method for Eigen values
Numerical Methods - Power Method for Eigen valuesNumerical Methods - Power Method for Eigen values
Numerical Methods - Power Method for Eigen valuesDr. Nirav Vyas
 
discrete time signals and systems
 discrete time signals and systems  discrete time signals and systems
discrete time signals and systems Zlatan Ahmadovic
 
Digital communication systems unit 1
Digital communication systems unit 1Digital communication systems unit 1
Digital communication systems unit 1Anil Nigam
 
Bartlett's method pp ts
Bartlett's method pp tsBartlett's method pp ts
Bartlett's method pp tsDiwaker Pant
 
Dft and its applications
Dft and its applicationsDft and its applications
Dft and its applicationsAgam Goel
 
DSP Lab Manual (10ECL57) - VTU Syllabus (KSSEM)
DSP Lab Manual (10ECL57) - VTU Syllabus (KSSEM)DSP Lab Manual (10ECL57) - VTU Syllabus (KSSEM)
DSP Lab Manual (10ECL57) - VTU Syllabus (KSSEM)Ravikiran A
 
Ee443 phase locked loop - presentation - schwappach and brandy
Ee443   phase locked loop - presentation - schwappach and brandyEe443   phase locked loop - presentation - schwappach and brandy
Ee443 phase locked loop - presentation - schwappach and brandyLoren Schwappach
 
Digital modulation technique
Digital modulation techniqueDigital modulation technique
Digital modulation techniqueNidhi Baranwal
 
Chapter3 - Fourier Series Representation of Periodic Signals
Chapter3 - Fourier Series Representation of Periodic SignalsChapter3 - Fourier Series Representation of Periodic Signals
Chapter3 - Fourier Series Representation of Periodic SignalsAttaporn Ninsuwan
 
Channel capacity
Channel capacityChannel capacity
Channel capacityPALLAB DAS
 
DSP_2018_FOEHU - Lec 03 - Discrete-Time Signals and Systems
DSP_2018_FOEHU - Lec 03 - Discrete-Time Signals and SystemsDSP_2018_FOEHU - Lec 03 - Discrete-Time Signals and Systems
DSP_2018_FOEHU - Lec 03 - Discrete-Time Signals and SystemsAmr E. Mohamed
 
Digital System Processing
Digital System Processing Digital System Processing
Digital System Processing Muhammad Zubair
 
Markov Chain Monte Carlo Methods
Markov Chain Monte Carlo MethodsMarkov Chain Monte Carlo Methods
Markov Chain Monte Carlo MethodsFrancesco Casalegno
 
4.Sampling and Hilbert Transform
4.Sampling and Hilbert Transform4.Sampling and Hilbert Transform
4.Sampling and Hilbert TransformINDIAN NAVY
 
3.Properties of signals
3.Properties of signals3.Properties of signals
3.Properties of signalsINDIAN NAVY
 

What's hot (20)

Chapter4 - The Continuous-Time Fourier Transform
Chapter4 - The Continuous-Time Fourier TransformChapter4 - The Continuous-Time Fourier Transform
Chapter4 - The Continuous-Time Fourier Transform
 
Numerical Methods - Power Method for Eigen values
Numerical Methods - Power Method for Eigen valuesNumerical Methods - Power Method for Eigen values
Numerical Methods - Power Method for Eigen values
 
discrete time signals and systems
 discrete time signals and systems  discrete time signals and systems
discrete time signals and systems
 
Digital communication systems unit 1
Digital communication systems unit 1Digital communication systems unit 1
Digital communication systems unit 1
 
Bartlett's method pp ts
Bartlett's method pp tsBartlett's method pp ts
Bartlett's method pp ts
 
Dft and its applications
Dft and its applicationsDft and its applications
Dft and its applications
 
DSP Lab Manual (10ECL57) - VTU Syllabus (KSSEM)
DSP Lab Manual (10ECL57) - VTU Syllabus (KSSEM)DSP Lab Manual (10ECL57) - VTU Syllabus (KSSEM)
DSP Lab Manual (10ECL57) - VTU Syllabus (KSSEM)
 
Ee443 phase locked loop - presentation - schwappach and brandy
Ee443   phase locked loop - presentation - schwappach and brandyEe443   phase locked loop - presentation - schwappach and brandy
Ee443 phase locked loop - presentation - schwappach and brandy
 
Digital modulation technique
Digital modulation techniqueDigital modulation technique
Digital modulation technique
 
Chapter3 - Fourier Series Representation of Periodic Signals
Chapter3 - Fourier Series Representation of Periodic SignalsChapter3 - Fourier Series Representation of Periodic Signals
Chapter3 - Fourier Series Representation of Periodic Signals
 
Sampling theorem
Sampling theoremSampling theorem
Sampling theorem
 
Channel capacity
Channel capacityChannel capacity
Channel capacity
 
quantization
quantizationquantization
quantization
 
DSP_2018_FOEHU - Lec 03 - Discrete-Time Signals and Systems
DSP_2018_FOEHU - Lec 03 - Discrete-Time Signals and SystemsDSP_2018_FOEHU - Lec 03 - Discrete-Time Signals and Systems
DSP_2018_FOEHU - Lec 03 - Discrete-Time Signals and Systems
 
Digital System Processing
Digital System Processing Digital System Processing
Digital System Processing
 
Lti system
Lti systemLti system
Lti system
 
Markov Chain Monte Carlo Methods
Markov Chain Monte Carlo MethodsMarkov Chain Monte Carlo Methods
Markov Chain Monte Carlo Methods
 
4.Sampling and Hilbert Transform
4.Sampling and Hilbert Transform4.Sampling and Hilbert Transform
4.Sampling and Hilbert Transform
 
Information theory
Information theoryInformation theory
Information theory
 
3.Properties of signals
3.Properties of signals3.Properties of signals
3.Properties of signals
 

Similar to Random process.pptx

Doe02 statistics
Doe02 statisticsDoe02 statistics
Doe02 statisticsArif Rahman
 
Probability distribution
Probability distributionProbability distribution
Probability distributionManoj Bhambu
 
2 Review of Statistics. 2 Review of Statistics.
2 Review of Statistics. 2 Review of Statistics.2 Review of Statistics. 2 Review of Statistics.
2 Review of Statistics. 2 Review of Statistics.WeihanKhor2
 
Random variables and probability distributions Random Va.docx
Random variables and probability distributions Random Va.docxRandom variables and probability distributions Random Va.docx
Random variables and probability distributions Random Va.docxcatheryncouper
 
Probability and Statistics
Probability and StatisticsProbability and Statistics
Probability and StatisticsMalik Sb
 
Chapter 1 - Probability Distributions.pdf
Chapter 1 - Probability Distributions.pdfChapter 1 - Probability Distributions.pdf
Chapter 1 - Probability Distributions.pdfMayurMandhub
 
Chapter 4 part3- Means and Variances of Random Variables
Chapter 4 part3- Means and Variances of Random VariablesChapter 4 part3- Means and Variances of Random Variables
Chapter 4 part3- Means and Variances of Random Variablesnszakir
 
Chapter-4 combined.pptx
Chapter-4 combined.pptxChapter-4 combined.pptx
Chapter-4 combined.pptxHamzaHaji6
 
Qt random variables notes
Qt random variables notesQt random variables notes
Qt random variables notesRohan Bhatkar
 
02-Random Variables.ppt
02-Random Variables.ppt02-Random Variables.ppt
02-Random Variables.pptAkliluAyele3
 
Econometrics 2.pptx
Econometrics 2.pptxEconometrics 2.pptx
Econometrics 2.pptxfuad80
 
Ssp notes
Ssp notesSsp notes
Ssp notesbalu902
 
this materials is useful for the students who studying masters level in elect...
this materials is useful for the students who studying masters level in elect...this materials is useful for the students who studying masters level in elect...
this materials is useful for the students who studying masters level in elect...BhojRajAdhikari5
 
2 random variables notes 2p3
2 random variables notes 2p32 random variables notes 2p3
2 random variables notes 2p3MuhannadSaleh
 
Probability and statistics - Discrete Random Variables and Probability Distri...
Probability and statistics - Discrete Random Variables and Probability Distri...Probability and statistics - Discrete Random Variables and Probability Distri...
Probability and statistics - Discrete Random Variables and Probability Distri...Asma CHERIF
 
Random variables
Random variablesRandom variables
Random variablesMenglinLiu1
 
ISM_Session_5 _ 23rd and 24th December.pptx
ISM_Session_5 _ 23rd and 24th December.pptxISM_Session_5 _ 23rd and 24th December.pptx
ISM_Session_5 _ 23rd and 24th December.pptxssuser1eba67
 

Similar to Random process.pptx (20)

Doe02 statistics
Doe02 statisticsDoe02 statistics
Doe02 statistics
 
Probability distribution
Probability distributionProbability distribution
Probability distribution
 
2 Review of Statistics. 2 Review of Statistics.
2 Review of Statistics. 2 Review of Statistics.2 Review of Statistics. 2 Review of Statistics.
2 Review of Statistics. 2 Review of Statistics.
 
Random variables and probability distributions Random Va.docx
Random variables and probability distributions Random Va.docxRandom variables and probability distributions Random Va.docx
Random variables and probability distributions Random Va.docx
 
Unit II PPT.pptx
Unit II PPT.pptxUnit II PPT.pptx
Unit II PPT.pptx
 
Probability and Statistics
Probability and StatisticsProbability and Statistics
Probability and Statistics
 
Chapter 1 - Probability Distributions.pdf
Chapter 1 - Probability Distributions.pdfChapter 1 - Probability Distributions.pdf
Chapter 1 - Probability Distributions.pdf
 
Chapter 4 part3- Means and Variances of Random Variables
Chapter 4 part3- Means and Variances of Random VariablesChapter 4 part3- Means and Variances of Random Variables
Chapter 4 part3- Means and Variances of Random Variables
 
Chapter-4 combined.pptx
Chapter-4 combined.pptxChapter-4 combined.pptx
Chapter-4 combined.pptx
 
Qt random variables notes
Qt random variables notesQt random variables notes
Qt random variables notes
 
02-Random Variables.ppt
02-Random Variables.ppt02-Random Variables.ppt
02-Random Variables.ppt
 
U unit7 ssb
U unit7 ssbU unit7 ssb
U unit7 ssb
 
Econometrics 2.pptx
Econometrics 2.pptxEconometrics 2.pptx
Econometrics 2.pptx
 
Ssp notes
Ssp notesSsp notes
Ssp notes
 
this materials is useful for the students who studying masters level in elect...
this materials is useful for the students who studying masters level in elect...this materials is useful for the students who studying masters level in elect...
this materials is useful for the students who studying masters level in elect...
 
2 random variables notes 2p3
2 random variables notes 2p32 random variables notes 2p3
2 random variables notes 2p3
 
Probability and statistics - Discrete Random Variables and Probability Distri...
Probability and statistics - Discrete Random Variables and Probability Distri...Probability and statistics - Discrete Random Variables and Probability Distri...
Probability and statistics - Discrete Random Variables and Probability Distri...
 
Random variables
Random variablesRandom variables
Random variables
 
Stats chapter 7
Stats chapter 7Stats chapter 7
Stats chapter 7
 
ISM_Session_5 _ 23rd and 24th December.pptx
ISM_Session_5 _ 23rd and 24th December.pptxISM_Session_5 _ 23rd and 24th December.pptx
ISM_Session_5 _ 23rd and 24th December.pptx
 

Recently uploaded

College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCollege Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCall Girls in Nagpur High Profile
 
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...ZTE
 
Model Call Girl in Narela Delhi reach out to us at 🔝8264348440🔝
Model Call Girl in Narela Delhi reach out to us at 🔝8264348440🔝Model Call Girl in Narela Delhi reach out to us at 🔝8264348440🔝
Model Call Girl in Narela Delhi reach out to us at 🔝8264348440🔝soniya singh
 
Internship report on mechanical engineering
Internship report on mechanical engineeringInternship report on mechanical engineering
Internship report on mechanical engineeringmalavadedarshan25
 
Analog to Digital and Digital to Analog Converter
Analog to Digital and Digital to Analog ConverterAnalog to Digital and Digital to Analog Converter
Analog to Digital and Digital to Analog ConverterAbhinavSharma374939
 
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)Suman Mia
 
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSAPPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSKurinjimalarL3
 
Coefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptxCoefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptxAsutosh Ranjan
 
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxDecoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxJoão Esperancinha
 
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...ranjana rawat
 
GDSC ASEB Gen AI study jams presentation
GDSC ASEB Gen AI study jams presentationGDSC ASEB Gen AI study jams presentation
GDSC ASEB Gen AI study jams presentationGDSCAESB
 
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICSHARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICSRajkumarAkumalla
 
Call Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile serviceCall Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile servicerehmti665
 
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerStudy on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerAnamika Sarkar
 
Microscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptxMicroscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptxpurnimasatapathy1234
 
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...Dr.Costas Sachpazis
 
Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024hassan khalil
 

Recently uploaded (20)

College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCollege Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
 
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...
 
Model Call Girl in Narela Delhi reach out to us at 🔝8264348440🔝
Model Call Girl in Narela Delhi reach out to us at 🔝8264348440🔝Model Call Girl in Narela Delhi reach out to us at 🔝8264348440🔝
Model Call Girl in Narela Delhi reach out to us at 🔝8264348440🔝
 
★ CALL US 9953330565 ( HOT Young Call Girls In Badarpur delhi NCR
★ CALL US 9953330565 ( HOT Young Call Girls In Badarpur delhi NCR★ CALL US 9953330565 ( HOT Young Call Girls In Badarpur delhi NCR
★ CALL US 9953330565 ( HOT Young Call Girls In Badarpur delhi NCR
 
Internship report on mechanical engineering
Internship report on mechanical engineeringInternship report on mechanical engineering
Internship report on mechanical engineering
 
DJARUM4D - SLOT GACOR ONLINE | SLOT DEMO ONLINE
DJARUM4D - SLOT GACOR ONLINE | SLOT DEMO ONLINEDJARUM4D - SLOT GACOR ONLINE | SLOT DEMO ONLINE
DJARUM4D - SLOT GACOR ONLINE | SLOT DEMO ONLINE
 
Analog to Digital and Digital to Analog Converter
Analog to Digital and Digital to Analog ConverterAnalog to Digital and Digital to Analog Converter
Analog to Digital and Digital to Analog Converter
 
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
 
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSAPPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
 
Coefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptxCoefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptx
 
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxDecoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
 
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
 
GDSC ASEB Gen AI study jams presentation
GDSC ASEB Gen AI study jams presentationGDSC ASEB Gen AI study jams presentation
GDSC ASEB Gen AI study jams presentation
 
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCRCall Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
 
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICSHARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
 
Call Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile serviceCall Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile service
 
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerStudy on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
 
Microscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptxMicroscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptx
 
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
 
Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024
 

Random process.pptx

  • 2. Table of Contents 2 ◊ 1 Introduction ◊ 2 Probability ◊ 3 Random Variables ◊ 4 Statistical Averages ◊ 5 Random Processes ◊ 6 Mean, Correlation and Covariance Functions ◊ 7 Transmission of a Random Process through a Linear Filter ◊ 8 Power Spectral Density ◊ 9 Power Spectral Density ◊ 9 Gaussian Process ◊ 10 Noise ◊ 11 Narrowband Noise
  • 3. Introduction ◊ Fourier transform is a mathematical tool for the representation of deterministic signals. ◊ Deterministic signals: the class of signals that may be modeled as completely specified functions of time. A signal is “random” if it is not possible to predict its precise value ◊ 3 in advance. ◊ A random process consists of an ensemble (family) of sample functions, each of which varies randomly with time. ◊ A random variable is obtained by observing a random process at a fixed instant of time.
  • 4. Probability ◊ Probability theory is rooted in phenomena that, explicitly or implicitly, can be modeled by an experiment with an outcome that is subject to chance. ◊ Example: Experiment may be the observation of the result of tossing a fair coin. In this experiment, the possible outcomes of a trial are “heads” or “tails”. ◊ If an experiment has K possible outcomes, then for the kth possible outcome we have a point called the sample point, which we denote by sk. With this basic framework, we make the following definitions: ◊ The set of all possible outcomes of the experiment is calledthe sample space, which we denote by S. ◊ An event corresponds to either a single sample point or a set of sample points in the space S. 4
  • 5. Probability ◊ Probability theory is rooted in phenomena that, explicitly or implicitly, can be modeled by an experiment with an outcome that is subject to chance. ◊ Example: Experiment may be the observation of the result of tossing a fair coin. In this experiment, the possible outcomes of a trial are “heads” or “tails”. ◊ If an experiment has K possible outcomes, then for the kth possible outcome we have a point called the sample point, which we denote by sk. With this basic framework, we make the following definitions: ◊ The set of all possible outcomes of the experiment is calledthe sample space, which we denote by S. ◊ An event corresponds to either a single sample point or a set of sample points in the space S. 5
  • 6. Probability ◊ A single sample point is called an elementary event. ◊ The entire sample space S is called the sure event; and the null set is called the null or impossible event.  ◊ Two events are mutually exclusive if the occurrence of one event precludes the occurrence of the other event. ◊ A probability measure P is a function that assigns a non-negative number to an event A in the sample space S and satisfies the following three properties (axioms): 5.1 5.2 1. 0  PA1 2. PS 1 3. If A and B are two mutually exclusive events, then PA B PA PB 5.3 6
  • 7. 7
  • 8.
  • 9.
  • 11. Conditional Probability ◊ Example 5.1 Binary Symmetric Channel ◊ This channel is said to be discrete in that it is designed to handle discrete messages. ◊ The channel is memoryless in the sense that the channel output at any time depends only on the channel input at that time. ◊ The channel is symmetric, which means that the probability of receiving symbol 1 when 0 is sent is the same as the probability of receiving symbol 0 when symbol 1 is sent. 11
  • 13.
  • 14.
  • 15. Random Variables ◊ We denote the random variable as X(s) or just X. ◊ X is a function. ◊ Random variable may be discrete or continuous. ◊ Consider the random variable X and the probability of the event X ≤ x. We denote this probability by P[X ≤ x]. ◊ To simplify our notation, we write 5.15 FX x  PX  x ◊ The function FX(x) is called the cumulative distribution function (cdf) or simply the distribution function of the random variable X. ◊ The distribution function FX(x) has the following properties: 1. 0  Fx x 1 15 if x1 x2 2. Fx x1 Fx x2 
  • 16. Random Variables 16 There may be more than one random variable associated with the same random experiment.
  • 17. Random Variables ◊ If the distribution function is continuously differentiable, then X X dx f x d F x 5.17 ◊ fX(x) is called the probability density function (pdf) of the random variable X. ◊ Probability of the event x1 < X ≤ x2 equals Px1  X  x2  PX  x2  PX  x1  FX x2  FX x1 5.19  x  1   x2 f xdx x X F x X f ξdξ    1 17 X x ◊ Probability density function must always be a nonnegative function, and with a total area of one.
  • 18. Random Variables ◊ Example 5.2 Uniform Distribution 1  0,  x  a , a  x  b f X x    ba x  b   0, x  a  0,  x a FX x  , a  x  b x  b  b a   0 , 18
  • 19. Random Variables ◊ Several Random Variables ◊ Consider two random variables X and Y. We define the joint distribution function FX,Y(x,y) as the probability that the random variable X is less than or equal to a specified value x and that the random variable Y is less than or equal to a specified value y. FX ,Y x,y  P X  x,Y  y 5.23 ◊ Suppose that joint distribution function FX,Y(x,y) is continuous everywhere, and that the partial derivative X ,Y X ,Y 2 F x, y f x, y 5.24 xy 19 exists and is continuous everywhere. We call the function fX,Y(x,y) the joint probability density function of the random variables X and Y.
  • 20. Random Variables ◊ Several Random Variables ◊ The joint distribution function FX,Y(x,y) is a monotone- nondecreasing function of both x and y. ◊ X,Y f ,dd 1     ◊ Marginal density fX(x)         X X ,Y X X ,Y f f , dd x, d 5.27  x     F x     f x     ◊ Suppose that X and Y are two continuous random variables with joint probability density function fX,Y(x,y). The conditional probability density function of Y given that X = x is defined by   f x,y   5.28 20 X ,Y Y X f x f y x 
  • 21. Random Variables ◊ Several Random Variables ◊ If the random variable X and Y are statistically independent, then knowledge of the outcome of X can in no way affect the distribution of Y. fY y x fY y b y5. 28  f x, y f x f y5.32 X ,Y X Y Px  A, Y B  P X  A PY B 21 5.33
  • 22. Random Variables ◊ Example 5.3 Binomial Random Variable ◊ Consider a sequence of coin-tossing experiments where the probability of a head is p and let Xn be the Bernoulli random variable representing the outcome of the nth toss. ◊ Let Y be the number of heads that occur on N tosses of the coins: N n1 Y   Xn y    PY  y N  py 1 pNy N   y  N! y!N  y!   22
• 23. Statistical Averages
◊ The expected value or mean of a random variable X is defined by
$$\mu_X = E[X] = \int_{-\infty}^{\infty} x\, f_X(x)\,dx \qquad (5.36)$$
◊ Function of a Random Variable
◊ Let X denote a random variable, and let g(X) denote a real-valued function defined on the real line. We denote
$$Y = g(X) \qquad (5.37)$$
◊ To find the expected value of the random variable Y:
$$E[Y] = \int_{-\infty}^{\infty} y\, f_Y(y)\,dy = \int_{-\infty}^{\infty} g(x)\, f_X(x)\,dx \qquad (5.38)$$
• 24. Statistical Averages
◊ Example 5.4 Cosinusoidal Random Variable
◊ Let Y = g(X) = cos(X), where X is a random variable uniformly distributed in the interval (−π, π):
$$f_X(x) = \begin{cases} \dfrac{1}{2\pi}, & -\pi < x < \pi \\[2pt] 0, & \text{otherwise} \end{cases}$$
$$E[Y] = \int_{-\pi}^{\pi} \cos(x)\,\frac{1}{2\pi}\,dx = \frac{1}{2\pi}\big[\sin x\big]_{-\pi}^{\pi} = 0$$
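A one-line Monte Carlo check of this result (a minimal sketch, not from the slides):

```python
import numpy as np

# Minimal sketch: Monte Carlo confirmation that E[cos(X)] = 0 for
# X ~ Uniform(-pi, pi).
rng = np.random.default_rng(2)
x = rng.uniform(-np.pi, np.pi, size=1_000_000)
print(np.mean(np.cos(x)))   # ≈ 0, up to Monte Carlo error (~1e-3)
```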
• 25. Statistical Averages
◊ Moments
◊ For the special case of g(X) = Xⁿ, we obtain the nth moment of the probability distribution of the random variable X; that is,
$$E[X^n] = \int_{-\infty}^{\infty} x^n f_X(x)\,dx \qquad (5.39)$$
◊ Mean-square value of X:
$$E[X^2] = \int_{-\infty}^{\infty} x^2 f_X(x)\,dx \qquad (5.40)$$
◊ The nth central moment is
$$E[(X - \mu_X)^n] = \int_{-\infty}^{\infty} (x - \mu_X)^n f_X(x)\,dx \qquad (5.41)$$
• 26. Statistical Averages
◊ For n = 2 the second central moment is referred to as the variance of the random variable X, written as
$$\operatorname{var}[X] = E[(X - \mu_X)^2] = \int_{-\infty}^{\infty} (x - \mu_X)^2 f_X(x)\,dx \qquad (5.42)$$
◊ The variance of a random variable X is commonly denoted as $\sigma_X^2$.
◊ The square root of the variance is called the standard deviation of the random variable X.
◊ Expanding the square gives
$$\sigma_X^2 = E[X^2 - 2\mu_X X + \mu_X^2] = E[X^2] - 2\mu_X E[X] + \mu_X^2 = E[X^2] - \mu_X^2 \qquad (5.44)$$
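Equation (5.44) can be checked numerically on any sample data. A minimal sketch (not from the slides; the exponential distribution and its scale are arbitrary choices):

```python
import numpy as np

# Minimal sketch of Eq. (5.44): var[X] = E[X^2] - (E[X])^2.
rng = np.random.default_rng(3)
x = rng.exponential(scale=2.0, size=500_000)   # any distribution works

print(np.var(x))                       # direct second central moment
print(np.mean(x**2) - np.mean(x)**2)   # E[X^2] - (E[X])^2, same value
```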
• 27. Statistical Averages
◊ Chebyshev inequality
◊ Suppose X is an arbitrary random variable with finite mean m_x and finite variance σ_x². For any positive number δ:
$$P[\,|X - m_x| \ge \delta\,] \le \frac{\sigma_x^2}{\delta^2}$$
◊ Proof:
$$\sigma_x^2 = \int_{-\infty}^{\infty} (x - m_x)^2 p(x)\,dx \ge \int_{|x - m_x| \ge \delta} (x - m_x)^2 p(x)\,dx \ge \delta^2\!\int_{|x - m_x| \ge \delta} p(x)\,dx = \delta^2\, P[\,|X - m_x| \ge \delta\,]$$
• 28. Statistical Averages
◊ Chebyshev inequality
◊ Another way to view the Chebyshev bound is to work with the zero-mean random variable Y = X − m_x.
◊ Define a function g(Y) as
$$g(Y) = \begin{cases} 1, & |Y| \ge \delta \\ 0, & |Y| < \delta \end{cases} \qquad\Rightarrow\qquad E[g(Y)] = P[\,|Y| \ge \delta\,]$$
◊ Upper-bound g(Y) by the quadratic (Y/δ)², i.e., $g(Y) \le (Y/\delta)^2$.
◊ The tail probability is then
$$P[\,|Y| \ge \delta\,] = E[g(Y)] \le E\!\left[\frac{Y^2}{\delta^2}\right] = \frac{E[Y^2]}{\delta^2} = \frac{\sigma_x^2}{\delta^2}$$
• 29. Statistical Averages
◊ Chebyshev inequality
◊ A quadratic upper bound on g(Y) is used in obtaining the tail probability (Chebyshev bound).
◊ For many practical applications, the Chebyshev bound is extremely loose.
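To make the looseness concrete, the following sketch (not from the slides) compares the Chebyshev bound σ²/δ² with the exact two-sided tail of a standard Gaussian; the gap widens rapidly as δ grows:

```python
import numpy as np
from scipy import stats

# Minimal sketch: Chebyshev bound vs. the exact Gaussian tail probability
# P[|X - m_x| >= delta] for a standard Gaussian (m_x = 0, sigma = 1).
for delta in [1.0, 2.0, 3.0]:
    bound = 1.0 / delta**2              # sigma^2 / delta^2 with sigma = 1
    exact = 2 * stats.norm.sf(delta)    # two-sided tail probability
    print(f"delta={delta}: Chebyshev bound={bound:.4f}, exact={exact:.6f}")
```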
• 30. Statistical Averages
◊ Characteristic function
◊ The characteristic function φ_X(υ) is defined as the expectation of the complex exponential function exp(jυX), as shown by
$$\phi_X(\upsilon) = E[\exp(j\upsilon X)] = \int_{-\infty}^{\infty} f_X(x)\exp(j\upsilon x)\,dx \qquad (5.45)$$
◊ In other words, the characteristic function φ_X(υ) is the Fourier transform of the probability density function f_X(x) (with the sign of the exponent reversed).
◊ Analogous with the inverse Fourier transform:
$$f_X(x) = \frac{1}{2\pi}\int_{-\infty}^{\infty} \phi_X(\upsilon)\exp(-j\upsilon x)\,d\upsilon \qquad (5.46)$$
• 31. Statistical Averages
◊ Characteristic functions
◊ The first moment (mean) can be obtained by
$$E[X] = m_x = -j\,\frac{d\psi(jv)}{dv}\bigg|_{v=0}$$
◊ Since the differentiation process can be repeated, the nth moment can be calculated by
$$E[X^n] = (-j)^n\,\frac{d^n \psi(jv)}{dv^n}\bigg|_{v=0}$$
• 32. Statistical Averages
◊ Characteristic functions
◊ Determining the PDF of a sum of statistically independent random variables:
$$Y = \sum_{i=1}^{n} X_i$$
$$\psi_Y(jv) = E[e^{jvY}] = E\!\left[\exp\!\left(jv\sum_{i=1}^{n} X_i\right)\right] = \int_{-\infty}^{\infty}\!\!\cdots\!\int_{-\infty}^{\infty}\left(\prod_{i=1}^{n} e^{jv x_i}\right) p(x_1, x_2, \ldots, x_n)\,dx_1\,dx_2\cdots dx_n$$
◊ Since the random variables are statistically independent, $p(x_1, x_2, \ldots, x_n) = p(x_1)\,p(x_2)\cdots p(x_n)$, and therefore
$$\psi_Y(jv) = \prod_{i=1}^{n} \psi_{X_i}(jv)$$
◊ If the X_i are iid (independent and identically distributed), then
$$\psi_Y(jv) = \big(\psi_X(jv)\big)^n$$
• 33. Statistical Averages
◊ Characteristic functions
◊ The PDF of Y is determined from the inverse Fourier transform of ψ_Y(jv).
◊ Since the characteristic function of the sum of n statistically independent random variables is equal to the product of the characteristic functions of the individual random variables, and multiplication in the transform domain corresponds to convolution in the original domain, the PDF of Y is the n-fold convolution of the PDFs of the X_i.
◊ Usually, the n-fold convolution is more difficult to perform than the characteristic-function method in determining the PDF of Y.
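The convolution view is easy to demonstrate numerically for n = 2. A minimal sketch (not from the slides): the pdf of the sum of two independent Uniform(0,1) variables is the triangular pdf on (0, 2), obtained by convolving the rectangular pdf with itself:

```python
import numpy as np

# Minimal sketch: pdf of a sum of independent RVs = convolution of the pdfs.
# Uniform(0,1) + Uniform(0,1) gives a triangle on (0, 2) peaking at x = 1.
dx = 0.001
p = np.ones(1000)                 # pdf of Uniform(0,1) sampled on a grid

p_sum = np.convolve(p, p) * dx    # numerical convolution, support (0, 2)
print(p_sum[1000])                # ≈ 1.0, the apex of the triangle at x = 1
print(p_sum[500], p_sum[1500])    # ≈ 0.5 at x = 0.5 and x = 1.5
```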
• 34. Statistical Averages
◊ Example 5.5 Gaussian Random Variable
◊ The probability density function of such a Gaussian random variable is defined by
$$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma_X}\exp\!\left(-\frac{(x - m_x)^2}{2\sigma_X^2}\right), \qquad -\infty < x < \infty$$
◊ The characteristic function of a Gaussian random variable with mean m_x and variance σ² is (Problem 5.1):
$$\psi(jv) = \int_{-\infty}^{\infty} e^{jvx}\,\frac{1}{\sqrt{2\pi}\,\sigma}\,e^{-(x - m_x)^2/2\sigma^2}\,dx = e^{jv m_x - v^2\sigma^2/2}$$
◊ It can be shown that the central moments of a Gaussian random variable are given by
$$E[(X - m_x)^k] = \begin{cases} 1\cdot 3\cdot 5\cdots(k-1)\,\sigma^k, & \text{even } k \\ 0, & \text{odd } k \end{cases}$$
• 35. Statistical Averages
◊ Example 5.5 Gaussian Random Variable (cont.)
◊ The sum of n statistically independent Gaussian random variables is also a Gaussian random variable.
◊ Proof: let $Y = \sum_{i=1}^{n} X_i$. Then
$$\psi_Y(jv) = \prod_{i=1}^{n} \psi_{X_i}(jv) = \prod_{i=1}^{n} e^{jv m_i - v^2\sigma_i^2/2} = e^{jv m_y - v^2\sigma_y^2/2}$$
where $m_y = \sum_{i=1}^{n} m_i$ and $\sigma_y^2 = \sum_{i=1}^{n} \sigma_i^2$. Therefore, Y is Gaussian-distributed with mean m_y and variance σ_y².
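A simulation of this result (a minimal sketch, not from the slides; the means and standard deviations are arbitrary illustrative values):

```python
import numpy as np
from scipy import stats

# Minimal sketch: the sum of independent Gaussians is Gaussian with mean
# sum(m_i) and variance sum(sigma_i^2).
rng = np.random.default_rng(4)
means, sigmas = [1.0, -2.0, 0.5], [1.0, 2.0, 0.5]
y = sum(rng.normal(m, s, size=200_000) for m, s in zip(means, sigmas))

print(y.mean(), sum(means))                  # ≈ m_y
print(y.var(), sum(s**2 for s in sigmas))    # ≈ sigma_y^2
# p-value for H0 "the data are Gaussian"; no rejection is expected here.
print(stats.normaltest(y[:5000]).pvalue)
```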
• 36. Statistical Averages
◊ Joint Moments
◊ Consider next a pair of random variables X and Y. A set of statistical averages of importance in this case are the joint moments, namely, the expected value of XⁱYᵏ, where i and k may assume any positive integer values. We may thus write
$$E[X^i Y^k] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x^i y^k f_{X,Y}(x,y)\,dx\,dy \qquad (5.51)$$
◊ A joint moment of particular importance is the correlation, defined by E[XY], which corresponds to i = k = 1.
◊ Covariance of X and Y:
$$\operatorname{cov}[XY] = E\big[(X - E[X])(Y - E[Y])\big] = E[XY] - \mu_X \mu_Y \qquad (5.53)$$
• 37. Statistical Averages
◊ Correlation coefficient of X and Y:
$$\rho = \frac{\operatorname{cov}[XY]}{\sigma_X \sigma_Y} \qquad (5.54)$$
◊ σ_X and σ_Y denote the standard deviations of X and Y.
◊ We say X and Y are uncorrelated if and only if cov[XY] = 0.
◊ Note that if X and Y are statistically independent, then they are uncorrelated.
◊ The converse of the above statement is not necessarily true.
◊ We say X and Y are orthogonal if and only if E[XY] = 0.
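A standard counterexample for the converse (a minimal sketch, not from the slides): with X uniform on (−1, 1) and Y = X², the two variables are uncorrelated yet completely dependent:

```python
import numpy as np

# Minimal sketch: X ~ Uniform(-1, 1) and Y = X^2 have cov ≈ 0 (uncorrelated),
# yet Y is fully determined by X, so they are not independent.
rng = np.random.default_rng(5)
x = rng.uniform(-1, 1, size=1_000_000)
y = x**2

print(np.cov(x, y)[0, 1])   # ≈ 0: uncorrelated, but clearly not independent
```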
• 38. Statistical Averages
◊ Example 5.6 Moments of a Bernoulli Random Variable
◊ Consider the coin-tossing experiment where the probability of a head is p. Let X be a random variable that takes the value 0 if the result is a tail and 1 if it is a head. We say that X is a Bernoulli random variable:
$$P[X = x] = \begin{cases} 1 - p, & x = 0 \\ p, & x = 1 \\ 0, & \text{otherwise} \end{cases}$$
$$E[X] = \sum_{k=0}^{1} k\,P[X = k] = 0\,(1 - p) + 1\,(p) = p$$
$$\sigma_X^2 = E[X^2] - (E[X])^2 = p - p^2 = p(1 - p), \qquad \text{where } E[X^2] = \sum_{k=0}^{1} k^2 P[X = k] = p$$
◊ For two tosses j and k, the joint moment is
$$E[X_j X_k] = \begin{cases} E[X_j]\,E[X_k] = p^2, & j \ne k \\ E[X_k^2] = p, & j = k \end{cases}$$
• 39. Random Processes
◊ [Figure: an ensemble of sample functions.]
• 40. Random Processes
◊ At any given time instant, the value of a stochastic process is a random variable indexed by the parameter t. We denote such a process by X(t).
◊ In general, the parameter t is continuous, whereas X may be either continuous or discrete, depending on the characteristics of the source that generates the stochastic process.
◊ The noise voltage generated by a single resistor or a single information source represents a single realization of the stochastic process. It is called a sample function.
• 41. Random Processes
◊ The set of all possible sample functions constitutes an ensemble of sample functions or, equivalently, the stochastic process X(t).
◊ In general, the number of sample functions in the ensemble is assumed to be extremely large; often it is infinite.
◊ Having defined a stochastic process X(t) as an ensemble of sample functions, we may consider the values of the process at any set of time instants t1 > t2 > … > tn, where n is any positive integer.
◊ In general, the random variables $X_{t_i} \equiv X(t_i)$, i = 1, 2, …, n, are characterized statistically by their joint PDF $p(x_{t_1}, x_{t_2}, \ldots, x_{t_n})$.
• 42. Random Processes
◊ Stationary stochastic processes
◊ Consider another set of n random variables $X_{t_i + t} \equiv X(t_i + t)$, i = 1, 2, …, n, where t is an arbitrary time shift. These random variables are characterized by the joint PDF $p(x_{t_1 + t}, x_{t_2 + t}, \ldots, x_{t_n + t})$.
◊ The joint PDFs of the random variables $X_{t_i}$ and $X_{t_i + t}$, i = 1, 2, …, n, may or may not be identical. When they are identical, i.e., when
$$p(x_{t_1}, x_{t_2}, \ldots, x_{t_n}) = p(x_{t_1 + t}, x_{t_2 + t}, \ldots, x_{t_n + t})$$
for all t and all n, the process is said to be stationary in the strict sense (SSS).
◊ When the joint PDFs are different, the stochastic process is non-stationary.
• 43. Random Processes
◊ Averages for a stochastic process are called ensemble averages.
◊ The nth moment of the random variable $X_{t_i}$ is defined as
$$E[X_{t_i}^n] = \int_{-\infty}^{\infty} x_{t_i}^n\, p(x_{t_i})\,dx_{t_i}$$
◊ In general, the value of the nth moment will depend on the time instant t_i if the PDF of $X_{t_i}$ depends on t_i.
◊ When the process is stationary, $p(x_{t_i + t}) = p(x_{t_i})$ for all t. Therefore, the PDF is independent of time, and, as a consequence, the nth moment is independent of time.
• 44. Random Processes
◊ Two random variables: $X_{t_i} \equiv X(t_i)$, i = 1, 2. The correlation between them is measured by the joint moment
$$E[X_{t_1} X_{t_2}] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_{t_1} x_{t_2}\; p(x_{t_1}, x_{t_2})\,dx_{t_1}\,dx_{t_2}$$
◊ Since this joint moment depends on the time instants t1 and t2, it is denoted by R_X(t1, t2).
◊ R_X(t1, t2) is called the autocorrelation function of the stochastic process.
◊ For a stationary stochastic process, the joint moment depends only on the time difference τ = t1 − t2:
$$E[X_{t_1} X_{t_2}] = R_X(t_1, t_2) = R_X(t_1 - t_2) = R_X(\tau)$$
◊ Moreover, $R_X(-\tau) = E[X_{t_1} X_{t_1 + \tau}] = E[X_{t_1'} X_{t_1' - \tau}] = R_X(\tau)$; hence R_X(τ) is an even function.
◊ Average power in the process X(t): $R_X(0) = E[X_t^2]$.
• 45. Random Processes
◊ Wide-sense stationary (WSS)
◊ A wide-sense stationary process has the property that the mean value of the process is independent of time (a constant) and that the autocorrelation function satisfies R_X(t1, t2) = R_X(t1 − t2).
◊ Wide-sense stationarity is a less stringent condition than strict-sense stationarity.
• 46. Random Processes
◊ Auto-covariance function
◊ The auto-covariance function of a stochastic process is defined as
$$\mu(t_1, t_2) = E\big[\big(X_{t_1} - m(t_1)\big)\big(X_{t_2} - m(t_2)\big)\big] = R_X(t_1, t_2) - m(t_1)\,m(t_2)$$
◊ When the process is stationary, the auto-covariance function simplifies to
$$\mu(t_1, t_2) = \mu(t_1 - t_2) = \mu(\tau) = R_X(\tau) - m^2$$
◊ For a Gaussian random process, higher-order moments can be expressed in terms of first and second moments. Consequently, a Gaussian random process is completely characterized by its first two moments.
• 47. Mean, Correlation and Covariance Functions
◊ Consider a random process X(t). We define the mean of the process X(t) as the expectation of the random variable obtained by observing the process at some time t, as shown by
$$\mu_X(t) = E[X(t)] = \int_{-\infty}^{\infty} x\, f_{X(t)}(x)\,dx \qquad (5.57)$$
◊ A random process is said to be stationary to first order if the distribution function (and therefore density function) of X(t) does not vary with time:
$$f_{X(t_1)}(x) = f_{X(t_2)}(x) \quad \text{for all } t_1 \text{ and } t_2 \qquad (5.59)$$
◊ The mean of such a random process is a constant.
◊ The variance of such a process is also constant.
• 48. Mean, Correlation and Covariance Functions
◊ We define the autocorrelation function of the process X(t) as the expectation of the product of two random variables X(t1) and X(t2):
$$R_X(t_1, t_2) = E[X(t_1)X(t_2)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 x_2\, f_{X(t_1), X(t_2)}(x_1, x_2)\,dx_1\,dx_2 \qquad (5.60)$$
◊ We say a random process X(t) is stationary to second order if the joint distribution $f_{X(t_1), X(t_2)}(x_1, x_2)$ depends only on the difference between the observation times t1 and t2:
$$R_X(t_1, t_2) = R_X(t_2 - t_1) \quad \text{for all } t_1 \text{ and } t_2 \qquad (5.61)$$
◊ The autocovariance function of a stationary random process X(t) is written as
$$C_X(t_1, t_2) = E\big[(X(t_1) - \mu_X)(X(t_2) - \mu_X)\big] = R_X(t_2 - t_1) - \mu_X^2 \qquad (5.62)$$
• 49. Mean, Correlation and Covariance Functions
◊ For convenience of notation, we redefine the autocorrelation function of a stationary process X(t) as
$$R_X(\tau) = E[X(t + \tau)\,X(t)] \quad \text{for all } t \qquad (5.63)$$
◊ This autocorrelation function has several important properties:
$$1.\; R_X(0) = E[X^2(t)] \;\;(5.64) \qquad 2.\; R_X(\tau) = R_X(-\tau) \;\;(5.65) \qquad 3.\; |R_X(\tau)| \le R_X(0) \;\;(5.67)$$
◊ Proof of (5.64) is obtained from (5.63) by putting τ = 0.
• 50. 5.6 Mean, Correlation and Covariance Functions
◊ Proof of (5.65):
$$R_X(\tau) = E[X(t + \tau)X(t)] = E[X(t)X(t + \tau)] = R_X(-\tau)$$
◊ Proof of (5.67): start from $E\big[(X(t + \tau) \pm X(t))^2\big] \ge 0$, which expands to
$$E[X^2(t + \tau)] \pm 2E[X(t + \tau)X(t)] + E[X^2(t)] \ge 0 \;\Rightarrow\; 2R_X(0) \pm 2R_X(\tau) \ge 0$$
$$\Rightarrow\; -R_X(0) \le R_X(\tau) \le R_X(0) \;\Rightarrow\; |R_X(\tau)| \le R_X(0)$$
• 51. 5.6 Mean, Correlation and Covariance Functions
◊ The physical significance of the autocorrelation function R_X(τ) is that it provides a means of describing the "interdependence" of two random variables obtained by observing a random process X(t) at times τ seconds apart.
• 52. 5.6 Mean, Correlation and Covariance Functions
◊ Example 5.7 Sinusoidal Signal with Random Phase
◊ Consider a sinusoidal signal with random phase:
$$X(t) = A\cos(2\pi f_c t + \Theta), \qquad f_\Theta(\theta) = \begin{cases} \dfrac{1}{2\pi}, & -\pi \le \theta \le \pi \\[2pt] 0, & \text{elsewhere} \end{cases}$$
$$R_X(\tau) = E[X(t + \tau)X(t)] = \frac{A^2}{2}\,E[\cos(2\pi f_c \tau)] + \frac{A^2}{2}\,E[\cos(4\pi f_c t + 2\pi f_c \tau + 2\Theta)]$$
$$= \frac{A^2}{2}\cos(2\pi f_c \tau) + \frac{A^2}{2}\int_{-\pi}^{\pi}\frac{1}{2\pi}\cos(4\pi f_c t + 2\pi f_c \tau + 2\theta)\,d\theta = \frac{A^2}{2}\cos(2\pi f_c \tau)$$
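The ensemble average in this example can be estimated directly by averaging over many realizations of Θ. A minimal sketch (not from the slides; A, f_c, t, and τ are arbitrary values, and the estimate should not depend on t):

```python
import numpy as np

# Minimal sketch: ensemble estimate of R_X(tau) for
# X(t) = A*cos(2*pi*fc*t + Theta), Theta ~ Uniform(-pi, pi).
A, fc, t, tau = 2.0, 5.0, 0.13, 0.04
rng = np.random.default_rng(6)
theta = rng.uniform(-np.pi, np.pi, size=1_000_000)

x1 = A * np.cos(2 * np.pi * fc * (t + tau) + theta)
x2 = A * np.cos(2 * np.pi * fc * t + theta)
print(np.mean(x1 * x2))                            # ensemble estimate
print((A**2 / 2) * np.cos(2 * np.pi * fc * tau))   # closed form, same value
```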
• 53. 5.6 Mean, Correlation and Covariance Functions
◊ Averages for joint stochastic processes
◊ Let X(t) and Y(t) denote two stochastic processes and let $X_{t_i} \equiv X(t_i)$, i = 1, 2, …, n, and $Y_{t_j'} \equiv Y(t_j')$, j = 1, 2, …, m, represent the random variables at times t1 > t2 > … > tn and t′1 > t′2 > … > t′m, respectively. The two processes are characterized statistically by their joint PDF
$$p(x_{t_1}, x_{t_2}, \ldots, x_{t_n},\, y_{t_1'}, y_{t_2'}, \ldots, y_{t_m'})$$
◊ The cross-correlation function of X(t) and Y(t), denoted by R_{xy}(t1, t2), is defined as the joint moment
$$R_{xy}(t_1, t_2) = E[X_{t_1} Y_{t_2}] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_{t_1}\, y_{t_2}\; p(x_{t_1}, y_{t_2})\,dx_{t_1}\,dy_{t_2}$$
◊ The cross-covariance is
$$\mu_{xy}(t_1, t_2) = R_{xy}(t_1, t_2) - m_x(t_1)\,m_y(t_2)$$
• 54. 5.6 Mean, Correlation and Covariance Functions
◊ Averages for joint stochastic processes
◊ When the processes are jointly and individually stationary, we have R_{xy}(t1, t2) = R_{xy}(t1 − t2) and μ_{xy}(t1, t2) = μ_{xy}(t1 − t2). In that case
$$R_{xy}(-\tau) = E[X_{t_1} Y_{t_1 + \tau}] = E[X_{t_1' - \tau}\, Y_{t_1'}] = R_{yx}(\tau)$$
◊ The stochastic processes X(t) and Y(t) are said to be statistically independent if and only if
$$p(x_{t_1}, \ldots, x_{t_n},\, y_{t_1'}, \ldots, y_{t_m'}) = p(x_{t_1}, \ldots, x_{t_n})\; p(y_{t_1'}, \ldots, y_{t_m'})$$
for all choices of t_i and t′_i and for all positive integers n and m.
◊ The processes are said to be uncorrelated if
$$R_{xy}(t_1, t_2) = E[X_{t_1}]\,E[Y_{t_2}] \quad\Longleftrightarrow\quad \mu_{xy}(t_1, t_2) = 0$$
• 55. 5.6 Mean, Correlation and Covariance Functions
◊ Example 5.9 Quadrature-Modulated Processes
◊ Consider a pair of quadrature-modulated processes X1(t) and X2(t):
$$X_1(t) = X(t)\cos(2\pi f_c t + \Theta), \qquad X_2(t) = X(t)\sin(2\pi f_c t + \Theta)$$
$$R_{12}(\tau) = E[X_1(t)\,X_2(t - \tau)] = E[X(t)X(t - \tau)]\;E[\cos(2\pi f_c t + \Theta)\sin(2\pi f_c t - 2\pi f_c \tau + \Theta)]$$
$$= \frac{1}{2}R_X(\tau)\,E\big[\sin(4\pi f_c t - 2\pi f_c \tau + 2\Theta) - \sin(2\pi f_c \tau)\big] = -\frac{1}{2}R_X(\tau)\sin(2\pi f_c \tau)$$
◊ At τ = 0, $R_{12}(0) = E[X_1(t)\,X_2(t)] = 0$: observed at the same time instant, the two quadrature components are orthogonal.
• 56. 5.6 Mean, Correlation and Covariance Functions
◊ Ergodic Processes
◊ In many instances, it is difficult or impossible to observe all sample functions of a random process at a given time.
◊ It is often more convenient to observe a single sample function for a long period of time.
◊ For a sample function x(t), the time average of the mean value over an observation period 2T is
$$\mu_{x,T} = \frac{1}{2T}\int_{-T}^{T} x(t)\,dt \qquad (5.84)$$
◊ For many stochastic processes of interest in communications, the time averages and ensemble averages are equal, a property known as ergodicity.
◊ This property implies that whenever an ensemble average is required, we may estimate it by using a time average.
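A minimal sketch of ergodicity in the mean (not from the slides): for a sinusoid with random phase plus a DC offset c, a long time average of a single sample function recovers the ensemble mean c (the offset and frequency below are arbitrary):

```python
import numpy as np

# Minimal sketch: time average of one sample function of
# X(t) = c + cos(2*pi*fc*t + Theta) over a long window approaches the
# ensemble mean c, since the cosine term averages out.
c, fc = 1.5, 3.0
rng = np.random.default_rng(7)
theta = rng.uniform(-np.pi, np.pi)      # a single realization of the phase

T = 500.0
t = np.linspace(-T, T, 2_000_001)
x = c + np.cos(2 * np.pi * fc * t + theta)
print(np.trapz(x, t) / (2 * T))         # time average ≈ 1.5 = ensemble mean
```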
• 57. 5.6 Mean, Correlation and Covariance Functions
◊ Cyclostationary Processes (in the wide sense)
◊ There is another important class of random processes commonly encountered in practice, the mean and autocorrelation function of which exhibit periodicity:
$$\mu_X(t_1 + T) = \mu_X(t_1), \qquad R_X(t_1 + T,\, t_2 + T) = R_X(t_1, t_2)$$
for all t1 and t2.
◊ Modeling the process X(t) as cyclostationary adds a new dimension, namely, the period T, to the partial description of the process.
• 58. 5.7 Transmission of a Random Process Through a Linear Filter
◊ Suppose that a random process X(t) is applied as input to a linear time-invariant filter of impulse response h(t), producing a new random process Y(t) at the filter output.
◊ Assume that X(t) is a wide-sense stationary random process.
◊ The mean of the output random process Y(t) is given by
$$\mu_Y(t) = E[Y(t)] = E\!\left[\int_{-\infty}^{\infty} h(\tau_1)\,X(t - \tau_1)\,d\tau_1\right] = \int_{-\infty}^{\infty} h(\tau_1)\,E[X(t - \tau_1)]\,d\tau_1 = \int_{-\infty}^{\infty} h(\tau_1)\,\mu_X(t - \tau_1)\,d\tau_1 \qquad (5.86)$$
• 59. 5.7 Transmission of a Random Process Through a Linear Filter
◊ When the input random process X(t) is wide-sense stationary, the mean μ_X(t) is a constant μ_X, so the mean μ_Y(t) is also a constant μ_Y:
$$\mu_Y = \mu_X\int_{-\infty}^{\infty} h(\tau_1)\,d\tau_1 = \mu_X H(0) \qquad (5.87)$$
where H(0) is the zero-frequency (dc) response of the system.
◊ The autocorrelation function of the output random process Y(t) is given by
$$R_Y(t, u) = E[Y(t)Y(u)] = E\!\left[\int_{-\infty}^{\infty} h(\tau_1)X(t - \tau_1)\,d\tau_1\int_{-\infty}^{\infty} h(\tau_2)X(u - \tau_2)\,d\tau_2\right]$$
$$= \int_{-\infty}^{\infty} d\tau_1\, h(\tau_1)\int_{-\infty}^{\infty} d\tau_2\, h(\tau_2)\,E[X(t - \tau_1)X(u - \tau_2)] = \int_{-\infty}^{\infty} d\tau_1\, h(\tau_1)\int_{-\infty}^{\infty} d\tau_2\, h(\tau_2)\,R_X(t - \tau_1,\, u - \tau_2)$$
• 60. 5.7 Transmission of a Random Process Through a Linear Filter
◊ When the input X(t) is a wide-sense stationary random process, the autocorrelation function of X(t) is only a function of the difference between the observation times. Setting τ = t − u,
$$R_Y(\tau) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(\tau_1)\,h(\tau_2)\,R_X(\tau - \tau_1 + \tau_2)\,d\tau_1\,d\tau_2 \qquad (5.90)$$
◊ If the input to a stable linear time-invariant filter is a wide-sense stationary random process, then the output of the filter is also a wide-sense stationary random process.
• 61. 5.8 Power Spectral Density
◊ The Fourier transform of the autocorrelation function R_X(τ) is called the power spectral density S_X(f) of the random process X(t):
$$S_X(f) = \int_{-\infty}^{\infty} R_X(\tau)\exp(-j2\pi f\tau)\,d\tau \qquad (5.91)$$
$$R_X(\tau) = \int_{-\infty}^{\infty} S_X(f)\exp(j2\pi f\tau)\,df \qquad (5.92)$$
◊ Equations (5.91) and (5.92) are basic relations in the theory of spectral analysis of random processes, and together they constitute what are usually called the Einstein-Wiener-Khintchine relations.
• 62. 5.8 Power Spectral Density
◊ Properties of the Power Spectral Density
◊ Property 1:
$$S_X(0) = \int_{-\infty}^{\infty} R_X(\tau)\,d\tau \qquad (5.93)$$
Proof: let f = 0 in Eq. (5.91).
◊ Property 2:
$$E[X^2(t)] = \int_{-\infty}^{\infty} S_X(f)\,df \qquad (5.94)$$
Proof: let τ = 0 in Eq. (5.92) and note that R_X(0) = E[X²(t)].
◊ Property 3: $S_X(f) \ge 0$ for all f. (5.95)
◊ Property 4: $S_X(-f) = S_X(f)$. (5.96)
Proof: from (5.91), substituting τ → −τ and using R_X(−τ) = R_X(τ),
$$S_X(-f) = \int_{-\infty}^{\infty} R_X(\tau)\exp(j2\pi f\tau)\,d\tau = \int_{-\infty}^{\infty} R_X(-\tau)\exp(-j2\pi f\tau)\,d\tau = S_X(f)$$
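These relations are easy to exercise numerically for a discrete-time white sequence, whose autocorrelation R_X[k] = σ²δ[k] transforms to a flat spectrum S_X(f) = σ². A minimal sketch (not from the slides; σ², the block length, and the block count are arbitrary):

```python
import numpy as np

# Minimal sketch: for white noise, the averaged periodogram (an estimate of
# S_X(f)) is flat at the level sigma^2, the Fourier transform of R_X.
rng = np.random.default_rng(8)
sigma2, n, blocks = 2.0, 1024, 400
x = rng.normal(0.0, np.sqrt(sigma2), size=(blocks, n))

S_est = np.mean(np.abs(np.fft.fft(x, axis=1))**2, axis=0) / n
print(S_est.mean(), S_est.std())   # ≈ sigma2, with small spread across f
```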
• 63. Proof of Eq. (5.95)
◊ It can be shown (see Eq. 5.106) that
$$S_Y(f) = S_X(f)\,|H(f)|^2$$
$$R_Y(\tau) = \int_{-\infty}^{\infty} S_Y(f)\exp(j2\pi f\tau)\,df = \int_{-\infty}^{\infty} S_X(f)\,|H(f)|^2\exp(j2\pi f\tau)\,df$$
$$R_Y(0) = E[Y^2(t)] = \int_{-\infty}^{\infty} S_X(f)\,|H(f)|^2\,df \ge 0 \quad \text{for any } H(f)$$
◊ Suppose we let |H(f)|² = 1 for an arbitrarily small interval f1 ≤ f ≤ f2, and H(f) = 0 outside this interval. Then
$$\int_{f_1}^{f_2} S_X(f)\,df \ge 0$$
This is possible if and only if S_X(f) ≥ 0 for all f.
◊ Conclusion: S_X(f) ≥ 0 for all f.
• 64. 5.8 Power Spectral Density
◊ Example 5.10 Sinusoidal Signal with Random Phase
◊ Consider the random process X(t) = A cos(2πf_c t + Θ), where Θ is a uniformly distributed random variable over the interval (−π, π).
◊ The autocorrelation function of this random process is given in Example 5.7:
$$R_X(\tau) = \frac{A^2}{2}\cos(2\pi f_c \tau) \qquad (5.74)$$
◊ Taking the Fourier transform of both sides of this relation:
$$S_X(f) = \frac{A^2}{4}\big[\delta(f - f_c) + \delta(f + f_c)\big] \qquad (5.97)$$
• 65. 5.8 Power Spectral Density
◊ Example 5.12 Mixing of a Random Process with a Sinusoidal Process
◊ A situation that often arises in practice is that of mixing (i.e., multiplication) of a WSS random process X(t) with a sinusoidal signal cos(2πf_c t + Θ), where the phase Θ is a random variable uniformly distributed over the interval (0, 2π).
◊ We wish to determine the power spectral density of the random process Y(t) defined by
$$Y(t) = X(t)\cos(2\pi f_c t + \Theta) \qquad (5.101)$$
◊ We note that the random variable Θ is independent of X(t).
• 66. 5.8 Power Spectral Density
◊ Example 5.12 Mixing of a Random Process with a Sinusoidal Process (continued)
◊ The autocorrelation function of Y(t) is given by
$$R_Y(\tau) = E[Y(t + \tau)\,Y(t)] = E\big[X(t + \tau)\cos(2\pi f_c t + 2\pi f_c \tau + \Theta)\; X(t)\cos(2\pi f_c t + \Theta)\big]$$
$$= E[X(t + \tau)X(t)]\;E\big[\cos(2\pi f_c t + 2\pi f_c \tau + \Theta)\cos(2\pi f_c t + \Theta)\big]$$
$$= \frac{1}{2}R_X(\tau)\,E\big[\cos(2\pi f_c \tau) + \cos(4\pi f_c t + 2\pi f_c \tau + 2\Theta)\big] = \frac{1}{2}R_X(\tau)\cos(2\pi f_c \tau)$$
◊ Taking the Fourier transform:
$$S_Y(f) = \frac{1}{4}\big[S_X(f - f_c) + S_X(f + f_c)\big] \qquad (5.103)$$
• 67. 5.8 Power Spectral Density
◊ Relation Between the Power Spectral Densities of the Input and Output Random Processes
◊ Let S_Y(f) denote the power spectral density of the output random process Y(t) obtained by passing the random process X(t) through a linear filter of transfer function H(f):
$$S_Y(f) = \int_{-\infty}^{\infty} R_Y(\tau)\,e^{-j2\pi f\tau}\,d\tau = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(\tau_1)\,h(\tau_2)\,R_X(\tau - \tau_1 + \tau_2)\,e^{-j2\pi f\tau}\,d\tau_1\,d\tau_2\,d\tau \quad \text{(using 5.90)}$$
◊ Let τ0 = τ − τ1 + τ2, i.e., τ = τ0 + τ1 − τ2:
$$S_Y(f) = \int_{-\infty}^{\infty} h(\tau_1)\,e^{-j2\pi f\tau_1}\,d\tau_1\int_{-\infty}^{\infty} h(\tau_2)\,e^{j2\pi f\tau_2}\,d\tau_2\int_{-\infty}^{\infty} R_X(\tau_0)\,e^{-j2\pi f\tau_0}\,d\tau_0$$
$$= H(f)\,H^*(f)\,S_X(f) = |H(f)|^2\,S_X(f) \qquad (5.106)$$
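Equation (5.106) can be verified by filtering white noise and comparing the measured output PSD against |H(f)|². A minimal sketch (not from the slides; the FIR filter, its cutoff, and the Welch parameters are arbitrary choices, and the factor of 2 accounts for SciPy's one-sided PSD convention):

```python
import numpy as np
from scipy import signal

# Minimal sketch of Eq. (5.106): white noise (flat S_X = 1, two-sided) through
# an FIR low-pass filter should give S_Y(f) = |H(f)|^2 * S_X(f).
rng = np.random.default_rng(9)
x = rng.normal(0.0, 1.0, size=2_000_000)

h = signal.firwin(numtaps=64, cutoff=0.25)   # passband up to f ≈ 0.125 cyc/sample
y = signal.lfilter(h, 1.0, x)

f, S_y = signal.welch(y, nperseg=4096)       # one-sided PSD estimate (fs = 1)
_, H = signal.freqz(h, worN=2 * np.pi * f)   # H(f) at the same frequencies

band = (f > 0.01) & (f < 0.1)                # stay inside the passband
print(np.mean(S_y[band] / (2 * np.abs(H[band])**2)))   # ≈ 1
```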
• 68. 5.8 Power Spectral Density
◊ Example 5.13 Comb Filter
◊ Consider the filter of Figure (a), consisting of a delay line and a summing device. We wish to evaluate the power spectral density of the filter output Y(t).
• 69. 5.8 Power Spectral Density
◊ Example 5.13 Comb Filter (continued)
◊ The transfer function of this filter is
$$H(f) = 1 - \exp(-j2\pi fT) = 1 - \cos(2\pi fT) + j\sin(2\pi fT)$$
$$|H(f)|^2 = \big[1 - \cos(2\pi fT)\big]^2 + \sin^2(2\pi fT) = 2\big[1 - \cos(2\pi fT)\big] = 4\sin^2(\pi fT)$$
◊ Because of the periodic form of this frequency response (Fig. (b)), the filter is sometimes referred to as a comb filter.
◊ The power spectral density of the filter output is
$$S_Y(f) = |H(f)|^2\,S_X(f) = 4\sin^2(\pi fT)\,S_X(f)$$
◊ If fT is very small, then $\sin(\pi fT) \approx \pi fT$ and
$$S_Y(f) \approx 4\pi^2 f^2 T^2\,S_X(f) \qquad (5.107)$$
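A minimal numerical check of the comb-filter response (not from the slides; T and the test frequencies are arbitrary):

```python
import numpy as np

# Minimal sketch: |H(f)|^2 evaluated directly from H(f) = 1 - exp(-j*2*pi*f*T),
# compared with the closed form 4*sin^2(pi*f*T) and the small-fT approximation.
T = 1.0
f = np.array([0.01, 0.05, 0.25, 0.5, 1.0])

H = 1 - np.exp(-2j * np.pi * f * T)
print(np.abs(H)**2)                   # direct evaluation
print(4 * np.sin(np.pi * f * T)**2)   # closed form: same values, nulls at k/T
print(4 * (np.pi * f * T)**2)         # accurate only where f*T << 1
```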
• 70. 5.9 Gaussian Process
◊ A random variable Y is defined by
$$Y = \int_{0}^{T} g(t)\,X(t)\,dt$$
◊ We refer to Y as a linear functional of X(t). (By contrast, $Y = \sum_{i=1}^{N} a_i X_i$, where the a_i are constants and the X_i are random variables, is a linear function of the X_i.)
◊ If the weighting function g(t) is such that the mean-square value of the random variable Y is finite, and if the random variable Y is a Gaussian-distributed random variable for every g(t) in this class of functions, then the process X(t) is said to be a Gaussian process.
◊ In other words, the process X(t) is a Gaussian process if every linear functional of X(t) is a Gaussian random variable.
◊ The Gaussian process has many properties that make analytic results possible.
◊ The random processes produced by physical phenomena are often such that a Gaussian model is appropriate.