Let ξ denote the random outcome of an experiment. To every such outcome suppose a waveform X(t, ξ) is assigned. The collection of such waveforms forms a stochastic process. The set {ξk} and the time index t can each be continuous or discrete (countably infinite or finite). For fixed ξk ∈ S (the set of all experimental outcomes), X(t, ξ) is a specific time function. For fixed t, X1 = X(t1, ξi) is a random variable.
The ensemble of all such realizations X(t, ξ) over time represents the stochastic process X(t). For example

X(t) = a cos(ω0t + φ), φ ∼ U(0, 2π)

Stochastic processes are everywhere: Brownian motion, stock market fluctuations, and various queuing systems all represent stochastic phenomena. If X(t) is a stochastic process, then for fixed t, X(t) represents a random variable.
Its distribution function is given by

FX(x, t) = P{X(t) ≤ x}

Notice that FX(x, t) depends on t, since for a different t we obtain a different random variable. Further,

fX(x, t) = dFX(x, t)/dx
FX(x1, x2, t1, t2) = P{X(t1) ≤ x1, X(t2) ≤ x2}

fX(x1, x2, t1, t2) = ∂²FX(x1, x2, t1, t2)/∂x1∂x2

represents the second-order density function of the process X(t).
Similarly, fX(x1, x2, · · · , xn, t1, t2, · · · , tn) represents the nth-order density function of the process X(t). Complete specification of the stochastic process X(t) requires knowledge of fX(x1, x2, · · · , xn, t1, t2, · · · , tn) for all ti, i = 1, 2, · · · , n, and for all n, an almost impossible task in reality.
µ(t) = E{X(t)} = ∫_{−∞}^{+∞} x fX(x, t) dx

In general, the mean of a process can depend on the time index t. The autocorrelation function is

RXX(t1, t2) = E{X(t1)X*(t2)} = ∫∫ x1 x2* fX(x1, x2, t1, t2) dx1 dx2

It represents the interrelationship between the random variables X1 = X(t1) and X2 = X(t2) generated from the process X(t).
Properties of the autocorrelation function:

1. RXX(t1, t2) = RXX*(t2, t1) = [E{X(t2)X*(t1)}]*
2. RXX(t, t) = E{|X(t)|²} > 0.
3. Σ_{i=1}^{n} Σ_{j=1}^{n} ai aj* RXX(ti, tj) ≥ 0.

The autocovariance function is

CXX(t1, t2) = RXX(t1, t2) − µX(t1) µX*(t2)
Example:

z = ∫_{−T}^{T} X(t) dt

E[|z|²] = ∫_{−T}^{T} ∫_{−T}^{T} E{X(t1)X*(t2)} dt1 dt2
        = ∫_{−T}^{T} ∫_{−T}^{T} RXX(t1, t2) dt1 dt2
Example:

X(t) = a cos(ω0t + ϕ), ϕ ∼ U(0, 2π).

µX(t) = E{X(t)} = a E{cos(ω0t + ϕ)}
      = a cos ω0t E{cos ϕ} − a sin ω0t E{sin ϕ} = 0,

since

E{cos ϕ} = (1/2π) ∫_{0}^{2π} cos ϕ dϕ = 0 = E{sin ϕ}.

RXX(t1, t2) = a² E{cos(ω0t1 + ϕ) cos(ω0t2 + ϕ)}
            = (a²/2) E{cos ω0(t1 − t2) + cos(ω0(t1 + t2) + 2ϕ)}
            = (a²/2) cos ω0(t1 − t2).
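This result is easy to verify by simulation. A minimal MATLAB sketch (the values of a, ω0, t1 and t2 are assumed for illustration; the expectation is approximated by an average over independent draws of ϕ):

% Monte Carlo check that RXX(t1,t2) = (a^2/2)*cos(w0*(t1 - t2))
a = 2; w0 = 2*pi;                  % assumed amplitude and frequency
t1 = 0.3; t2 = 0.1;                % assumed time instants
phi = 2*pi*rand(1e6, 1);           % phi ~ U(0, 2*pi)
X1 = a*cos(w0*t1 + phi);
X2 = a*cos(w0*t2 + phi);
Rhat  = mean(X1 .* X2);            % ensemble average over realizations
Rtrue = a^2/2 * cos(w0*(t1 - t2));
fprintf('estimated %.4f, theoretical %.4f\n', Rhat, Rtrue);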
Stationary Stochastic Processes

Stationary processes exhibit statistical properties that are invariant to shifts in the time index. Thus, for example, second-order stationarity implies that the statistical properties of the pairs {X(t1), X(t2)} and {X(t1 + c), X(t2 + c)} are the same for any c. Similarly, first-order stationarity implies that the statistical properties of X(ti) and X(ti + c) are the same for any c.
Strict-Sense Stationarity (S.S.S.)

A process is strict-sense stationary if

fX(x1, · · · , xn, t1, · · · , tn) ≡ fX(x1, · · · , xn, t1 + c, · · · , tn + c)   (1)

for any c, where the left side represents the joint density function of the random variables
X1 = X(t1), X2 = X(t2), · · · , Xn = X(tn), and the right side corresponds to the joint density function of the random variables

X1′ = X(t1 + c), X2′ = X(t2 + c), · · · , Xn′ = X(tn + c).

A process X(t) is said to be strict-sense stationary if (1) is true for all ti, i = 1, 2, · · · , n, n = 1, 2, · · · , and for any c.
For a first-order strict-sense stationary process,

fX(x, t) ≡ fX(x, t + c)

for every c. Choosing c = −t gives

fX(x, t) = fX(x)   (2)

so the first-order density of X(t) is independent of t. In that case

E[X(t)] = ∫_{−∞}^{+∞} x fX(x) dx = µ, a constant.
Similarly, for a second-order strict-sense stationary process,

fX(x1, x2, t1, t2) ≡ fX(x1, x2, t1 + c, t2 + c)

For c = −t2:

fX(x1, x2, t1, t2) ≡ fX(x1, x2, t1 − t2)   (3)
Thus the second-order density function of a strict-sense stationary process depends only on the difference of the time indices, t1 − t2 = τ. In that case the autocorrelation function is given by

E{X(t1)X*(t2)} = ∫∫ x1 x2* fX(x1, x2, τ = t1 − t2) dx1 dx2
               = RXX(t1 − t2) = RXX(τ) = RXX*(−τ),

i.e., the autocorrelation of a second-order S.S.S. process depends only on τ.
The conditions for first- and second-order S.S.S. are given in equations (2) and (3); these are usually hard to verify. For this reason we resort to wide-sense stationarity (WSS).
WSS:

1. E{X(t)} = µ
2. E{X(t1)X*(t2)} = RXX(t1 − t2)

S.S.S. always implies WSS, but the converse is not true in general; Gaussian processes are the notable exception.
This follows since, if X(t) is a Gaussian process, then by definition

X1 = X(t1), X2 = X(t2), · · · , Xn = X(tn)

are jointly Gaussian random variables for any t1, t2, · · · , tn, whose joint characteristic function is given by
φX(ω1, ω2, · · · , ωn) = exp( j Σ_{k=1}^{n} µ(tk)ωk − (1/2) Σ_{l=1}^{n} Σ_{k=1}^{n} CXX(tl, tk) ωl ωk )

where CXX(tl, tk) is the (l, k)-th element of the covariance matrix. For a WSS process, CXX(tl, tk) = CXX(tl − tk), and

φX(ω1, ω2, · · · , ωn) = exp( j Σ_{k=1}^{n} µωk − (1/2) Σ_{l=1}^{n} Σ_{k=1}^{n} CXX(tl − tk) ωl ωk )   (4)
Hence, if the set of time indices is shifted by a constant c to generate a new set of jointly Gaussian random variables X1′ = X(t1 + c), X2′ = X(t2 + c), · · · , Xn′ = X(tn + c), then their joint characteristic function is identical to (4). Thus the sets of random variables {Xi}_{i=1}^{n} and {Xi′}_{i=1}^{n} have the same joint PDF for all n and all c, establishing the strict-sense stationarity of Gaussian processes from their wide-sense stationarity.

For Gaussian processes: WSS ⇒ SSS
Why GPs?
1. Easy prior assumption
2. Regression
3. Classification
4. Optimization
5. Unimodal
N-dim Gaussian

f(x) = exp(−(1/2)(x − µ)^H C^{−1}(x − µ)) / √((2π)^N |C|)
N-dim Gaussian (MATLAB example):

C = [0.5 -0.15; -0.15 0.2];             % covariance matrix
mu = [1 -1];                            % mean vector
[X1, X2] = meshgrid(linspace(-2,2,100), linspace(-2,2,100));
X = [X1(:) X2(:)];                      % evaluation grid as N-by-2 points
p = mvnpdf(X, mu, C);                   % requires the Statistics Toolbox
surf(X1, X2, reshape(p, 100, 100));     % surface plot (Figure 1)
figure;
contour(X1, X2, reshape(p, 100, 100));  % contour plot (Figure 2)
Figure 1: Multivariate Gaussian PDF (2D Gaussian PDF, surface plot; axes X1, X2, PDF).
Figure 2: Multivariate Gaussian PDF (2D Gaussian PDF, contour plot; axes X1, X2).
C = V D V^H,  V = [V1 V2]

C = [ 0.5    −0.15
     −0.15    0.2 ]

V = [ 0.3162  −0.9487
     −0.9487   0.3162 ]

D = [ 0.05   0
      0      0.55 ]
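The factorization can be reproduced numerically; a minimal sketch (MATLAB's eig chooses its own ordering and sign convention for the eigenvector columns, so the printed values need not match the layout above):

C = [0.5 -0.15; -0.15 0.2];
[V, D] = eig(C);           % columns of V: eigenvectors; diagonal of D: eigenvalues
disp(V); disp(D);
disp(norm(C - V*D*V'));    % should be ~0, confirming C = V*D*V^H

The eigenvectors give the principal axes of the elliptical contours in Figure 3, and the eigenvalues the variances along those axes.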
Figure 3: Multivariate Gaussian PDF (contour plot showing the principal axes σ1²V1 and σ2²V2; axes X1, X2).
For LTI systems:

Y(t) = ∫_{−∞}^{+∞} h(t − τ) X(τ) dτ = ∫_{−∞}^{+∞} h(τ) X(t − τ) dτ

since any input can be written as

X(t) = ∫_{−∞}^{+∞} X(τ) δ(t − τ) dτ
Y(t) = L{X(t)} = L{ ∫_{−∞}^{+∞} X(τ) δ(t − τ) dτ }
     = ∫_{−∞}^{+∞} L{X(τ) δ(t − τ)} dτ
     = ∫_{−∞}^{+∞} X(τ) L{δ(t − τ)} dτ
     = ∫_{−∞}^{+∞} X(τ) h(t − τ) dτ = ∫_{−∞}^{+∞} h(τ) X(t − τ) dτ
Output statistics:

µY(t) = E{Y(t)} = ∫_{−∞}^{+∞} E{X(τ)} h(t − τ) dτ
      = ∫_{−∞}^{+∞} µX(τ) h(t − τ) dτ = µX(t) ∗ h(t).
RXY(t1, t2) = E{X(t1)Y*(t2)}
            = E{ X(t1) ∫_{−∞}^{+∞} X*(t2 − α) h*(α) dα }
            = ∫_{−∞}^{+∞} E{X(t1)X*(t2 − α)} h*(α) dα
            = ∫_{−∞}^{+∞} RXX(t1, t2 − α) h*(α) dα
            = RXX(t1, t2) ∗ h*(t2)
RYY(t1, t2) = E{Y(t1)Y*(t2)}
            = E{ ∫_{−∞}^{+∞} X(t1 − β) h(β) dβ Y*(t2) }
            = ∫_{−∞}^{+∞} E{X(t1 − β)Y*(t2)} h(β) dβ
            = ∫_{−∞}^{+∞} RXY(t1 − β, t2) h(β) dβ
            = RXY(t1, t2) ∗ h(t1),

and therefore

RYY(t1, t2) = RXX(t1, t2) ∗ h*(t2) ∗ h(t1).
If X(t) is WSS:

µY(t) = µX ∫_{−∞}^{+∞} h(τ) dτ = µX c, a constant.

RXY(t1, t2) = ∫_{−∞}^{+∞} RXX(t1 − t2 + α) h*(α) dα
            = RXX(τ) ∗ h*(−τ) = RXY(τ), τ = t1 − t2
RYY(t1, t2) = ∫_{−∞}^{+∞} RXY(t1 − β − t2) h(β) dβ, τ = t1 − t2
            = RXY(τ) ∗ h(τ) = RYY(τ)

RYY(τ) = RXX(τ) ∗ h*(−τ) ∗ h(τ).
Example: If X(t) is WSS, consider

z = ∫_{−T}^{T} X(t) dt.

Then z = Y(0), where Y(t) = X(t) ∗ h(t) with h(t) = u(t + T) − u(t − T), so

RYY(τ) = RXX(τ) ∗ h(τ) ∗ h(−τ)

h(τ) ∗ h(−τ) = 2T − |τ|, |τ| ≤ 2T

and E[|z|²] = RYY(0) = ∫_{−2T}^{2T} (2T − |τ|) RXX(τ) dτ.
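A quick numerical check of the two expressions for E[|z|²] (a minimal sketch; RXX(τ) = e^{−|τ|} is an assumed example autocorrelation, not one derived above):

% E[|z|^2] as a double integral of RXX(t1 - t2) over [-T,T]^2,
% versus the single integral of (2T - |tau|)*RXX(tau) over [-2T,2T]
R = @(tau) exp(-abs(tau));          % assumed example RXX
T = 1;
I2 = integral2(@(t1,t2) R(t1 - t2), -T, T, -T, T);
I1 = integral(@(tau) (2*T - abs(tau)).*R(tau), -2*T, 2*T);
fprintf('double integral %.6f, single integral %.6f\n', I2, I1);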
Memoryless systems, Y(t) = g{X(t)}:

An S.S.S. input to a memoryless system produces an S.S.S. output.
For a WSS input, nothing can be said about the stationarity of the output in any sense.
A Gaussian input to a memoryless system produces an output that is, in general, not Gaussian.
If X(t) is a zero-mean stationary Gaussian process, and Y(t) = g{X(t)}, where g(·) represents a nonlinear memoryless device, then

RXY(τ) = η RXX(τ), η = E{g′(X)}.

Proof:

RXY(τ) = E{X(t)Y(t − τ)} = E[X(t) g{X(t − τ)}]
       = ∫∫ x1 g(x2) fX1X2(x1, x2) dx1 dx2,   (5)
where X1 = X(t) and X2 = X(t − τ) are jointly Gaussian random variables with

fX1X2(x1, x2) = (1 / 2π√|A|) e^{−x* A^{−1} x / 2}

X = (X1, X2)^T, x = (x1, x2)^T

A = E{X X*} = [ RXX(0)  RXX(τ)
                RXX(τ)  RXX(0) ] = L L*
where L is an upper-triangular factor matrix with positive diagonal entries,

L = [ ℓ11  ℓ12
      0    ℓ22 ].

Consider the transformation

Z = L^{−1} X = (Z1, Z2)^T, z = L^{−1} x = (z1, z2)^T

so that

E{Z Z*} = L^{−1} E{X X*} L*^{−1} = L^{−1} A L*^{−1} = I
and hence Z1, Z2 are zero-mean independent Gaussian random variables. Also

x = L z ⇒ x1 = ℓ11 z1 + ℓ12 z2, x2 = ℓ22 z2

and hence

x* A^{−1} x = z* L* A^{−1} L z = z* z = z1² + z2²

while the Jacobian of the transformation is

|J| = |L^{−1}| = |A|^{−1/2}.
Substituting into (5),

RXY(τ) = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} (ℓ11 z1 + ℓ12 z2) g(ℓ22 z2) (1/|J|)(1/2π|A|^{1/2}) e^{−z1²/2} e^{−z2²/2} dz1 dz2
       = ℓ11 ∫∫ z1 g(ℓ22 z2) fz1(z1) fz2(z2) dz1 dz2 + ℓ12 ∫∫ z2 g(ℓ22 z2) fz1(z1) fz2(z2) dz1 dz2
       = ℓ11 ∫_{−∞}^{+∞} z1 fz1(z1) dz1 ∫_{−∞}^{+∞} g(ℓ22 z2) fz2(z2) dz2
         + ℓ12 ∫_{−∞}^{+∞} z2 g(ℓ22 z2) fz2(z2) dz2,   fz2(z2) = (1/√(2π)) e^{−z2²/2}.

The first term vanishes since ∫ z1 fz1(z1) dz1 = E{Z1} = 0. Substituting u = ℓ22 z2 in the second term,

RXY(τ) = (ℓ12/ℓ22²) ∫_{−∞}^{+∞} u g(u) (1/√(2π)) e^{−u²/2ℓ22²} du.
This can be rewritten as

RXY(τ) = ℓ12 ℓ22 ∫_{−∞}^{+∞} g(u) (u/ℓ22²) fu(u) du,   fu(u) = (1/√(2πℓ22²)) e^{−u²/2ℓ22²},

and since (u/ℓ22²) fu(u) = −dfu(u)/du = −fu′(u),

RXY(τ) = −RXX(τ) ∫_{−∞}^{+∞} g(u) fu′(u) du,

because A = LL* gives ℓ12 ℓ22 = RXX(τ). Hence, integrating by parts,
RXY(τ) = RXX(τ) { −g(u) fu(u)|_{−∞}^{+∞} + ∫_{−∞}^{+∞} g′(u) fu(u) du }
       = RXX(τ) E{g′(X)} = η RXX(τ),

which is the desired result, with η = E{g′(X)}. Thus if the input to a memoryless device is stationary Gaussian, the cross-correlation function between the input and the output is proportional to the input autocorrelation function.
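The result can be checked by simulation for a concrete nonlinearity. A minimal sketch with g(x) = x³, for which η = E{g′(X)} = 3E{X²}; the variance s2 and covariance c of the pair X1 = X(t), X2 = X(t − τ) are assumed values:

s2 = 1.5; c = 0.8;                     % RXX(0) and RXX(tau), assumed
L = chol([s2 c; c s2], 'lower');
Z = L * randn(2, 1e6);                 % jointly Gaussian samples of (X1, X2)
Rxy_hat = mean(Z(1,:) .* Z(2,:).^3);   % E{X(t) g(X(t - tau))}
eta = 3 * s2;                          % E{g'(X)} for g(x) = x^3
fprintf('Rxy %.4f vs eta*Rxx %.4f\n', Rxy_hat, eta * c);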
LTI systems, h(t):

A WSS input to an LTI system produces a WSS output.
An S.S.S. input to an LTI system produces an S.S.S. output.
A Gaussian input to an LTI system produces a Gaussian output.
White noise: W(t) is said to be a white noise process if

RWW(t1, t2) = q(t1) δ(t2 − t1)

W(t) is said to be WSS white noise if E{W(t)} = constant and

RWW(t1, t2) = q δ(t1 − t2) = q δ(τ).
If W(t) is also a Gaussian process (a white Gaussian process), then all of its samples are independent random variables. (Why?)
White noise W(t) through an LTI system h(t) produces colored noise N(t) = h(t) ∗ W(t).
E[N(t)] = µW ∫_{−∞}^{+∞} h(τ) dτ, a constant

RNN(τ) = q δ(τ) ∗ h*(−τ) ∗ h(τ) = q h*(−τ) ∗ h(τ) = q ρ(τ)

ρ(τ) = h(τ) ∗ h*(−τ) = ∫_{−∞}^{+∞} h(α) h*(α + τ) dα.
Thus white noise passed through an LTI system yields a (colored) noise process.

Note: white noise need not be Gaussian. “White” and “Gaussian” are two different concepts!
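A discrete-time illustration of RNN = qρ (a minimal sketch; the FIR impulse response h and the level q are assumed examples):

q = 2; h = [1 0.5 0.25];               % assumed noise level and filter
W = sqrt(q) * randn(1, 1e6);           % white noise, RWW(n) = q*delta(n)
N = filter(h, 1, W);                   % colored output
rho = conv(h, fliplr(h));              % h(n)*h(-n) at lags -2..2
Rhat = zeros(1, 3);
for n = 0:2                            % sample autocorrelation, lags 0..2
    Rhat(n+1) = mean(N(1:end-n) .* N(1+n:end));
end
disp([Rhat; q * rho(3:5)]);            % estimated vs q*rho at lags 0, 1, 2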
Figure 4: A realization X(t) versus t, with upcrossings (♦) and downcrossings marked.
Upcrossings and downcrossings of a stationary Gaussian process: Consider a zero-mean stationary Gaussian process X(t) with autocorrelation function RXX(τ). An upcrossing over the mean value occurs whenever the realization X(t) passes through zero with positive slope. Let ρ∆t represent the probability of such an upcrossing in the interval (t, t + ∆t). We wish to determine ρ.
Since X(t) is a stationary Gaussian process, its derivative process X′(t) is also zero-mean stationary Gaussian. From

RXX(τ) = ∫_{−∞}^{∞} SXX(ω) e^{jωτ} dω/2π

R′′XX(τ) = ∫_{−∞}^{∞} (jω)² SXX(ω) e^{jωτ} dω/2π

and SX′X′(ω) = |jω|² SXX(ω), it follows that

RX′X′(τ) = −R′′XX(τ)
Further, X(t) and X′(t) are jointly Gaussian stationary processes, and since

RXX′(τ) = −dRXX(τ)/dτ,

RXX′(−τ) = −dRXX(−τ)/d(−τ) = dRXX(τ)/dτ = −RXX′(τ)

which for τ = 0 gives

RXX′(0) = 0 ⇒ E[X(t)X′(t)] = 0
Hence the jointly Gaussian zero-mean random variables

X1 = X(t), X2 = X′(t)

are uncorrelated, and hence independent, with variances

σ1² = RXX(0) and σ2² = RX′X′(0) = −R′′XX(0) > 0

fX1X2(x1, x2) = fX(x1) fX(x2) = (1 / 2πσ1σ2) e^{−(x1²/2σ1² + x2²/2σ2²)}.

To determine the upcrossing rate ρ,
we argue as follows: in an interval (t, t + ∆t) the realization moves from X(t) = X1 to X(t + ∆t) = X(t) + X′(t)∆t = X1 + X2∆t, and hence the realization crosses the zero level somewhere in that interval if

X1 < 0, X2 > 0, and X(t + ∆t) = X1 + X2∆t > 0,

i.e., −X2∆t < X1 < 0. Hence the probability of an upcrossing in (t, t + ∆t) is given by
ρ∆t = ∫_{x2=0}^{∞} ∫_{x1=−x2∆t}^{0} fX1X2(x1, x2) dx1 dx2
    = ∫_{0}^{∞} fX2(x2) dx2 ∫_{−x2∆t}^{0} fX1(x1) dx1.   (6)

Differentiating both sides of (6) with respect to ∆t, we get

ρ = ∫_{0}^{∞} fX2(x2) x2 fX1(−x2∆t) dx2

and letting ∆t → 0, we have
ρ = ∫_{0}^{∞} x2 fX2(x2) fX1(0) dx2 = (1/√(2πRXX(0))) ∫_{0}^{∞} x2 fX2(x2) dx2
  = (1/√(2πRXX(0))) (σ2/√(2π)) = (1/2π) √(−R′′XX(0)/RXX(0)),

since ∫_{0}^{∞} x2 fX2(x2) dx2 = σ2/√(2π). There is an equal probability of downcrossings, and hence the total probability of crossing the zero line in an interval (t, t + ∆t) equals ρ0∆t, where
ρ0 = (1/π) √(−R′′XX(0)/RXX(0)) > 0.
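The crossing-rate formula can be checked by simulation. A minimal sketch: X(t) is built as a sum of M cosines with random frequencies ωk ∼ N(0, sw²) and uniform phases, an assumed construction for which RXX(τ) = e^{−sw²τ²/2} and hence ρ0 = sw/π; by the central limit theorem X(t) is approximately Gaussian for large M:

M = 200; sw = 2*pi;                       % assumed parameters
w = sw * randn(M, 1); phi = 2*pi*rand(M, 1);
t = 0:1e-3:50;                            % time grid
X = sqrt(2/M) * sum(cos(w*t + phi*ones(1, numel(t))), 1);
nc = sum(abs(diff(sign(X))) == 2);        % count zero crossings
fprintf('measured %.2f, predicted %.2f crossings per unit time\n', ...
        nc/t(end), sw/pi);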
A discrete-time stochastic process Xn = X(nT) is a sequence of random variables. The mean, autocorrelation and autocovariance functions of a discrete-time process are given by

µn = E{X(nT)}

R(n1, n2) = E{X(n1T) X*(n2T)}

C(n1, n2) = R(n1, n2) − µn1 µ*n2
WSS:

E{X(nT)} = µ, a constant

E[X{(k + n)T} X*{kT}] = R(n) = rn = r*−n
The positive-definite property of the autocorrelation sequence can be expressed in terms of certain Hermitian-Toeplitz matrices as follows:

Theorem: A sequence {rn}, −∞ < n < ∞, forms an autocorrelation sequence of a wide-sense stationary stochastic process if and only if every Hermitian-Toeplitz matrix Tn given by
Tn = [ r0    r1     r2    · · ·  rn
       r1*   r0     r1    · · ·  rn−1
       ⋮     ⋮      ⋱            ⋮
       rn*   rn−1*  · · ·  r1*   r0 ] = Tn*

is nonnegative (positive) definite for n = 0, 1, 2, · · ·
For any vector a = (a0, a1, · · · , an)^T,

a* Tn a = Σ_{i=0}^{n} Σ_{k=0}^{n} ai ak* rk−i
        = Σ_{i=0}^{n} Σ_{k=0}^{n} ai ak* E{X(kT) X*(iT)}
        = E{ | Σ_{k=0}^{n} ak* X(kT) |² } ≥ 0.
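The condition is easy to probe numerically. A minimal sketch using rn = a^{|n|} (the AR(1) autocorrelation of Eq. (7) below) as an assumed example sequence:

a = 0.7; n = 8;
r = a.^(0:n);              % r0, r1, ..., rn
Tn = toeplitz(r);          % Hermitian (here symmetric) Toeplitz matrix
disp(min(eig(Tn)));        % nonnegative-definite: all eigenvalues >= 0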
Systems and discrete WSS processes:

RXY(n) = RXX(n) ∗ h*(−n)
RYY(n) = RXY(n) ∗ h(n)
RYY(n) = RXX(n) ∗ h*(−n) ∗ h(n).
ARMA processes (input W(n), output X(n)):

X(n) = −Σ_{k=1}^{p} ak X(n − k) + Σ_{k=0}^{q} bk W(n − k),

or, in the z-domain,

X(z) Σ_{k=0}^{p} ak z^{−k} = W(z) Σ_{k=0}^{q} bk z^{−k}, a0 ≡ 1
H(z) = Σ_{k=0}^{∞} h(k) z^{−k} = X(z)/W(z)
     = (b0 + b1 z^{−1} + b2 z^{−2} + · · · + bq z^{−q}) / (1 + a1 z^{−1} + a2 z^{−2} + · · · + ap z^{−p})
     = B(z)/A(z)

X(n) = Σ_{k=0}^{∞} h(n − k) W(k)
The transfer function H(z) is rational, with p poles and q zeros that determine the model order of the underlying system. X(n) undergoes regression over p of its previous values, and at the same time a moving average of the input over the (q + 1) values W(n), W(n − 1), · · · , W(n − q) is added to it, thus generating an Auto-Regressive Moving Average (ARMA(p, q)) process X(n).
Generally the input {W(n)} represents a sequence of uncorrelated random variables of zero mean and constant variance σW², so that

RWW(n) = σW² δ(n).

If in addition {W(n)} is normally distributed, then the output {X(n)} also represents a strict-sense stationary normal process.

If q = 0, we have an AR(p) process (an all-pole process); if p = 0, an MA(q) process.
AR(1) process: An AR(1) process has the form

X(n) = a X(n − 1) + W(n)

H(z) = 1/(1 − a z^{−1}) = Σ_{n=0}^{∞} aⁿ z^{−n}

h(n) = aⁿ, |a| < 1
RXX(n) = σW² δ(n) ∗ h(−n) ∗ h(n) = σW² Σ_{k=0}^{∞} a^{|n|+k} a^k = σW² a^{|n|} / (1 − a²)
The normalized autocorrelation sequence:

ρX(n) = RXX(n)/RXX(0) = a^{|n|}, |n| ≥ 0.   (7)
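A minimal simulation check of (7) (the value of a is assumed; filter implements the AR(1) recursion):

a = 0.8;
W = randn(1, 1e6);                      % unit-variance white input
X = filter(1, [1 -a], W);               % X(n) = a*X(n-1) + W(n)
rho_hat = zeros(1, 6);
for k = 0:5                             % sample autocorrelation, lags 0..5
    rho_hat(k+1) = mean(X(1:end-k) .* X(1+k:end));
end
rho_hat = rho_hat / rho_hat(1);         % normalize by the lag-0 value
disp([rho_hat; a.^(0:5)]);              % estimated vs a^|n|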
It is instructive to compare the AR(1) model discussed above with the same model plus a superimposed random component, which may be an error term associated with observing a first-order AR process X(n). Thus

Y(n) = X(n) + V(n), X(n) ∼ AR(1), V(n) ∼ white noise   (8)
RYY(n) = RXX(n) + RVV(n) = RXX(n) + σV² δ(n)
       = σW² a^{|n|}/(1 − a²) + σV² δ(n)
ρY(n) = RYY(n)/RYY(0) = { 1,         n = 0
                          c a^{|n|}, n = ±1, ±2, · · ·   (9)

c = σW² / (σW² + σV²(1 − a²)) < 1.
Eqs. (7) and (9) demonstrate the effect of superimposing an error sequence on an AR(1) model. For non-zero lags, the autocorrelation of the observed sequence {Y(n)} is reduced by a constant factor compared to that of the original process {X(n)}. From (8), the superimposed error sequence V(n) only affects the corresponding term in Y(n) (term by term). However, a particular term in the “input sequence” W(n) affects X(n) and Y(n) as well as all subsequent observations.
AR(2) Process: An AR(2) process has the form (W(n) a zero-mean white process)

X(n) = a1 X(n − 1) + a2 X(n − 2) + W(n)

H(z) = Σ_{n=0}^{∞} h(n) z^{−n} = 1/(1 − a1 z^{−1} − a2 z^{−2}) = b1/(1 − λ1 z^{−1}) + b2/(1 − λ2 z^{−1})

h(0) = 1, h(1) = a1, h(n) = a1 h(n − 1) + a2 h(n − 2), n ≥ 2
h(n) = b1 λ1^n + b2 λ2^n, n ≥ 0

b1 + b2 = 1, b1λ1 + b2λ2 = a1,
λ1 + λ2 = a1, λ1λ2 = −a2.

Stability condition: |λ1| < 1, |λ2| < 1.
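A short numerical companion (a sketch with assumed coefficients a1, a2 chosen inside the stability region):

a1 = 1.2; a2 = -0.6;                   % assumed AR(2) coefficients
lam = roots([1 -a1 -a2]);              % poles lambda_1,2 of H(z)
disp(abs(lam));                        % stability: both moduli < 1
h = filter(1, [1 -a1 -a2], [1 zeros(1, 19)]);  % h(0), ..., h(19)
% check the recursion h(n) = a1*h(n-1) + a2*h(n-2), n >= 2:
disp(max(abs(h(3:end) - a1*h(2:end-1) - a2*h(1:end-2))));  % ~0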
RXX(n) = E{X(n + m) X*(m)}
       = E{[a1 X(n + m − 1) + a2 X(n + m − 2)] X*(m)} + E{W(n + m) X*(m)}
       = a1 RXX(n − 1) + a2 RXX(n − 2),

because W(n + m) and X(m) are uncorrelated (for n ≥ 1):

E{W(n + m) X*(m)} = 0

ρX(n) = RXX(n)/RXX(0) = a1 ρX(n − 1) + a2 ρX(n − 2).
Alternatively,

RXX(n) = RWW(n) ∗ h*(−n) ∗ h(n) = σW² h*(−n) ∗ h(n)
       = σW² Σ_{k=0}^{∞} h*(n + k) h(k)
       = σW² [ |b1|²(λ1*)^n/(1 − |λ1|²) + b1*b2(λ1*)^n/(1 − λ1*λ2) + b1b2*(λ2*)^n/(1 − λ1λ2*) + |b2|²(λ2*)^n/(1 − |λ2|²) ]
The normalized autocorrelation function:

ρX(n) = RXX(n)/RXX(0) = c1 (λ1*)^n + c2 (λ2*)^n

where c1 and c2 are the appropriate constants.
An ARMA(p, q) system has only p + q + 1 independent coefficients, and hence its impulse response sequence {hk} must exhibit a corresponding dependence among its terms. An old result due to Kronecker (1881) states that the necessary and sufficient condition for H(z) = Σ_{k=0}^{∞} h(k) z^{−k} to represent a rational (ARMA) system is that
det Hn = 0, n ≥ N (for all sufficiently large n),

where

Hn = [ h0   h1    h2    · · ·  hn
       h1   h2    h3    · · ·  hn+1
       ⋮    ⋮     ⋮            ⋮
       hn   hn+1  hn+2  · · ·  h2n ].   (10)
In the case of rational systems, for all sufficiently large n the Hankel matrices Hn in (10) all have the same rank.
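Kronecker's test is easy to demonstrate numerically (a minimal sketch; the ARMA(2,1) coefficients are assumed, with stable poles):

b = [1 0.4]; a = [1 -1.2 0.6];             % assumed ARMA(2,1) model
h = filter(b, a, [1 zeros(1, 40)]);        % impulse response h0..h40
for n = 1:5
    Hn = hankel(h(1:n+1), h(n+1:2*n+1));   % (n+1)x(n+1) matrix of (10)
    fprintf('n = %d, rank(Hn) = %d\n', n, rank(Hn));
end

Here the rank saturates at 2, and det Hn = 0 for n ≥ 2.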
The necessity part easily follows from

Σ_{n=0}^{∞} h(n) z^{−n} = (Σ_{k=0}^{q} bk z^{−k}) / (Σ_{k=0}^{p} ak z^{−k})

by cross-multiplying and equating coefficients of like powers of z^{−n} for n = 0, 1, 2, · · · . We have
b0 = h0
b1 = h0 a1 + h1
⋮
bq = h0 aq + h1 aq−1 + · · · + hq
0 = h0 aq+i + h1 aq+i−1 + · · · + hq+i−1 a1 + hq+i, i ≥ 1.

For systems with q ≤ p − 1, letting i = p − q, p − q + 1, · · · , 2p − q in the last equation we get
h0 ap + h1 ap−1 + · · · + hp−1 a1 + hp = 0
⋮
hp ap + hp+1 ap−1 + · · · + h2p−1 a1 + h2p = 0,

which gives det Hp = 0, and similarly for i = p − q + 1, · · · :
h0 ap+1 + h1 ap + · · · + hp+1 = 0
h1 ap+1 + h2 ap + · · · + hp+2 = 0
⋮
hp+1 ap+1 + hp+2 ap + · · · + h2p+2 = 0,

so that det Hp+1 = 0 as well.
In the ARMA(p, q) model, the input white noise process W(n) is uncorrelated with its own past sample values as well as with the past values of the system output. This gives

E{W(n) W*(n − k)} = 0, k ≥ 1
E{W(n) X*(n − k)} = 0, k ≥ 1.
ri = E{X(n) X*(n − i)}
   = −Σ_{k=1}^{p} ak E{X(n − k) X*(n − i)} + Σ_{k=0}^{q} bk E{W(n − k) X*(n − i)}
   = −Σ_{k=1}^{p} ak ri−k + Σ_{k=0}^{q} bk E{W(n − k) X*(n − i)}
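In particular, for i > q every term in the remaining sum vanishes (since k ≤ q < i), leaving the recursion ri = −Σ_{k=1}^{p} ak ri−k. A minimal sketch verifying this for an assumed ARMA(2,1) model, computing rn exactly from the impulse response as rn = σW² Σ_k h(k) h(k + n):

a1 = -1.2; a2 = 0.6;                    % assumed AR part (slide's sign convention)
b = [1 0.4];                            % assumed MA part, q = 1
h = filter(b, [1 a1 a2], [1 zeros(1, 500)]);  % long impulse response
r = @(n) sum(h(1:end-n) .* h(1+n:end)); % r_n with sigma_W^2 = 1
for i = 2:4                             % i > q
    fprintf('i=%d: r_i = %.6f, -a1*r_(i-1) - a2*r_(i-2) = %.6f\n', ...
            i, r(i), -a1*r(i-1) - a2*r(i-2));
end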