Let ξ denote the random outcome of an experiment, and suppose that to every such outcome a waveform X(t, ξ) is assigned. The collection of such waveforms forms a stochastic process. The set {ξk} and the time index t can each be continuous or discrete (countably infinite or finite). For fixed ξk ∈ S (the set of all experimental outcomes), X(t, ξ) is a specific time function. For fixed t, X1 = X(t1, ξi) is a random variable.
The ensemble of all such realizations X(t, ξ) over time represents the stochastic process X(t). For example,
X(t) = a cos(ω0t + φ), φ ∼ U(0, 2π).
Stochastic processes are everywhere: Brownian motion, stock market fluctuations, and various queuing systems all represent stochastic phenomena. If X(t) is a stochastic process, then for fixed t, X(t) represents a random variable.
Its distribution function is given by
FX(x, t) = P{X(t) ≤ x}
Notice that FX(x, t) depends on t, since for a different t, we
obtain a different random variable. Further
fX(x, t) = dFX(x, t)/dx
FX(x1, x2, t1, t2) = P{X(t1) ≤ x1, X(t2) ≤ x2}
fX(x1, x2, t1, t2) = ∂²FX(x1, x2, t1, t2) / (∂x1 ∂x2)
represents the second-order density function of the process X(t).
Similarly, fX(x1, x2, · · · , xn, t1, t2, · · · , tn) represents the nth-order density function of the process X(t). Complete specification of the stochastic process X(t) requires the knowledge of fX(x1, x2, · · · , xn, t1, t2, · · · , tn) for all ti, i = 1, 2, · · · , n and for all n, an almost impossible task in reality.
µ(t) = E{X(t)} = ∫_{−∞}^{+∞} x fX(x, t) dx
In general, the mean of a process can depend on the time index t.
RXX(t1, t2) = E{X(t1)X∗(t2)} = ∫∫ x1 x2∗ fX(x1, x2, t1, t2) dx1 dx2
It represents the interrelationship between the random variables X1 = X(t1) and X2 = X(t2) generated from the process X(t).
1. RXX(t1, t2) = R∗XX(t2, t1) = [E{X(t2)X∗(t1)}]∗
2. RXX(t, t) = E{|X(t)|²} > 0.
3. Σ_{i=1}^{n} Σ_{j=1}^{n} ai aj∗ RXX(ti, tj) ≥ 0.
CXX(t1, t2) = RXX(t1, t2) − µX(t1) µ∗X(t2)
Example:
z = ∫_{−T}^{T} X(t) dt
E[|z|²] = ∫_{−T}^{T} ∫_{−T}^{T} E{X(t1)X∗(t2)} dt1 dt2 = ∫_{−T}^{T} ∫_{−T}^{T} RXX(t1, t2) dt1 dt2
Example:
X(t) = a cos(ω0t + ϕ), ϕ ∼ U(0, 2π).
µX(t) = E{X(t)} = a E{cos(ω0t + ϕ)} = a cos ω0t E{cos ϕ} − a sin ω0t E{sin ϕ} = 0,
since E{cos ϕ} = (1/2π) ∫_{0}^{2π} cos ϕ dϕ = 0 = E{sin ϕ}.
RXX(t1, t2) = a² E{cos(ω0t1 + ϕ) cos(ω0t2 + ϕ)}
= (a²/2) E{cos ω0(t1 − t2) + cos(ω0(t1 + t2) + 2ϕ)}
= (a²/2) cos ω0(t1 − t2).
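Both results are easy to check by simulation. A minimal sketch (the amplitude a, frequency w0, time instants, and sample size below are arbitrary choices, not part of the example):

% Monte Carlo check: ensemble mean and autocorrelation of a*cos(w0*t + phi)
a = 2; w0 = 2*pi;                % arbitrary amplitude and frequency
t1 = 0.3; t2 = 0.7;              % two fixed time instants
M = 1e5;                         % number of realizations
phi = 2*pi*rand(M, 1);           % phi ~ U(0, 2*pi)
X1 = a*cos(w0*t1 + phi); X2 = a*cos(w0*t2 + phi);
mean(X1)                         % should be near 0
[mean(X1.*X2)  a^2/2*cos(w0*(t1 - t2))]  % estimate vs. (a^2/2) cos w0(t1 - t2)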
Stationary Stochastic Processes
Stationary processes exhibit statistical properties that are invariant to shifts in the time index. Thus, for example, second-order stationarity implies that the statistical properties of the pairs {X(t1), X(t2)} and {X(t1 + c), X(t2 + c)} are the same for any c. Similarly, first-order stationarity implies that the statistical properties of X(ti) and X(ti + c) are the same for any c.
Strict-Sense Stationary (S.S.S.)
if
fX(x1, · · · , xn, t1, · · · , tn) ≡ fX(x1, · · · , xn, t1 + c, · · · , tn + c) (1)
for any c, where the left side represents the joint density function of the random variables
X1 = X(t1), X2 = X(t2), · · · , Xn = X(tn) and the right side corresponds to the joint density function of the random variables
X′1 = X(t1 + c), X′2 = X(t2 + c), · · · , X′n = X(tn + c).
A process X(t) is said to be strict-sense stationary if (1) is true for all ti, i = 1, 2, · · · , n, n = 1, 2, · · · , and for any c.
For a first-order strict-sense stationary process,
fX(x, t) ≡ fX(x, t + c);
for c = −t,
fX(x, t) = fX(x) (2)
i.e., the first-order density of X(t) is independent of t. In that case
E[X(t)] = ∫_{−∞}^{+∞} x fX(x) dx = µ, a constant.
Similarly, for a 2nd-order strict-sense stationary process
fX(x1, x2, t1, t2) ≡ fX(x1, x2, t1 + c, t2 + c)
For c = −t2:
fX(x1, x2, t1, t2) ≡ fX(x1, x2, t1 − t2) (3)
the second-order density function of a strict-sense stationary process depends only on the difference of the time indices, t1 − t2 = τ. In that case the autocorrelation function is given by
E{X(t1)X∗(t2)} = ∫∫ x1 x2∗ fX(x1, x2, τ = t1 − t2) dx1 dx2 = RXX(t1 − t2) = RXX(τ) = R∗XX(−τ),
i.e., the autocorrelation E{X(t1)X∗(t2)} of a second-order SSS process depends only on τ.
The conditions for first- and second-order SSS are given in equations (2) and (3); they are usually hard to verify. For this reason we resort to wide-sense stationarity (WSS).
WSS:
1. E{X(t)} = µ
2. E{X(t1)X∗(t2)} = RXX(t1 − t2)
SSS always implies WSS, but the converse is not true in general; the notable exception is the Gaussian process.
This follows since, if X(t) is a Gaussian process, then by definition
X1 = X(t1), X2 = X(t2), · · · , Xn = X(tn)
are jointly Gaussian random variables for any t1, t2, · · · , tn, whose joint characteristic function is given by
φX(ω1, ω2, · · · , ωn) = exp( j Σ_{k=1}^{n} µ(tk)ωk − (1/2) Σ_{l=1}^{n} Σ_{k=1}^{n} CXX(tl, tk) ωl ωk )
CXX(tl, tk) is the (l, k) element of the covariance matrix; for a WSS process CXX(tl, tk) = CXX(tl − tk), so that
φX(ω1, ω2, · · · , ωn) = exp( j Σ_{k=1}^{n} µωk − (1/2) Σ_{l=1}^{n} Σ_{k=1}^{n} CXX(tl − tk) ωl ωk ) (4)
and hence if the set of time indices is shifted by a constant c to generate a new set of jointly Gaussian random variables X′1 = X(t1 + c), X′2 = X(t2 + c), · · · , X′n = X(tn + c), then their joint characteristic function is identical to (4). Thus the sets of random variables {Xi}_{i=1}^{n} and {X′i}_{i=1}^{n} have the same joint PDF for all n and all c, establishing the SSS of a Gaussian process from its WSS property.
For Gaussian processes: WSS ⇒ SSS
Why GPs?
1. Easy prior assumption
2. Regression
3. Classification
4. Optimization
5. Unimodal
N-dim Gaussian
fX(x) = exp(−(1/2)(x − µ)^H C^{−1} (x − µ)) / √((2π)^N |C|)
N-dim Gaussian
% 2-D Gaussian PDF evaluated on a grid (mvnpdf is in the
% Statistics and Machine Learning Toolbox)
C = [.5 -.15; -.15 .2]; mu = [1 -1];          % covariance matrix and mean vector
[X1, X2] = meshgrid(linspace(-2,2,100), linspace(-2,2,100));
X = [X1(:) X2(:)];                            % grid points as rows of X
p = mvnpdf(X, mu, C);                         % density at each grid point
surf(X1, X2, reshape(p,100,100));             % surface plot
figure; contour(X1, X2, reshape(p,100,100));  % contour plot (separate figure)
Figure 1: Multivariate Gaussian PDF, surface plot (axes X1 and X2; vertical axis: PDF).
Figure 2: Multivariate Gaussian PDF, contour plot (axes X1 and X2).
C = V D V^H, V = [V1 V2]
C = [ 0.5 −0.15; −0.15 0.2 ]
V = [ 0.3162 −0.9487; −0.9487 0.3162 ]
D = [ 0.05 0; 0 0.55 ]
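The decomposition itself is one line in MATLAB; a minimal sketch (for a symmetric C, eig returns the eigenvalues on the diagonal of D and the corresponding unit eigenvectors as the columns of V):

% Eigendecomposition of the covariance matrix: C = V*D*V' for symmetric C
C = [.5 -.15; -.15 .2];
[V, D] = eig(C);        % columns of V are the principal directions
norm(C - V*D*V')        % ~ 0 up to round-off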
Figure 3: Multivariate Gaussian PDF, contour plot with the principal axes σ1²V1 and σ2²V2 overlaid (axes X1 and X2).
For LTI systems:
Y(t) = ∫_{−∞}^{+∞} h(t − τ) X(τ) dτ = ∫_{−∞}^{+∞} h(τ) X(t − τ) dτ
X(t) = ∫_{−∞}^{+∞} X(τ) δ(t − τ) dτ
Y(t) = L{X(t)} = L{ ∫_{−∞}^{+∞} X(τ) δ(t − τ) dτ }
= ∫_{−∞}^{+∞} L{X(τ) δ(t − τ)} dτ
= ∫_{−∞}^{+∞} X(τ) L{δ(t − τ)} dτ
= ∫_{−∞}^{+∞} X(τ) h(t − τ) dτ = ∫_{−∞}^{+∞} h(τ) X(t − τ) dτ
Output statistics:
µY(t) = E{Y(t)} = ∫_{−∞}^{+∞} E{X(τ)} h(t − τ) dτ
= ∫_{−∞}^{+∞} µX(τ) h(t − τ) dτ = µX(t) ∗ h(t).
RXY(t1, t2) = E{X(t1)Y∗(t2)}
= E{ X(t1) ∫_{−∞}^{+∞} X∗(t2 − α) h∗(α) dα }
= ∫_{−∞}^{+∞} E{X(t1)X∗(t2 − α)} h∗(α) dα
= ∫_{−∞}^{+∞} RXX(t1, t2 − α) h∗(α) dα
= RXX(t1, t2) ∗ h∗(t2)
RY Y(t1, t2) = E{Y(t1)Y∗(t2)}
= E{ ∫_{−∞}^{+∞} X(t1 − β) h(β) dβ · Y∗(t2) }
= ∫_{−∞}^{+∞} E{X(t1 − β)Y∗(t2)} h(β) dβ
= ∫_{−∞}^{+∞} RXY(t1 − β, t2) h(β) dβ
= RXY(t1, t2) ∗ h(t1),
so that
RY Y(t1, t2) = RXX(t1, t2) ∗ h∗(t2) ∗ h(t1).
If X(t) is WSS:
µY(t) = µX ∫_{−∞}^{+∞} h(τ) dτ = µX c, a constant.
RXY(t1, t2) = ∫_{−∞}^{+∞} RXX(t1 − t2 + α) h∗(α) dα
= RXX(τ) ∗ h∗(−τ) = RXY(τ), τ = t1 − t2
RY Y(t1, t2) = ∫_{−∞}^{+∞} RXY(t1 − β − t2) h(β) dβ = RXY(τ) ∗ h(τ) = RY Y(τ), τ = t1 − t2
RY Y(τ) = RXX(τ) ∗ h∗(−τ) ∗ h(τ).
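The discrete-time analogue of this relation is easy to verify numerically. A minimal sketch (the impulse response h, noise power q, and sample size are arbitrary assumptions; xcorr is in the Signal Processing Toolbox):

% Pass WSS white noise through an FIR filter and compare the estimated
% output autocorrelation with q * (h(n) convolved with h(-n)) (real h)
N = 1e6; q = 1;                        % sample size and white-noise power
W = sqrt(q)*randn(1, N);               % R_WW(n) = q*delta(n)
h = [1 0.5 0.25];                      % arbitrary FIR impulse response
Y = filter(h, 1, W);                   % LTI output
[Ryy, lags] = xcorr(Y, 2, 'biased');   % estimated R_YY(n), |n| <= 2
[Ryy; q*conv(h, fliplr(h))]            % estimate vs. theory at lags -2..2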
Example: If X(t) is WSS and
z = ∫_{−T}^{T} X(t) dt,
define Y(t) = X(t) ∗ h(t) with h(t) = u(t + T) − u(t − T), so that z = Y(0) and E[|z|²] = RY Y(0). Then
RY Y(τ) = RXX(τ) ∗ h(τ) ∗ h(−τ),
h(τ) ∗ h(−τ) = 2T − |τ|, |τ| ≤ 2T.
Memoryless system Y(t) = g{X(t)}: SSS input → SSS output.
Memoryless system Y(t) = g{X(t)}: WSS input → nothing can be said about the stationarity of the output in any sense.
Memoryless system Y(t) = g{X(t)}: Gaussian input → output not Gaussian.
If X(t) is a zero mean stationary Gaussian process, and Y(t) = g{X(t)}, where g(·) represents a nonlinear memoryless device, then
RXY(τ) = ηRXX(τ), η = E{g′(X)}.
Proof:
RXY(τ) = E{X(t)Y(t − τ)} = E[X(t) g{X(t − τ)}] = ∫∫ x1 g(x2) fX1X2(x1, x2) dx1 dx2, (5)
where X1 = X(t) and X2 = X(t − τ) are jointly Gaussian RVs with
fX1X2(x1, x2) = (1/(2π√|A|)) e^{−x∗A^{−1}x/2}
X = (X1, X2)^T, x = (x1, x2)^T
A = E{X X∗} = [ RXX(0) RXX(τ); RXX(τ) RXX(0) ] = LL∗
where L is an upper triangular factor matrix with positive diagonal entries,
L = [ ℓ11 ℓ12; 0 ℓ22 ].
Consider the transformation
Z = L^{−1}X = (Z1, Z2)^T, z = L^{−1}x = (z1, z2)^T
so that
E{Z Z∗} = L^{−1} E{X X∗} L^{∗−1} = L^{−1} A L^{∗−1} = I
and hence Z1, Z2 are zero mean independent Gaussian random variables. Also
x = Lz ⇒ x1 = ℓ11z1 + ℓ12z2, x2 = ℓ22z2
and hence
x∗A^{−1}x = z∗L∗A^{−1}Lz = z∗z = z1² + z2²;
the Jacobian of the transformation is
|J| = |L^{−1}| = |A|^{−1/2}.
RXY(τ) = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} (ℓ11z1 + ℓ12z2) g(ℓ22z2) (1/|J|) (1/(2π|A|^{1/2})) e^{−z1²/2} e^{−z2²/2} dz1 dz2
= ℓ11 ∫∫ z1 g(ℓ22z2) fz1(z1) fz2(z2) dz1 dz2 + ℓ12 ∫∫ z2 g(ℓ22z2) fz1(z1) fz2(z2) dz1 dz2
= ℓ11 ∫_{−∞}^{+∞} z1 fz1(z1) dz1 ∫_{−∞}^{+∞} g(ℓ22z2) fz2(z2) dz2
+ ℓ12 ∫_{−∞}^{+∞} z2 g(ℓ22z2) fz2(z2) dz2, fz2(z2) = (1/√(2π)) e^{−z2²/2}.
The first term vanishes since E{Z1} = ∫ z1 fz1(z1) dz1 = 0, and substituting u = ℓ22z2 in the second term gives
= (ℓ12/ℓ22²) ∫_{−∞}^{+∞} u g(u) (1/√(2π)) e^{−u²/(2ℓ22²)} du.
This gives
RXY(τ) = ℓ12ℓ22 ∫_{−∞}^{+∞} g(u) (u/ℓ22²) fu(u) du, fu(u) = (1/√(2πℓ22²)) e^{−u²/(2ℓ22²)},
and since (u/ℓ22²) fu(u) = −dfu(u)/du = −f′u(u),
RXY(τ) = −RXX(τ) ∫_{−∞}^{+∞} g(u) f′u(u) du,
since A = LL∗ gives ℓ12ℓ22 = RXX(τ). Hence, integrating by parts,
RXY(τ) = RXX(τ){ −g(u)fu(u)|_{−∞}^{+∞} + ∫_{−∞}^{+∞} g′(u) fu(u) du } = RXX(τ) E{g′(X)} = ηRXX(τ).
This is the desired result, where η = E{g′(X)}. Thus if the input to a memoryless device is stationary Gaussian, the cross-correlation function between the input and the output is proportional to the input autocorrelation function.
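A quick Monte Carlo check of this result; a minimal sketch, with g(x) = x³ (so η = E{3X²} = 3RXX(0)) and an AR(1)-filtered Gaussian sequence standing in for the stationary Gaussian input, both arbitrary assumptions:

% Verify R_XY(k) = eta * R_XX(k) for a memoryless nonlinearity g(x) = x^3
N = 1e6;
X = filter(1, [1 -0.9], randn(1, N)); % stationary Gaussian (AR(1)) input
X = X(1000:end);                      % discard the filter transient
Y = X.^3;                             % memoryless device
eta = mean(3*X.^2);                   % eta = E{g'(X)}
for k = 0:3                           % compare at a few lags
  Rxy = mean(X(1+k:end).*Y(1:end-k));
  Rxx = mean(X(1+k:end).*X(1:end-k));
  fprintf('lag %d: R_XY = %8.3f, eta*R_XX = %8.3f\n', k, Rxy, eta*Rxx);
end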
LTI system h(t): WSS input → WSS output.
LTI system h(t): SSS input → SSS output.
LTI system h(t): Gaussian input → Gaussian output.
A process W(t) is said to be a white noise process if
RWW(t1, t2) = q(t1) δ(t2 − t1).
W(t) is said to be WSS white noise if E{W(t)} = constant and
RWW(t1, t2) = qδ(t1 − t2) = qδ(τ).
If W(t) is also a Gaussian process (a white Gaussian process), then all of its samples are independent random variables. (Why?)
LTI system h(t): white noise W(t) in → colored noise N(t) = h(t) ∗ W(t) out.
E[N(t)] = µW ∫_{−∞}^{+∞} h(τ) dτ = a constant
RNN(τ) = qδ(τ) ∗ h∗(−τ) ∗ h(τ) = q h∗(−τ) ∗ h(τ) = qρ(τ)
ρ(τ) = h(τ) ∗ h∗(−τ) = ∫_{−∞}^{+∞} h(α) h∗(α + τ) dα.
Thus passing a white noise process through an LTI system produces a (colored) noise process.
Note: White noise need not be Gaussian. "White" and "Gaussian" are two different concepts!
Figure 4: A realization X(t) versus t, with upcrossings (♦) and downcrossings (□) of the zero level marked.
Upcrossings and downcrossings of a stationary Gaussian process: Consider a zero mean stationary Gaussian process X(t) with autocorrelation function RXX(τ). An upcrossing over the mean value occurs whenever the realization X(t) passes through zero with positive slope. Let ρ∆t represent the probability of such an upcrossing in the interval (t, t + ∆t). We wish to determine ρ.
Since X(t) is a stationary Gaussian process, its derivative process X′(t) is also zero mean stationary Gaussian, with
RXX(τ) = ∫_{−∞}^{∞} SXX(ω) e^{jωτ} dω/2π
R″XX(τ) = ∫_{−∞}^{∞} (jω)² SXX(ω) e^{jωτ} dω/2π
SX′X′(ω) = |jω|² SXX(ω) ⇒ RX′X′(τ) = −R″XX(τ)
Further, X(t) and X′(t) are jointly Gaussian stationary processes, and since
RXX′(τ) = −dRXX(τ)/dτ,
RXX′(−τ) = −dRXX(−τ)/d(−τ) = dRXX(τ)/dτ = −RXX′(τ),
which for τ = 0 gives
RXX′(0) = 0 ⇒ E[X(t)X′(t)] = 0
the jointly Gaussian zero mean random variables
X1 = X(t), X2 = X′(t)
are uncorrelated, hence independent, with variances
σ1² = RXX(0) and σ2² = RX′X′(0) = −R″XX(0) > 0
fX1X2(x1, x2) = fX1(x1) fX2(x2) = (1/(2πσ1σ2)) exp(−( x1²/(2σ1²) + x2²/(2σ2²) )).
To determine ρ, the upcrossing rate,
we argue as follows: in an interval (t, t + ∆t) the realization moves from X(t) = X1 to X(t + ∆t) = X(t) + X′(t)∆t = X1 + X2∆t, and hence the realization intersects the zero level (with positive slope) somewhere in that interval if
X1 < 0, X2 > 0, and X(t + ∆t) = X1 + X2∆t > 0,
i.e., X1 > −X2∆t. Hence the probability of an upcrossing in (t, t + ∆t) is given by
ρ∆t = ∫_{x2=0}^{∞} ∫_{x1=−x2∆t}^{0} fX1X2(x1, x2) dx1 dx2 = ∫_{0}^{∞} fX2(x2) dx2 ∫_{−x2∆t}^{0} fX1(x1) dx1. (6)
Differentiating both sides of (6) with respect to ∆t we get
ρ = ∫_{0}^{∞} fX2(x2) x2 fX1(−x2∆t) dx2
and letting ∆t → 0, we have
ρ = ∫_{0}^{∞} x2 fX2(x2) fX1(0) dx2 = (1/√(2πRXX(0))) ∫_{0}^{∞} x2 fX2(x2) dx2
= (1/√(2πRXX(0))) · (1/2)(σ2√(2/π)) = (1/2π) √(−R″XX(0)/RXX(0))
There is an equal probability for downcrossings, and hence the total probability of crossing the zero line in an interval (t, t + ∆t) equals ρ0∆t, where
ρ0 = (1/π) √(−R″XX(0)/RXX(0)) > 0.
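This rate can be sanity-checked in discrete time. A minimal sketch (the moving-average filter and step size are arbitrary assumptions; R″XX(0) is approximated by a second difference of the sample autocorrelation, and xcorr is in the Signal Processing Toolbox):

% Empirical zero-upcrossing rate vs. (1/2pi)*sqrt(-R''(0)/R(0))
dt = 1e-3; N = 1e6;
X = filter(ones(1,50)/50, 1, randn(1, N)); % a lowpass Gaussian process
X = X(1000:end);                           % discard the filter transient
rate = sum(X(1:end-1) < 0 & X(2:end) >= 0) / ((numel(X)-1)*dt)
[R, lags] = xcorr(X, 1, 'biased');         % R(0) and R at lags +/- dt
Rpp = (R(lags==1) - 2*R(lags==0) + R(lags==-1)) / dt^2;  % ~ R''(0)
rice = sqrt(-Rpp/R(lags==0)) / (2*pi)      % should be close to rate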
A discrete-time stochastic process Xn = X(nT) is a sequence of random variables. The mean, autocorrelation and autocovariance functions of a discrete-time process are given by
µn = E{X(nT)}
R(n1, n2) = E{X(n1T)X∗(n2T)}
C(n1, n2) = R(n1, n2) − µn1 µ∗n2
WSS:
E{X(nT)} = µ, a constant
E[X{(k + n)T} X∗{kT}] = R(n) = rn = r∗−n
The positive-definite property of the autocorrelation sequence can be expressed in terms of certain Hermitian-Toeplitz matrices as follows:
Theorem: A sequence {rn}, −∞ < n < ∞, forms an autocorrelation sequence of a wide-sense stationary stochastic process if and only if every Hermitian-Toeplitz matrix Tn given by
Tn = [ r0 r1 r2 · · · rn
       r∗1 r0 r1 · · · rn−1
       · · ·
       r∗n r∗n−1 · · · r∗1 r0 ] = T∗n
is non-negative (positive) definite for n = 0, 1, 2, · · ·
a∗Tna = Σ_{i=0}^{n} Σ_{k=0}^{n} ai a∗k rk−i, for every vector a
a∗Tna = Σ_{i=0}^{n} Σ_{k=0}^{n} ai a∗k E{X(kT)X∗(iT)} = E{ | Σ_{k=0}^{n} a∗k X(kT) |² } ≥ 0.
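The condition is easy to test numerically for a candidate sequence; a minimal sketch using the AR(1) autocorrelation rn = a^|n| derived later (the values of a and n are arbitrary):

% Toeplitz test of a candidate autocorrelation sequence r_n = a^|n|
a = 0.8; n = 10;
r = a.^(0:n);       % r_0, ..., r_n (real case)
T = toeplitz(r);    % Hermitian-Toeplitz matrix T_n
min(eig(T))         % >= 0 (up to round-off) for a valid sequence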
Systems and discrete WSS processes:
RXY(n) = RXX(n) ∗ h∗(−n)
RY Y(n) = RXY(n) ∗ h(n)
RY Y(n) = RXX(n) ∗ h∗(−n) ∗ h(n).
ARMA processes: input W(n), output X(n):
X(n) = −Σ_{k=1}^{p} ak X(n − k) + Σ_{k=0}^{q} bk W(n − k),
X(z) Σ_{k=0}^{p} ak z^{−k} = W(z) Σ_{k=0}^{q} bk z^{−k}, a0 ≡ 1
H(z) = Σ_{k=0}^{∞} h(k) z^{−k} = X(z)/W(z)
= (b0 + b1z^{−1} + b2z^{−2} + · · · + bq z^{−q}) / (1 + a1z^{−1} + a2z^{−2} + · · · + ap z^{−p}) = B(z)/A(z)
X(n) = Σ_{k=0}^{∞} h(n − k) W(k)
the transfer function H(z) is rational with p poles and q zeros that determine the model order of the underlying system. X(n) regresses on p of its own previous values while, at the same time, a moving average of the (q + 1) input values W(n), W(n − 1), · · · , W(n − q) is added to it, thus generating an autoregressive moving average ARMA(p, q) process X(n).
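Generating an ARMA realization is a one-liner with filter(); a minimal sketch with arbitrary illustrative coefficients (the denominator vector [1, a1, · · · , ap] follows the sign convention of A(z) above):

% Simulate an ARMA(2,1) process X(z)A(z) = W(z)B(z)
b = [1 0.4];            % B(z) = b0 + b1 z^-1            (q = 1)
a = [1 -0.6 0.08];      % A(z) = 1 + a1 z^-1 + a2 z^-2   (p = 2), poles at 0.4, 0.2
W = randn(1, 1e5);      % zero-mean, unit-variance white input
X = filter(b, a, W);    % ARMA(2, 1) realization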
Generally the input {W(n)} represents a sequence of uncorrelated random variables of zero mean and constant variance σW², so that
RWW(n) = σW² δ(n).
If in addition {W(n)} is normally distributed, then the output {X(n)} also represents a strict-sense stationary normal process.
If q = 0 we have an AR(p) (all-pole) process, and if p = 0, an MA(q) (all-zero) process.
AR(1) process: An AR(1) process has the form
X(n) = aX(n − 1) + W(n)
H(z) = 1/(1 − az^{−1}) = Σ_{n=0}^{∞} a^n z^{−n}
h(n) = a^n, |a| < 1
RXX(n) = σW² δ(n) ∗ h(−n) ∗ h(n) = σW² Σ_{k=0}^{∞} a^{|n|+k} a^k = σW² a^{|n|} / (1 − a²)
The normalized autocorrelation sequence:
ρX(n) = RXX(n)/RXX(0) = a^{|n|}, |n| ≥ 0. (7)
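Equation (7) is easy to confirm by simulation; a minimal sketch (a and the sample size are arbitrary, and xcorr is in the Signal Processing Toolbox):

% Sample autocorrelation of an AR(1) process vs. a^|n| from (7)
a = 0.7; N = 1e5;
X = filter(1, [1 -a], randn(1, N));   % X(n) = a X(n-1) + W(n)
[R, lags] = xcorr(X, 5, 'biased');    % sample autocorrelation, |n| <= 5
[R(lags >= 0)/R(lags == 0); a.^(0:5)] % estimated rho_X(n) vs. a^n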
It is instructive to compare the AR(1) model discussed above with the result of superimposing a random component on it, which may be an error term associated with observing a first-order AR process X(n). Thus
Y(n) = X(n) + V(n), X(n) ∼ AR(1), V(n) ∼ white noise (8)
RY Y(n) = RXX(n) + RV V(n) = RXX(n) + σV² δ(n) = σW² a^{|n|} / (1 − a²) + σV² δ(n)
ρY(n) = RY Y(n)/RY Y(0) = { 1, n = 0;  c a^{|n|}, n = ±1, ±2, · · · } (9)
c = σW² / (σW² + σV²(1 − a²)) < 1.
Eqs. (7) and (9) demonstrate the effect of superimposing an error sequence on an AR(1) model. For non-zero lags, the autocorrelation of the observed sequence {Y(n)} is reduced by a constant factor compared to that of the original process {X(n)}. From (8), a particular term of the superimposed error sequence V(n) only affects the corresponding term of Y(n) (term by term). However, a particular term of the "input sequence" W(n) affects X(n) and Y(n) as well as all subsequent observations.
AR(2) Process: An AR(2) process has the form (W(n) a zero-mean white process)
X(n) = a1X(n − 1) + a2X(n − 2) + W(n)
H(z) = Σ_{n=0}^{∞} h(n)z^{−n} = 1/(1 − a1z^{−1} − a2z^{−2}) = b1/(1 − λ1z^{−1}) + b2/(1 − λ2z^{−1})
h(0) = 1, h(1) = a1, h(n) = a1h(n − 1) + a2h(n − 2), n ≥ 2
h(n) = b1λ1^n + b2λ2^n, n ≥ 0
b1 + b2 = 1, b1λ1 + b2λ2 = a1,
λ1 + λ2 = a1, λ1λ2 = −a2.
Stability condition: |λ1| < 1, |λ2| < 1.
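The recursion and the closed form for h(n) can be cross-checked numerically; a minimal sketch (a1 and a2 are arbitrary values placing both poles inside the unit circle):

% AR(2) impulse response: recursion vs. closed form b1*lam1^n + b2*lam2^n
a1 = 1.0; a2 = -0.5;
lam = roots([1 -a1 -a2]);                        % lambda_1, lambda_2
h_rec = filter(1, [1 -a1 -a2], [1 zeros(1,9)]);  % h(0..9) via the recursion
b = [1 1; lam(1) lam(2)] \ [1; a1];              % b1 + b2 = 1, b1*lam1 + b2*lam2 = a1
h_cf = real(b(1)*lam(1).^(0:9) + b(2)*lam(2).^(0:9));
[h_rec; h_cf]                                    % the two rows should agree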
RXX(n) = E{X(n + m)X∗(m)}
= E{[a1X(n + m − 1) + a2X(n + m − 2)]X∗(m)} + E{W(n + m)X∗(m)}
= a1RXX(n − 1) + a2RXX(n − 2),
because W(n + m) and X(m) are uncorrelated for n ≥ 1, so E{W(n + m)X∗(m)} = 0. Hence
ρX(n) = RXX(n)/RXX(0) = a1ρX(n − 1) + a2ρX(n − 2).
RXX(n) = RWW(n) ∗ h∗(−n) ∗ h(n) = σW² h∗(−n) ∗ h(n) = σW² Σ_{k=0}^{∞} h∗(n + k) h(k)
= σW² [ |b1|²(λ∗1)^n/(1 − |λ1|²) + b∗1b2(λ∗1)^n/(1 − λ∗1λ2) + b1b∗2(λ∗2)^n/(1 − λ1λ∗2) + |b2|²(λ∗2)^n/(1 − |λ2|²) ]
The normalized autocorrelation function:
ρX(n) = RXX(n)/RXX(0) = c1(λ∗1)^n + c2(λ∗2)^n
where c1 and c2 are the appropriate constants.
An ARMA(p, q) system has only p + q + 1 independent coefficients, and hence its impulse response sequence {hk} must also exhibit a similar dependence among its terms. An old result due to Kronecker (1881) states that the necessary and sufficient condition for H(z) = Σ_{n=0}^{∞} h(n)z^{−n} to represent a rational (ARMA) system is that
det Hn = 0, n ≥ N (for all sufficiently large n), where
Hn = [ h0 h1 h2 · · · hn
       h1 h2 h3 · · · hn+1
       · · ·
       hn hn+1 hn+2 · · · h2n ]. (10)
In the case of rational systems, for all sufficiently large n the Hankel matrices Hn in (10) all have the same rank.
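Kronecker's condition can be seen numerically on the ARMA(2,1) example simulated earlier; a minimal sketch:

% det H_n ~ 0 for a rational (ARMA) impulse response
b = [1 0.4]; a = [1 -0.6 0.08];        % the ARMA(2,1) example above
h = filter(b, a, [1 zeros(1, 20)]);    % h(0), ..., h(20)
n = 3;                                 % any sufficiently large n
H = hankel(h(1:n+1), h(n+1:2*n+1));    % (n+1)x(n+1) Hankel matrix of (10)
det(H)                                 % ~ 0 up to round-off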
The necessary part easily follows from
Σ_{n=0}^{∞} h(n)z^{−n} = (Σ_{k=0}^{q} bk z^{−k}) / (Σ_{k=0}^{p} ak z^{−k})
by cross multiplying and equating coefficients of like powers of z^{−n} for n = 0, 1, 2, · · ·. We have
b0 = h0
b1 = h0a1 + h1
· · ·
bq = h0aq + h1aq−1 + · · · + hq
0 = h0aq+i + h1aq+i−1 + · · · + hq+i−1a1 + hq+i, i ≥ 1.
For systems with q ≤ p − 1, letting i = p − q, p − q + 1, · · · , 2p − q in the last equation we get
h0ap + h1ap−1 + · · · + hp−1a1 + hp = 0
· · ·
hpap + hp+1ap−1 + · · · + h2p−1a1 + h2p = 0
which gives det(Hp) = 0, and similarly for i = p − q + 1, · · · :
h0ap+1 + h1ap + · · · + hp+1 = 0
h1ap+1 + h2ap + · · · + hp+2 = 0
· · ·
hp+1ap+1 + hp+2ap + · · · + h2p+2 = 0,
In the ARMA(p, q) model, the input white noise process W(n) is uncorrelated with its own past sample values as well as with the past values of the system output. This gives
E{W(n)W∗(n − k)} = 0, k ≥ 1
E{W(n)X∗(n − k)} = 0, k ≥ 1.
ri = E{x(n)x∗(n − i)}
= −Σ_{k=1}^{p} ak E{x(n − k)x∗(n − i)} + Σ_{k=0}^{q} bk E{w(n − k)x∗(n − i)}
= −Σ_{k=1}^{p} ak ri−k + Σ_{k=0}^{q} bk E{w(n − k)x∗(n − i)}
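In particular, for i > q every term E{w(n − k)x∗(n − i)} vanishes, and ri satisfies the pure AR recursion ri = −Σ_{k=1}^{p} ak ri−k. A minimal numerical sketch on the ARMA(2,1) example used earlier (xcorr is in the Signal Processing Toolbox):

% Check r_i = -a1*r_{i-1} - a2*r_{i-2} for i > q on the ARMA(2,1) example
b = [1 0.4]; a = [1 -0.6 0.08];      % a(2), a(3) hold a1, a2 of A(z)
X = filter(b, a, randn(1, 1e6));
[R, lags] = xcorr(X, 6, 'biased');
r = R(lags >= 0);                    % r_0, ..., r_6 (r(1) = r_0)
[r(4), -a(2)*r(3) - a(3)*r(2)]       % r_3 vs. the recursion (i = 3 > q)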

More Related Content

What's hot

Stat 2153 Stochastic Process and Markov chain
Stat 2153 Stochastic Process and Markov chainStat 2153 Stochastic Process and Markov chain
Stat 2153 Stochastic Process and Markov chain
Khulna University
 
Numerical Analysis and Its application to Boundary Value Problems
Numerical Analysis and Its application to Boundary Value ProblemsNumerical Analysis and Its application to Boundary Value Problems
Numerical Analysis and Its application to Boundary Value Problems
Gobinda Debnath
 
Lesson 11: Markov Chains
Lesson 11: Markov ChainsLesson 11: Markov Chains
Lesson 11: Markov Chains
Matthew Leingang
 
Binomial probability distributions
Binomial probability distributions  Binomial probability distributions
Binomial probability distributions
Long Beach City College
 
Probability distribution
Probability distributionProbability distribution
Probability distribution
Rohit kumar
 
Least square method
Least square methodLeast square method
Least square methodSomya Bagai
 
Exponential probability distribution
Exponential probability distributionExponential probability distribution
Exponential probability distribution
Muhammad Bilal Tariq
 
21 monotone sequences x
21 monotone sequences x21 monotone sequences x
21 monotone sequences x
math266
 
Normal Distribution
Normal DistributionNormal Distribution
Normal Distribution
DataminingTools Inc
 
Binomial distribution
Binomial distributionBinomial distribution
Binomial distribution
Dr. Satyanarayan Pandey
 
Probability distribution
Probability distributionProbability distribution
Probability distribution
Manoj Bhambu
 
Geometric Distribution
Geometric DistributionGeometric Distribution
Geometric Distribution
Ratul Basak
 
Random Variables
Random VariablesRandom Variables
Random Variables
HAmindavarLectures
 
Binomial probability distributions ppt
Binomial probability distributions pptBinomial probability distributions ppt
Binomial probability distributions pptTayab Ali
 
Discrete probability distribution (complete)
Discrete probability distribution (complete)Discrete probability distribution (complete)
Discrete probability distribution (complete)
ISYousafzai
 
Normal probability distribution
Normal probability distributionNormal probability distribution
Normal probability distribution
Nadeem Uddin
 
Geometric Mean
Geometric MeanGeometric Mean
Geometric Mean
Sumit Kumar
 
Probability mass functions and probability density functions
Probability mass functions and probability density functionsProbability mass functions and probability density functions
Probability mass functions and probability density functionsAnkit Katiyar
 
Different kind of distance and Statistical Distance
Different kind of distance and Statistical DistanceDifferent kind of distance and Statistical Distance
Different kind of distance and Statistical Distance
Khulna University
 

What's hot (20)

Stat 2153 Stochastic Process and Markov chain
Stat 2153 Stochastic Process and Markov chainStat 2153 Stochastic Process and Markov chain
Stat 2153 Stochastic Process and Markov chain
 
Numerical Analysis and Its application to Boundary Value Problems
Numerical Analysis and Its application to Boundary Value ProblemsNumerical Analysis and Its application to Boundary Value Problems
Numerical Analysis and Its application to Boundary Value Problems
 
Lesson 11: Markov Chains
Lesson 11: Markov ChainsLesson 11: Markov Chains
Lesson 11: Markov Chains
 
Binomial probability distributions
Binomial probability distributions  Binomial probability distributions
Binomial probability distributions
 
Probability distribution
Probability distributionProbability distribution
Probability distribution
 
Least square method
Least square methodLeast square method
Least square method
 
Exponential probability distribution
Exponential probability distributionExponential probability distribution
Exponential probability distribution
 
21 monotone sequences x
21 monotone sequences x21 monotone sequences x
21 monotone sequences x
 
Normal Distribution
Normal DistributionNormal Distribution
Normal Distribution
 
Binomial distribution
Binomial distributionBinomial distribution
Binomial distribution
 
Numerical method
Numerical methodNumerical method
Numerical method
 
Probability distribution
Probability distributionProbability distribution
Probability distribution
 
Geometric Distribution
Geometric DistributionGeometric Distribution
Geometric Distribution
 
Random Variables
Random VariablesRandom Variables
Random Variables
 
Binomial probability distributions ppt
Binomial probability distributions pptBinomial probability distributions ppt
Binomial probability distributions ppt
 
Discrete probability distribution (complete)
Discrete probability distribution (complete)Discrete probability distribution (complete)
Discrete probability distribution (complete)
 
Normal probability distribution
Normal probability distributionNormal probability distribution
Normal probability distribution
 
Geometric Mean
Geometric MeanGeometric Mean
Geometric Mean
 
Probability mass functions and probability density functions
Probability mass functions and probability density functionsProbability mass functions and probability density functions
Probability mass functions and probability density functions
 
Different kind of distance and Statistical Distance
Different kind of distance and Statistical DistanceDifferent kind of distance and Statistical Distance
Different kind of distance and Statistical Distance
 

Similar to Stochastic Processes - part 1

Stochastic Processes - part 3
Stochastic Processes - part 3Stochastic Processes - part 3
Stochastic Processes - part 3
HAmindavarLectures
 
Stochastic Processes - part 2
Stochastic Processes - part 2Stochastic Processes - part 2
Stochastic Processes - part 2
HAmindavarLectures
 
Stochastic Processes - part 5
Stochastic Processes - part 5Stochastic Processes - part 5
Stochastic Processes - part 5
HAmindavarLectures
 
Stochastic Processes - part 4
Stochastic Processes - part 4Stochastic Processes - part 4
Stochastic Processes - part 4
HAmindavarLectures
 
research paper publication
research paper publicationresearch paper publication
research paper publication
samuu45sam
 
ch9.pdf
ch9.pdfch9.pdf
ch9.pdf
KavS14
 
Stochastic Processes - part 6
Stochastic Processes - part 6Stochastic Processes - part 6
Stochastic Processes - part 6
HAmindavarLectures
 
Scholastic process and explaination lectr14.ppt
Scholastic process and explaination lectr14.pptScholastic process and explaination lectr14.ppt
Scholastic process and explaination lectr14.ppt
sadafshahbaz7777
 
lecture about different methods and explainanation.ppt
lecture about different methods and explainanation.pptlecture about different methods and explainanation.ppt
lecture about different methods and explainanation.ppt
sadafshahbaz7777
 
160511 hasegawa lab_seminar
160511 hasegawa lab_seminar160511 hasegawa lab_seminar
160511 hasegawa lab_seminar
Tomohiro Koana
 
MCQMC 2020 talk: Importance Sampling for a Robust and Efficient Multilevel Mo...
MCQMC 2020 talk: Importance Sampling for a Robust and Efficient Multilevel Mo...MCQMC 2020 talk: Importance Sampling for a Robust and Efficient Multilevel Mo...
MCQMC 2020 talk: Importance Sampling for a Robust and Efficient Multilevel Mo...
Chiheb Ben Hammouda
 
On Twisted Paraproducts and some other Multilinear Singular Integrals
On Twisted Paraproducts and some other Multilinear Singular IntegralsOn Twisted Paraproducts and some other Multilinear Singular Integrals
On Twisted Paraproducts and some other Multilinear Singular Integrals
VjekoslavKovac1
 
Existance Theory for First Order Nonlinear Random Dfferential Equartion
Existance Theory for First Order Nonlinear Random Dfferential EquartionExistance Theory for First Order Nonlinear Random Dfferential Equartion
Existance Theory for First Order Nonlinear Random Dfferential Equartion
inventionjournals
 
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
The Statistical and Applied Mathematical Sciences Institute
 
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
The Statistical and Applied Mathematical Sciences Institute
 
Wide sense stationary process in digital communication
Wide sense stationary process in digital communicationWide sense stationary process in digital communication
Wide sense stationary process in digital communication
VitthalGavhane1
 
Seminar Talk: Multilevel Hybrid Split Step Implicit Tau-Leap for Stochastic R...
Seminar Talk: Multilevel Hybrid Split Step Implicit Tau-Leap for Stochastic R...Seminar Talk: Multilevel Hybrid Split Step Implicit Tau-Leap for Stochastic R...
Seminar Talk: Multilevel Hybrid Split Step Implicit Tau-Leap for Stochastic R...
Chiheb Ben Hammouda
 
The feedback-control-for-distributed-systems
The feedback-control-for-distributed-systemsThe feedback-control-for-distributed-systems
The feedback-control-for-distributed-systemsCemal Ardil
 
Geometric and viscosity solutions for the Cauchy problem of first order
Geometric and viscosity solutions for the Cauchy problem of first orderGeometric and viscosity solutions for the Cauchy problem of first order
Geometric and viscosity solutions for the Cauchy problem of first order
Juliho Castillo
 
Tensor train to solve stochastic PDEs
Tensor train to solve stochastic PDEsTensor train to solve stochastic PDEs
Tensor train to solve stochastic PDEs
Alexander Litvinenko
 

Similar to Stochastic Processes - part 1 (20)

Stochastic Processes - part 3
Stochastic Processes - part 3Stochastic Processes - part 3
Stochastic Processes - part 3
 
Stochastic Processes - part 2
Stochastic Processes - part 2Stochastic Processes - part 2
Stochastic Processes - part 2
 
Stochastic Processes - part 5
Stochastic Processes - part 5Stochastic Processes - part 5
Stochastic Processes - part 5
 
Stochastic Processes - part 4
Stochastic Processes - part 4Stochastic Processes - part 4
Stochastic Processes - part 4
 
research paper publication
research paper publicationresearch paper publication
research paper publication
 
ch9.pdf
ch9.pdfch9.pdf
ch9.pdf
 
Stochastic Processes - part 6
Stochastic Processes - part 6Stochastic Processes - part 6
Stochastic Processes - part 6
 
Scholastic process and explaination lectr14.ppt
Scholastic process and explaination lectr14.pptScholastic process and explaination lectr14.ppt
Scholastic process and explaination lectr14.ppt
 
lecture about different methods and explainanation.ppt
lecture about different methods and explainanation.pptlecture about different methods and explainanation.ppt
lecture about different methods and explainanation.ppt
 
160511 hasegawa lab_seminar
160511 hasegawa lab_seminar160511 hasegawa lab_seminar
160511 hasegawa lab_seminar
 
MCQMC 2020 talk: Importance Sampling for a Robust and Efficient Multilevel Mo...
MCQMC 2020 talk: Importance Sampling for a Robust and Efficient Multilevel Mo...MCQMC 2020 talk: Importance Sampling for a Robust and Efficient Multilevel Mo...
MCQMC 2020 talk: Importance Sampling for a Robust and Efficient Multilevel Mo...
 
On Twisted Paraproducts and some other Multilinear Singular Integrals
On Twisted Paraproducts and some other Multilinear Singular IntegralsOn Twisted Paraproducts and some other Multilinear Singular Integrals
On Twisted Paraproducts and some other Multilinear Singular Integrals
 
Existance Theory for First Order Nonlinear Random Dfferential Equartion
Existance Theory for First Order Nonlinear Random Dfferential EquartionExistance Theory for First Order Nonlinear Random Dfferential Equartion
Existance Theory for First Order Nonlinear Random Dfferential Equartion
 
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
 
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
 
Wide sense stationary process in digital communication
Wide sense stationary process in digital communicationWide sense stationary process in digital communication
Wide sense stationary process in digital communication
 
Seminar Talk: Multilevel Hybrid Split Step Implicit Tau-Leap for Stochastic R...
Seminar Talk: Multilevel Hybrid Split Step Implicit Tau-Leap for Stochastic R...Seminar Talk: Multilevel Hybrid Split Step Implicit Tau-Leap for Stochastic R...
Seminar Talk: Multilevel Hybrid Split Step Implicit Tau-Leap for Stochastic R...
 
The feedback-control-for-distributed-systems
The feedback-control-for-distributed-systemsThe feedback-control-for-distributed-systems
The feedback-control-for-distributed-systems
 
Geometric and viscosity solutions for the Cauchy problem of first order
Geometric and viscosity solutions for the Cauchy problem of first orderGeometric and viscosity solutions for the Cauchy problem of first order
Geometric and viscosity solutions for the Cauchy problem of first order
 
Tensor train to solve stochastic PDEs
Tensor train to solve stochastic PDEsTensor train to solve stochastic PDEs
Tensor train to solve stochastic PDEs
 

More from HAmindavarLectures

"Digital communications" undergarduate course lecture notes
"Digital communications" undergarduate course lecture notes"Digital communications" undergarduate course lecture notes
"Digital communications" undergarduate course lecture notes
HAmindavarLectures
 
Wavelet Signal Processing
Wavelet Signal ProcessingWavelet Signal Processing
Wavelet Signal Processing
HAmindavarLectures
 
Detection & Estimation Theory
Detection & Estimation TheoryDetection & Estimation Theory
Detection & Estimation Theory
HAmindavarLectures
 
Cyclo-stationary processes
Cyclo-stationary processesCyclo-stationary processes
Cyclo-stationary processes
HAmindavarLectures
 
Multivariate Gaussin, Rayleigh & Rician distributions
Multivariate Gaussin, Rayleigh & Rician distributionsMultivariate Gaussin, Rayleigh & Rician distributions
Multivariate Gaussin, Rayleigh & Rician distributions
HAmindavarLectures
 
Introduction to communication systems
Introduction to communication systemsIntroduction to communication systems
Introduction to communication systems
HAmindavarLectures
 
Advanced Communications Theory
Advanced Communications Theory Advanced Communications Theory
Advanced Communications Theory
HAmindavarLectures
 

More from HAmindavarLectures (7)

"Digital communications" undergarduate course lecture notes
"Digital communications" undergarduate course lecture notes"Digital communications" undergarduate course lecture notes
"Digital communications" undergarduate course lecture notes
 
Wavelet Signal Processing
Wavelet Signal ProcessingWavelet Signal Processing
Wavelet Signal Processing
 
Detection & Estimation Theory
Detection & Estimation TheoryDetection & Estimation Theory
Detection & Estimation Theory
 
Cyclo-stationary processes
Cyclo-stationary processesCyclo-stationary processes
Cyclo-stationary processes
 
Multivariate Gaussin, Rayleigh & Rician distributions
Multivariate Gaussin, Rayleigh & Rician distributionsMultivariate Gaussin, Rayleigh & Rician distributions
Multivariate Gaussin, Rayleigh & Rician distributions
 
Introduction to communication systems
Introduction to communication systemsIntroduction to communication systems
Introduction to communication systems
 
Advanced Communications Theory
Advanced Communications Theory Advanced Communications Theory
Advanced Communications Theory
 

Recently uploaded

Introduction to AI for Nonprofits with Tapp Network
Introduction to AI for Nonprofits with Tapp NetworkIntroduction to AI for Nonprofits with Tapp Network
Introduction to AI for Nonprofits with Tapp Network
TechSoup
 
Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...
Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...
Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...
National Information Standards Organization (NISO)
 
clinical examination of hip joint (1).pdf
clinical examination of hip joint (1).pdfclinical examination of hip joint (1).pdf
clinical examination of hip joint (1).pdf
Priyankaranawat4
 
Assignment_4_ArianaBusciglio Marvel(1).docx
Assignment_4_ArianaBusciglio Marvel(1).docxAssignment_4_ArianaBusciglio Marvel(1).docx
Assignment_4_ArianaBusciglio Marvel(1).docx
ArianaBusciglio
 
Top five deadliest dog breeds in America
Top five deadliest dog breeds in AmericaTop five deadliest dog breeds in America
Top five deadliest dog breeds in America
Bisnar Chase Personal Injury Attorneys
 
Aficamten in HCM (SEQUOIA HCM TRIAL 2024)
Aficamten in HCM (SEQUOIA HCM TRIAL 2024)Aficamten in HCM (SEQUOIA HCM TRIAL 2024)
Aficamten in HCM (SEQUOIA HCM TRIAL 2024)
Ashish Kohli
 
writing about opinions about Australia the movie
writing about opinions about Australia the moviewriting about opinions about Australia the movie
writing about opinions about Australia the movie
Nicholas Montgomery
 
PCOS corelations and management through Ayurveda.
PCOS corelations and management through Ayurveda.PCOS corelations and management through Ayurveda.
PCOS corelations and management through Ayurveda.
Dr. Shivangi Singh Parihar
 
The basics of sentences session 6pptx.pptx
The basics of sentences session 6pptx.pptxThe basics of sentences session 6pptx.pptx
The basics of sentences session 6pptx.pptx
heathfieldcps1
 
PIMS Job Advertisement 2024.pdf Islamabad
PIMS Job Advertisement 2024.pdf IslamabadPIMS Job Advertisement 2024.pdf Islamabad
PIMS Job Advertisement 2024.pdf Islamabad
AyyanKhan40
 
June 3, 2024 Anti-Semitism Letter Sent to MIT President Kornbluth and MIT Cor...
June 3, 2024 Anti-Semitism Letter Sent to MIT President Kornbluth and MIT Cor...June 3, 2024 Anti-Semitism Letter Sent to MIT President Kornbluth and MIT Cor...
June 3, 2024 Anti-Semitism Letter Sent to MIT President Kornbluth and MIT Cor...
Levi Shapiro
 
South African Journal of Science: Writing with integrity workshop (2024)
South African Journal of Science: Writing with integrity workshop (2024)South African Journal of Science: Writing with integrity workshop (2024)
South African Journal of Science: Writing with integrity workshop (2024)
Academy of Science of South Africa
 
DRUGS AND ITS classification slide share
DRUGS AND ITS classification slide shareDRUGS AND ITS classification slide share
DRUGS AND ITS classification slide share
taiba qazi
 
বাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdf
বাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdfবাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdf
বাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdf
eBook.com.bd (প্রয়োজনীয় বাংলা বই)
 
Lapbook sobre os Regimes Totalitários.pdf
Lapbook sobre os Regimes Totalitários.pdfLapbook sobre os Regimes Totalitários.pdf
Lapbook sobre os Regimes Totalitários.pdf
Jean Carlos Nunes Paixão
 
CACJapan - GROUP Presentation 1- Wk 4.pdf
CACJapan - GROUP Presentation 1- Wk 4.pdfCACJapan - GROUP Presentation 1- Wk 4.pdf
CACJapan - GROUP Presentation 1- Wk 4.pdf
camakaiclarkmusic
 
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...
Dr. Vinod Kumar Kanvaria
 
How to Fix the Import Error in the Odoo 17
How to Fix the Import Error in the Odoo 17How to Fix the Import Error in the Odoo 17
How to Fix the Import Error in the Odoo 17
Celine George
 
Azure Interview Questions and Answers PDF By ScholarHat
Azure Interview Questions and Answers PDF By ScholarHatAzure Interview Questions and Answers PDF By ScholarHat
Azure Interview Questions and Answers PDF By ScholarHat
Scholarhat
 
Film vocab for eal 3 students: Australia the movie
Film vocab for eal 3 students: Australia the movieFilm vocab for eal 3 students: Australia the movie
Film vocab for eal 3 students: Australia the movie
Nicholas Montgomery
 

Recently uploaded (20)

Introduction to AI for Nonprofits with Tapp Network
Introduction to AI for Nonprofits with Tapp NetworkIntroduction to AI for Nonprofits with Tapp Network
Introduction to AI for Nonprofits with Tapp Network
 
Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...
Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...
Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...
 
clinical examination of hip joint (1).pdf
clinical examination of hip joint (1).pdfclinical examination of hip joint (1).pdf
clinical examination of hip joint (1).pdf
 
Assignment_4_ArianaBusciglio Marvel(1).docx
Assignment_4_ArianaBusciglio Marvel(1).docxAssignment_4_ArianaBusciglio Marvel(1).docx
Assignment_4_ArianaBusciglio Marvel(1).docx
 
Top five deadliest dog breeds in America
Top five deadliest dog breeds in AmericaTop five deadliest dog breeds in America
Top five deadliest dog breeds in America
 
Aficamten in HCM (SEQUOIA HCM TRIAL 2024)
Aficamten in HCM (SEQUOIA HCM TRIAL 2024)Aficamten in HCM (SEQUOIA HCM TRIAL 2024)
Aficamten in HCM (SEQUOIA HCM TRIAL 2024)
 
writing about opinions about Australia the movie
writing about opinions about Australia the moviewriting about opinions about Australia the movie
writing about opinions about Australia the movie
 
PCOS corelations and management through Ayurveda.
PCOS corelations and management through Ayurveda.PCOS corelations and management through Ayurveda.
PCOS corelations and management through Ayurveda.
 
The basics of sentences session 6pptx.pptx
The basics of sentences session 6pptx.pptxThe basics of sentences session 6pptx.pptx
The basics of sentences session 6pptx.pptx
 
PIMS Job Advertisement 2024.pdf Islamabad
PIMS Job Advertisement 2024.pdf IslamabadPIMS Job Advertisement 2024.pdf Islamabad
PIMS Job Advertisement 2024.pdf Islamabad
 
June 3, 2024 Anti-Semitism Letter Sent to MIT President Kornbluth and MIT Cor...
June 3, 2024 Anti-Semitism Letter Sent to MIT President Kornbluth and MIT Cor...June 3, 2024 Anti-Semitism Letter Sent to MIT President Kornbluth and MIT Cor...
June 3, 2024 Anti-Semitism Letter Sent to MIT President Kornbluth and MIT Cor...
 
South African Journal of Science: Writing with integrity workshop (2024)
South African Journal of Science: Writing with integrity workshop (2024)South African Journal of Science: Writing with integrity workshop (2024)
South African Journal of Science: Writing with integrity workshop (2024)
 
DRUGS AND ITS classification slide share
DRUGS AND ITS classification slide shareDRUGS AND ITS classification slide share
DRUGS AND ITS classification slide share
 
বাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdf
বাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdfবাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdf
বাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdf
 
Lapbook sobre os Regimes Totalitários.pdf
Lapbook sobre os Regimes Totalitários.pdfLapbook sobre os Regimes Totalitários.pdf
Lapbook sobre os Regimes Totalitários.pdf
 
CACJapan - GROUP Presentation 1- Wk 4.pdf
CACJapan - GROUP Presentation 1- Wk 4.pdfCACJapan - GROUP Presentation 1- Wk 4.pdf
CACJapan - GROUP Presentation 1- Wk 4.pdf
 
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...
 
How to Fix the Import Error in the Odoo 17
How to Fix the Import Error in the Odoo 17How to Fix the Import Error in the Odoo 17
How to Fix the Import Error in the Odoo 17
 
Azure Interview Questions and Answers PDF By ScholarHat
Azure Interview Questions and Answers PDF By ScholarHatAzure Interview Questions and Answers PDF By ScholarHat
Azure Interview Questions and Answers PDF By ScholarHat
 
Film vocab for eal 3 students: Australia the movie
Film vocab for eal 3 students: Australia the movieFilm vocab for eal 3 students: Australia the movie
Film vocab for eal 3 students: Australia the movie
 

Stochastic Processes - part 1

  • 1. Let ξ denote the random outcome of an experiment. To ev- ery such outcome suppose a waveform X(t, ξ) is assigned. The collection of such waveforms form a stochastic process. The set of {ξk} and the time index t can be continuous or discrete (countably infinite or finite) as well. For fixed ξk ∈ S (the set of all experimental outcomes), X(t, ξ) is a specific time function. For fixed t, X1 = X(t1, ξi) is a random vari- able. AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.1/105
  • 2. The ensemble of all such realizations X(t, ξ) over time represents the stochastic process X(t). For example X(t) = a cos(ω0t + φ), φ ∼ U(0, 2π) Stochastic processes are everywhere: Brownian motion, stock market fluctuations, various queuing systems all represent stochastic phenomena. If X(t) is a stochastic process, then for fixed t, X(t) repre- sents a random variable. AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.2/105
  • 3. Its distribution function is given by FX(x, t) = P{X(t) ≤ x} Notice that FX(x, t) depends on t, since for a different t, we obtain a different random variable. Further fX(x, t) = dFX(x, t) dx AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.3/105
  • 4. FX(x1, x2, t1, t2) = P{X(t1) ≤ x1, X(t2) ≤ x2} fX(x1, x2, t1, t2) = ∂2 FX(x1, x2, t1, t2) ∂x1∂x2 represents the 2nd order density function of the process X(t). AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.4/105
  • 5. Similarly fX(x1, x2, · · · xn, t1, t2 · · · , tn)represents the nth or- der density function of the process X(t). Complete specifica- tion of the stochastic process X(t) requires the knowledge of fX(x1, x2, · · · xn, t1, t2 · · · , tn) for all ti, i = 1, 2, · · · , n and for all n, an almost impossible task in reality. AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.5/105
  • 6. µ(t) = E{X(t)} = Z +∞ −∞ xfX(x, t)dx In general, the mean of a process can depend on the time index t. RXX(t1, t2)=E{X(t1)X∗ (t2)}= Z Z x1x∗ 2fX(x1, x2, t1, t2)dx1dx2 It represents the interrelationship between the random vari- ables X1 = X(t1) and X2 = X(t2) generated from the pro- cess X(t). AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.6/105
  • 7. 1. RXX(t1, t2) = R∗ XX(t2, t1) = [E{X(t2)X∗ (t1)}]∗ 2. RXX(t, t) = E{|X(t)|2 } > 0. 3. n P i=1 n P j=1 aia∗ j RXX(ti, tj) ≥ 0. CXX(t1, t2) = RXX(t1, t2) − µX(t1)µ∗ X(t2) AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.7/105
  • 8. Example: z = Z T −T X(t)dt E[|z|2 ] = Z T −T Z T −T E{X(t1)X∗ (t2)}dt1dt2 = Z T −T Z T −T RXX(t1, t2)dt1dt2 AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.8/105
  • 9. Example: X(t) = a cos(ω0t + ϕ), ϕ ∼ U(0, 2π). µX(t) = E{X(t)} = aE{cos(ω0t + ϕ)} = a cos ω0tE{cos ϕ} − a sin ω0tE{sin ϕ} = 0, E{cos ϕ} = 1 2π Z 2π 0 cos ϕdϕ = 0 = E{sin ϕ}. RXX(t1, t2) = a2 E{cos(ω0t1 + ϕ) cos(ω0t2 + ϕ)} = a2 2 E{cos ω0(t1 − t2) + cos(ω0(t1 + t2) + 2ϕ)} = a2 2 cos ω0(t1 − t2). AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.9/105
  • 10. Stationary Stochastic Processes Stationary processes exhibit statistical properties that are invariant to shift in the time index. Thus, for example, second-order stationarity implies that the statistical proper- ties of the pairs {X(t1), X(t2)} and {X(t1 +c), X(t2 +c)} are the same for any c. Similarly first-order stationarity implies that the statistical properties of X(ti) and X(ti + c) are the same for any c. AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.10/105
  • 11. Strict-Sense Stationary (S.S.S) if fX(x1, · · · xn, t1, · · · , tn) ≡ fX(x1, · · · xn, t1 +c, · · · , tn +c) (1) for any c, where the left side represents the joint density function of the random variables AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.11/105
  • 12. X1 = X(t1), X2 = X(t2), · · · , Xn = X(tn) and the right side corresponds to the joint density function of the random variables X′ 1 = X(t1 + c), X′ 2 = X(t2 + c), · · · , X′ n = X(tn + c). A process X(t) is said to be strict-sense stationary if (1) is true for all ti, i = 1, 2, · · · , n, n = 1, 2, · · · , for any c. AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.12/105
  • 13. For a first-order strict sense stationary process, fX(x, t) ≡ fX(x, t + c) for c = −t, fX(x, t) = fX(x) (2) the first-order density of X(t) is independent of t. In that case E[X(t)] = Z +∞ −∞ xf(x)dx = µ, a constant. AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.13/105
  • 14. Similarly, for a 2nd-order strict-sense stationary process fX(x1, x2, t1, t2) ≡ fX(x1, x2, t1 + c, t2 + c) For c = −t2: fX(x1, x2, t1, t2) ≡ fX(x1, x2, t1 − t2) (3) AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.14/105
  • 15. the second order density function of a strict sense stationary process depends only on the difference of the time indices t2 − t2 = τ. In that case the autocorrelation function is given by E{X(t1)X∗ (t2)} = Z Z x1x∗ 2fX(x1, x2, τ = t1 − t2)dx1dx2 = RXX(t1 − t2) = RXX(τ) = R∗ XX(−τ), the E{X(t1)X∗ (t2)} of a 2nd SSS depends only on τ. AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.15/105
  • 16. The conditions for 1st and 2nd order SSS are in equations (2), (3), these are usually hard to verify. For this reason we resort to wide-sense stationarity WSS. AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.16/105
  • 17. WSS: 1. E{X(t)} = µ 2. E{X(t1)X∗ (t2)} = RXX(|t2 − t1|) SSS always implies WSS, but the converse is not true, but, the only exception are Gaussian processes. AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.17/105
  • 18. This follows, since if X(t) is a Gaussian process, then by definition X1 = X(t1), X2 = X(t2), · · · , Xn = X(tn) are jointly Gaussian random variables for any t1, t2 · · · , tn whose joint characteristic function is given by AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.18/105
  • 19. φX(ω1, ω2, · · · , ωn) = e j n P k=1 µ(tk)ωk−0.5 n P l=1 n P k=1 CXX (tl,tk)ωlωk CXX(ti, tk) is the element of covariance matrix, for WSS process CXX(ti, tk) = CXX(ti − tk) φX(ω1, ω2, · · · , ωn) = e j n P k=1 µωk−0.5 n P l=1 n P k=1 CXX (tl−tk)ωlωk (4) AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.19/105
  • 20. and hence if the set of time indices are shifted by a constant c to generate a new set of jointly Gaussian random variables X′ 2 = X(t2 + c), · · · ,X′ n = X(tn + c) then their joint characteristic function is identical to (4). Thus the set of random variables {Xi}n i=1 and {X′ i}n i=1 have the same JPDF for all n and all c, establishing the SSS of Gaussian processes from it WSS. For Gaussian processes: WSS⇒SSS AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.20/105
  • 21. Why GPs? 1. Easy prior assumption 2. Regression 3. Classification 4. Optimization 5. Unimodal AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.21/105
  • 22. N-dim Gaussian e−0.5(x−µ)H C−1(x−µ) p (2π)N |C| AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.22/105
  • 23. N-dim Gaussian C=[.5,-.15;-.15,.2]; mu=[1 -1]; [X1,X2] = meshgrid(linspace(-2,2,100)’, linspace(-2,2,100)’); X = [X1(:) X2(:)]; p = mvnpdf(X, mu, C); surf(X1,X2,reshape(p,100,100)); contour(X1,X2,reshape(p,100,100)); AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.23/105
  • 24. −2 −1 0 1 2 −2 −1.5 −1 −0.5 0 0.5 1 1.5 2 0 0.2 0.4 0.6 0.8 1 2D Gaussian PDF, surface level plot X 2 X1 PDF Figure 1: Multivariate Gaussian PDF. AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.24/105
  • 25. 2D Gaussian PDF, contour plot X1 X 2 −2 −1.5 −1 −0.5 0 0.5 1 1.5 2 −2 −1.5 −1 −0.5 0 0.5 1 1.5 2 Figure 2: Multivariate Gaussian PDF. AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.25/105
  • 26. C = V DV H , V = [V1 V2] C = 0.5 −.15 −0.15 0.2 V = 0.31625 −.9487 −0.9487 0.3162 D = 0.05 0 0 0.55 AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.26/105
  • 27. . . σ2 2 V2 σ 1 2 V 1 X1 X 2 −2 −1.5 −1 −0.5 0 0.5 1 1.5 2 −2 −1.5 −1 −0.5 0 0.5 1 1.5 2 Figure 3: Multivariate Gaussian PDF. AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.27/105
  • 28. For LTI sytems: Y (t) = Z +∞ −∞ h(t − τ)X(τ)dτ = Z +∞ −∞ h(τ)X(t − τ)dτ X(t) = Z +∞ −∞ X(τ)δ(t − τ)dτ AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.28/105
  • 29. Y (t) = L{X(t)} = L{ Z +∞ −∞ X(τ)δ(t − τ)dτ} = Z +∞ −∞ L{X(τ)δ(t − τ)dτ} = Z +∞ −∞ X(τ)L{δ(t − τ)}dτ = Z +∞ −∞ X(τ)h(t − τ)dτ = Z +∞ −∞ h(τ)X(t − τ)dτ AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.29/105
  • 30. Output statistics: µY (t) = E{Y (t)} = Z +∞ −∞ E{X(τ)h(t − τ)dτ} = Z +∞ −∞ µX(τ)h(t − τ)dτ = µX(t) ∗ h(t). AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.30/105
$$R_{XY}(t_1,t_2) = E\{X(t_1)Y^*(t_2)\} = E\left\{X(t_1)\int_{-\infty}^{+\infty} X^*(t_2-\alpha)h^*(\alpha)\,d\alpha\right\}$$
$$= \int_{-\infty}^{+\infty} E\{X(t_1)X^*(t_2-\alpha)\}\,h^*(\alpha)\,d\alpha = \int_{-\infty}^{+\infty} R_{XX}(t_1, t_2-\alpha)h^*(\alpha)\,d\alpha = R_{XX}(t_1,t_2) * h^*(t_2)$$
$$R_{YY}(t_1,t_2) = E\{Y(t_1)Y^*(t_2)\} = E\left\{\int_{-\infty}^{+\infty} X(t_1-\beta)h(\beta)\,d\beta\; Y^*(t_2)\right\}$$
$$= \int_{-\infty}^{+\infty} E\{X(t_1-\beta)Y^*(t_2)\}\,h(\beta)\,d\beta = \int_{-\infty}^{+\infty} R_{XY}(t_1-\beta, t_2)h(\beta)\,d\beta = R_{XY}(t_1,t_2) * h(t_1),$$
so that
$$R_{YY}(t_1,t_2) = R_{XX}(t_1,t_2) * h^*(t_2) * h(t_1).$$
If $X(t)$ is WSS:
$$\mu_Y(t) = \mu_X \int_{-\infty}^{+\infty} h(\tau)\,d\tau = \mu_X c, \ \text{a constant},$$
$$R_{XY}(t_1,t_2) = \int_{-\infty}^{+\infty} R_{XX}(t_1-t_2+\alpha)h^*(\alpha)\,d\alpha = R_{XX}(\tau) * h^*(-\tau) = R_{XY}(\tau), \qquad \tau = t_1 - t_2,$$
$$R_{YY}(t_1,t_2) = \int_{-\infty}^{+\infty} R_{XY}(t_1-\beta-t_2)h(\beta)\,d\beta = R_{XY}(\tau) * h(\tau) = R_{YY}(\tau),$$
$$R_{YY}(\tau) = R_{XX}(\tau) * h^*(-\tau) * h(\tau).$$
Example: if $X(t)$ is WSS and
$$z = \int_{-T}^{T} X(t)\,dt,$$
consider $Y(t) = X(t) * h(t)$ with the rectangular window $h(t) = u(t+T) - u(t-T)$. Then
$$R_{YY}(\tau) = R_{XX}(\tau) * h(\tau) * h(-\tau), \qquad h(\tau) * h(-\tau) = 2T - |\tau|, \quad |\tau| \le 2T.$$
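As a numerical check (a minimal sketch of mine, not part of the original slides; the record length N and window length M are arbitrary choices): pass discrete white noise through a rectangular window and verify that the output autocorrelation approaches the triangle predicted above.
  % Discrete-time check of R_YY = R_XX * h * h(-) for a rectangular window
  N = 1e6;                          % number of samples
  M = 20;                           % window length, playing the role of 2T
  w = randn(N,1);                   % unit-variance white input, R_XX(n) = delta(n)
  h = ones(M,1);                    % rectangular impulse response
  y = conv(w, h, 'same');           % filtered (smoothed) output
  maxlag = 2*M;
  r = zeros(2*maxlag+1,1);          % autocorrelation estimate of y
  for k = -maxlag:maxlag
      r(k+maxlag+1) = mean(y(1+abs(k):N).*y(1:N-abs(k)));
  end
  tri = max(M - abs(-maxlag:maxlag)', 0);   % theoretical triangle M - |n|
  plot(-maxlag:maxlag, [r tri]);            % the two curves should overlay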
Memoryless system $Y(t) = g\{X(t)\}$:
SSS input $\rightarrow$ SSS output.
WSS input $\rightarrow$ nothing can be said about the stationarity of the output in any sense.
Gaussian input $\rightarrow$ output is in general not Gaussian.
If $X(t)$ is a zero-mean stationary Gaussian process and $Y(t) = g\{X(t)\}$, where $g(\cdot)$ represents a nonlinear memoryless device, then
$$R_{XY}(\tau) = \eta R_{XX}(\tau), \qquad \eta = E\{g'(X)\}.$$
(This result is known as Bussgang's theorem.)
Proof:
$$R_{XY}(\tau) = E\{X(t)Y(t-\tau)\} = E[X(t)\,g\{X(t-\tau)\}] = \int\!\!\int x_1 g(x_2)\, f_{X_1X_2}(x_1,x_2)\,dx_1 dx_2, \quad (5)$$
where $X_1 = X(t)$ and $X_2 = X(t-\tau)$ are jointly Gaussian random variables with
$$f_{X_1X_2}(x_1,x_2) = \frac{1}{2\pi\sqrt{|A|}}\, e^{-\underline{x}^* A^{-1}\underline{x}/2}, \qquad \underline{X} = (X_1, X_2)^T, \ \underline{x} = (x_1, x_2)^T,$$
$$A = E\{\underline{X}\,\underline{X}^*\} = \begin{bmatrix} R_{XX}(0) & R_{XX}(\tau) \\ R_{XX}(\tau) & R_{XX}(0) \end{bmatrix} = LL^*,$$
where $L$ is an upper-triangular factor matrix with positive diagonal entries,
$$L = \begin{bmatrix} \ell_{11} & \ell_{12} \\ 0 & \ell_{22} \end{bmatrix}.$$
Consider the transformation
$$\underline{Z} = L^{-1}\underline{X} = (Z_1, Z_2)^T, \qquad \underline{z} = L^{-1}\underline{x} = (z_1, z_2)^T,$$
so that
$$E\{\underline{Z}\,\underline{Z}^*\} = L^{-1}E\{\underline{X}\,\underline{X}^*\}L^{*-1} = L^{-1}AL^{*-1} = I,$$
and hence $Z_1, Z_2$ are zero-mean independent Gaussian random variables. Also
$$\underline{x} = L\underline{z} \ \Rightarrow\ x_1 = \ell_{11}z_1 + \ell_{12}z_2, \quad x_2 = \ell_{22}z_2,$$
and hence
$$\underline{x}^* A^{-1}\underline{x} = \underline{z}^* L^* A^{-1} L \underline{z} = \underline{z}^*\underline{z} = z_1^2 + z_2^2;$$
the Jacobian of the transformation is $|J| = |L^{-1}| = |A|^{-1/2}$.
Substituting into (5),
$$R_{XY}(\tau) = \int_{-\infty}^{+\infty}\!\!\int_{-\infty}^{+\infty} (\ell_{11}z_1 + \ell_{12}z_2)\,g(\ell_{22}z_2)\,\frac{1}{|J|}\frac{1}{2\pi|A|^{1/2}}\, e^{-z_1^2/2} e^{-z_2^2/2}\,dz_1 dz_2$$
$$= \ell_{11}\int_{-\infty}^{+\infty} z_1 f_{z_1}(z_1)\,dz_1 \int_{-\infty}^{+\infty} g(\ell_{22}z_2) f_{z_2}(z_2)\,dz_2 \;+\; \ell_{12}\int_{-\infty}^{+\infty} z_2\, g(\ell_{22}z_2)\underbrace{f_{z_2}(z_2)}_{\frac{1}{\sqrt{2\pi}}e^{-z_2^2/2}}\,dz_2.$$
The first term vanishes since $E\{Z_1\} = 0$. With the substitution $u = \ell_{22}z_2$,
$$R_{XY}(\tau) = \frac{\ell_{12}}{\ell_{22}^2}\int_{-\infty}^{+\infty} u\,g(u)\,\frac{1}{\sqrt{2\pi}}\, e^{-u^2/2\ell_{22}^2}\,du = \ell_{12}\ell_{22}\int_{-\infty}^{+\infty} g(u)\underbrace{\frac{u}{\ell_{22}^2}\overbrace{\frac{1}{\sqrt{2\pi\ell_{22}^2}}\, e^{-u^2/2\ell_{22}^2}}^{f_u(u)}}_{-\frac{df_u(u)}{du}\,=\,-f_u'(u)}\,du$$
$$= -R_{XX}(\tau)\int_{-\infty}^{+\infty} g(u)f_u'(u)\,du,$$
since $A = LL^*$ gives $\ell_{12}\ell_{22} = R_{XX}(\tau)$. Hence,
integrating by parts,
$$R_{XY}(\tau) = R_{XX}(\tau)\left\{-g(u)f_u(u)\Big|_{-\infty}^{+\infty} + \int_{-\infty}^{+\infty} g'(u)f_u(u)\,du\right\} = R_{XX}(\tau)\,E\{g'(X)\} = \eta R_{XX}(\tau),$$
which is the desired result, with $\eta = E\{g'(X)\}$. Thus if the input to a memoryless device is stationary Gaussian, the cross-correlation function between the input and the output is proportional to the input autocorrelation function.
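A quick numerical sanity check of this result (my own illustrative sketch; the AR(1) model and the cubic nonlinearity are arbitrary choices): for $g(x) = x^3$ we have $\eta = E\{g'(X)\} = 3\sigma_X^2$, and the measured $R_{XY}(\tau)$ should track $\eta R_{XX}(\tau)$.
  % Verify R_XY(tau) = E{g'(X)} R_XX(tau) for g(x) = x^3
  N = 1e6;  a = 0.9;
  x = filter(1, [1 -a], randn(N,1));     % zero-mean stationary Gaussian AR(1)
  y = x.^3;                              % memoryless nonlinearity g(x) = x^3
  eta = 3*mean(x.^2);                    % eta = E{g'(X)} = 3*sigma_X^2
  maxlag = 30;
  rxx = zeros(1,maxlag+1);  rxy = zeros(1,maxlag+1);
  for k = 0:maxlag
      rxx(k+1) = mean(x(1+k:N).*x(1:N-k));   % R_XX(k) estimate
      rxy(k+1) = mean(x(1+k:N).*y(1:N-k));   % R_XY(k) = E{X(t)Y(t-k)} estimate
  end
  plot(0:maxlag, rxy, 0:maxlag, eta*rxx, '--');  % the two curves should coincide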
For an LTI system $h(t)$:
WSS input $\rightarrow$ WSS output.
SSS input $\rightarrow$ SSS output.
Gaussian input $\rightarrow$ Gaussian output.
White noise: $W(t)$ is said to be a white noise process if
$$R_{WW}(t_1,t_2) = q(t_1)\,\delta(t_1 - t_2).$$
$W(t)$ is said to be WSS white noise if $E\{W(t)\} =$ constant and
$$R_{WW}(t_1,t_2) = q\,\delta(t_1 - t_2) = q\,\delta(\tau).$$
If $W(t)$ is also a Gaussian process (a white Gaussian process), then all of its samples are independent random variables. (Why? Samples at distinct times are uncorrelated, and uncorrelated jointly Gaussian random variables are independent.)
White noise $W(t)$ $\rightarrow$ LTI system $h(t)$ $\rightarrow$ colored noise $N(t) = h(t) * W(t)$.
$$E[N(t)] = \mu_W \int_{-\infty}^{+\infty} h(\tau)\,d\tau, \ \text{a constant},$$
$$R_{NN}(\tau) = q\,\delta(\tau) * h^*(-\tau) * h(\tau) = q\,h^*(-\tau) * h(\tau) = q\,\rho(\tau),$$
$$\rho(\tau) = h(\tau) * h^*(-\tau) = \int_{-\infty}^{+\infty} h(\alpha)h^*(\alpha+\tau)\,d\alpha.$$
Thus the output of an LTI system driven by white noise is a (colored) noise process. Note: white noise need not be Gaussian. "White" and "Gaussian" are two different concepts!
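To underline this note, here is a small sketch of mine (with arbitrary parameters): a random $\pm 1$ sequence is white — its autocorrelation is a delta — yet its two-point distribution is as far from Gaussian as possible.
  % A white but decidedly non-Gaussian process: random +/-1 noise
  N = 1e5;
  w = sign(randn(N,1));                   % equiprobable +1/-1 samples, E{w} = 0
  r = zeros(1,11);
  for k = 0:10
      r(k+1) = mean(w(1+k:N).*w(1:N-k));  % autocorrelation estimate
  end
  disp(r)                                 % approx [1 0 0 ...]: delta => white
  disp([mean(w==1) mean(w==-1)])          % all mass at -1 and +1: not Gaussian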
[Figure 4: A realization of $X(t)$ versus $t$, with the upcrossings (♦) and downcrossings of the zero level marked.]
Upcrossings and downcrossings of a stationary Gaussian process: consider a zero-mean stationary Gaussian process $X(t)$ with autocorrelation function $R_{XX}(\tau)$. An upcrossing over the mean value occurs whenever the realization $X(t)$ passes through zero with positive slope. Let $\rho\,\Delta t$ represent the probability of such an upcrossing in the interval $(t, t+\Delta t)$. We wish to determine $\rho$.
Since $X(t)$ is a stationary Gaussian process, its derivative process $X'(t)$ is also zero-mean stationary Gaussian. With
$$R_{XX}(\tau) = \int_{-\infty}^{\infty} S_{XX}(\omega)\, e^{j\omega\tau}\,\frac{d\omega}{2\pi}, \qquad R''_{XX}(\tau) = \int_{-\infty}^{\infty} (j\omega)^2 S_{XX}(\omega)\, e^{j\omega\tau}\,\frac{d\omega}{2\pi},$$
$$S_{X'X'}(\omega) = |j\omega|^2 S_{XX}(\omega) \ \Rightarrow\ R_{X'X'}(\tau) = -R''_{XX}(\tau).$$
Further, $X(t)$ and $X'(t)$ are jointly Gaussian stationary processes, and since
$$R_{XX'}(\tau) = -\frac{dR_{XX}(\tau)}{d\tau}, \qquad R_{XX'}(-\tau) = -\frac{dR_{XX}(-\tau)}{d(-\tau)} = \frac{dR_{XX}(\tau)}{d\tau} = -R_{XX'}(\tau),$$
which for $\tau = 0$ gives
$$R_{XX'}(0) = 0 \ \Rightarrow\ E[X(t)X'(t)] = 0.$$
Thus the jointly Gaussian zero-mean random variables $X_1 = X(t)$, $X_2 = X'(t)$ are uncorrelated, hence independent, with variances $\sigma_1^2 = R_{XX}(0)$ and $\sigma_2^2 = R_{X'X'}(0) = -R''_{XX}(0) > 0$, so that
$$f_{X_1X_2}(x_1,x_2) = f_{X_1}(x_1)f_{X_2}(x_2) = \frac{1}{2\pi\sigma_1\sigma_2}\, e^{-\left(\frac{x_1^2}{2\sigma_1^2} + \frac{x_2^2}{2\sigma_2^2}\right)}.$$
To determine $\rho$, the upcrossing rate, we argue as follows.
In an interval $(t, t+\Delta t)$ the realization moves from $X(t) = X_1$ to $X(t+\Delta t) = X(t) + X'(t)\Delta t = X_1 + X_2\Delta t$, and hence an upcrossing of the zero level occurs somewhere in that interval if
$$X_1 < 0, \quad X_2 > 0, \quad X(t+\Delta t) = X_1 + X_2\Delta t > 0, \ \text{i.e.,}\ X_1 > -X_2\Delta t.$$
Hence the probability of an upcrossing in $(t, t+\Delta t)$ is given by
$$\rho\,\Delta t = \int_{x_2=0}^{\infty}\int_{x_1=-x_2\Delta t}^{0} f_{X_1X_2}(x_1,x_2)\,dx_1 dx_2 = \int_{0}^{\infty} f_{X_2}(x_2)\left(\int_{-x_2\Delta t}^{0} f_{X_1}(x_1)\,dx_1\right)dx_2. \quad (6)$$
Differentiating both sides of (6) with respect to $\Delta t$ we get
$$\rho = \int_{0}^{\infty} f_{X_2}(x_2)\,x_2\, f_{X_1}(-x_2\Delta t)\,dx_2,$$
and letting $\Delta t \to 0$, we have
$$\rho = \int_{0}^{\infty} x_2 f_{X_2}(x_2) f_{X_1}(0)\,dx_2 = \frac{1}{\sqrt{2\pi R_{XX}(0)}}\int_{0}^{\infty} x_2 f_{X_2}(x_2)\,dx_2 = \frac{1}{\sqrt{2\pi R_{XX}(0)}}\cdot\frac{1}{2}\left(\sigma_2\sqrt{2/\pi}\right) = \frac{1}{2\pi}\sqrt{\frac{-R''_{XX}(0)}{R_{XX}(0)}}.$$
There is an equal probability of downcrossings, and hence the total probability of crossing the zero level in an interval $(t, t+\Delta t)$ equals $\rho_0\,\Delta t$, where
$$\rho_0 = \frac{1}{\pi}\sqrt{-R''_{XX}(0)/R_{XX}(0)} > 0.$$
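The zero-crossing rate formula lends itself to a simple Monte Carlo check (an illustrative sketch of mine; the flat band-limited spectrum and all numbers are arbitrary choices). For Gaussian noise with a flat spectrum over $[0, W]$ rad/s, $R_{XX}(\tau) \propto \sin(W\tau)/(W\tau)$, so $-R''_{XX}(0)/R_{XX}(0) = W^2/3$ and $\rho_0 = W/(\pi\sqrt{3})$. The process is approximated below by a sum of random sinusoids with Gaussian amplitudes.
  % Monte Carlo check of the zero-crossing rate for band-limited Gaussian noise
  W  = 2*pi*5;  dt = 1e-3;  t = (0:dt:100)';   % 5 Hz band edge, fine time grid
  K  = 200;
  om = W*rand(K,1);  ph = 2*pi*rand(K,1);  a = randn(K,1)/sqrt(K);
  x  = zeros(size(t));
  for k = 1:K
      x = x + a(k)*cos(om(k)*t + ph(k));       % approx Gaussian, R ~ sin(W tau)/(W tau)
  end
  rate_sim = sum(x(1:end-1).*x(2:end) < 0)/t(end)  % observed crossings per second
  rate_th  = W/(pi*sqrt(3))                        % rho_0 for this spectrum
Longer records and more spectral lines tighten the agreement.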
A discrete-time stochastic process $X_n = X(nT)$ is a sequence of random variables. The mean, autocorrelation and autocovariance functions of a discrete-time process are given by
$$\mu_n = E\{X(nT)\},$$
$$R(n_1, n_2) = E\{X(n_1 T)X^*(n_2 T)\},$$
$$C(n_1, n_2) = R(n_1, n_2) - \mu_{n_1}\mu^*_{n_2}.$$
WSS:
$$E\{X(nT)\} = \mu, \ \text{a constant},$$
$$E[X\{(k+n)T\}X^*\{kT\}] = R(n) = r_n = r^*_{-n}.$$
The positive-definite property of the autocorrelation sequence can be expressed in terms of certain Hermitian-Toeplitz matrices as follows.
Theorem: a sequence $\{r_n\}_{-\infty}^{\infty}$ forms an autocorrelation sequence of a wide-sense stationary stochastic process if and only if every Hermitian-Toeplitz matrix $T_n$ given by
$$T_n = \begin{bmatrix} r_0 & r_1 & r_2 & \cdots & r_n \\ r_1^* & r_0 & r_1 & \cdots & r_{n-1} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ r_n^* & r_{n-1}^* & \cdots & r_1^* & r_0 \end{bmatrix} = T_n^*$$
is non-negative (positive) definite for $n = 0, 1, 2, \ldots$
Indeed, for every vector $\underline{a} = (a_0, a_1, \ldots, a_n)^T$,
$$\underline{a}^* T_n \underline{a} = \sum_{i=0}^{n}\sum_{k=0}^{n} a_i a_k^*\, r_{k-i} = \sum_{i=0}^{n}\sum_{k=0}^{n} a_i a_k^*\, E\{X(kT)X^*(iT)\} = E\left\{\left|\sum_{k=0}^{n} a_k^* X(kT)\right|^2\right\} \ge 0.$$
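A small numerical illustration (my own sketch; the AR(1) data and the matrix size are arbitrary): the biased sample autocorrelation of any data record yields a Toeplitz matrix with no negative eigenvalues, in line with the theorem.
  % The biased autocorrelation estimate yields a non-negative definite Toeplitz matrix
  N = 5000;
  x = filter(1, [1 -0.8], randn(N,1));      % sample data: an AR(1) record
  m = 10;  r = zeros(m+1,1);
  for k = 0:m
      r(k+1) = sum(x(1+k:N).*x(1:N-k))/N;   % biased estimate of r_k
  end
  Tn = toeplitz(r);                         % Toeplitz matrix T_m (real data here)
  min(eig(Tn))                              % >= 0, up to round-off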
Systems and discrete WSS processes:
$$R_{XY}(n) = R_{XX}(n) * h^*(-n),$$
$$R_{YY}(n) = R_{XY}(n) * h(n),$$
$$R_{YY}(n) = R_{XX}(n) * h^*(-n) * h(n).$$
ARMA processes, with input $W(n)$ and output $X(n)$:
$$X(n) = -\sum_{k=1}^{p} a_k X(n-k) + \sum_{k=0}^{q} b_k W(n-k),$$
$$X(z)\sum_{k=0}^{p} a_k z^{-k} = W(z)\sum_{k=0}^{q} b_k z^{-k}, \qquad a_0 \equiv 1,$$
$$H(z) = \sum_{k=0}^{\infty} h(k) z^{-k} = \frac{X(z)}{W(z)} = \frac{b_0 + b_1 z^{-1} + b_2 z^{-2} + \cdots + b_q z^{-q}}{1 + a_1 z^{-1} + a_2 z^{-2} + \cdots + a_p z^{-p}} = \frac{B(z)}{A(z)},$$
$$X(n) = \sum_{k=0}^{\infty} h(n-k)W(k).$$
The transfer function $H(z)$ is rational, with $p$ poles and $q$ zeros that determine the model order of the underlying system. $X(n)$ undergoes regression over $p$ of its previous values, and at the same time a moving average of the input over the $(q+1)$ values $W(n), W(n-1), \ldots, W(n-q)$ is added to it, thus generating an autoregressive moving average, ARMA$(p,q)$, process $X(n)$.
Generally the input $\{W(n)\}$ represents a sequence of uncorrelated random variables of zero mean and constant variance $\sigma_W^2$, so that $R_{WW}(n) = \sigma_W^2\,\delta(n)$. If, in addition, $\{W(n)\}$ is normally distributed, then the output $\{X(n)\}$ also represents a strict-sense stationary normal process. If $q = 0$ we have an AR$(p)$ process (all-pole), and if $p = 0$, an MA$(q)$ process (all-zero); a simulation sketch follows below.
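In MATLAB an ARMA record is generated directly with filter (a minimal sketch; the ARMA(2,1) coefficients below are arbitrary illustrative values):
  % Generate an ARMA(2,1) record driven by unit-variance white Gaussian noise
  N = 1e4;
  b = [1 0.5];              % numerator B(z) = 1 + 0.5 z^{-1}
  a = [1 -0.9 0.2];         % denominator A(z) = 1 - 0.9 z^{-1} + 0.2 z^{-2} (stable)
  w = randn(N,1);           % white Gaussian input, sigma_W^2 = 1
  x = filter(b, a, w);      % filter implements A(z)X(z) = B(z)W(z)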
AR(1) process: an AR(1) process has the form
$$X(n) = aX(n-1) + W(n),$$
$$H(z) = \frac{1}{1 - az^{-1}} = \sum_{n=0}^{\infty} a^n z^{-n}, \qquad h(n) = a^n, \quad |a| < 1.$$
$$R_{XX}(n) = \sigma_W^2\,\delta(n) * \{a^{-n}\} * \{a^n\} = \sigma_W^2 \sum_{k=0}^{\infty} a^{|n|+k}a^{k} = \frac{\sigma_W^2\, a^{|n|}}{1 - a^2}.$$
The normalized autocorrelation sequence:
$$\rho_X(n) = \frac{R_{XX}(n)}{R_{XX}(0)} = a^{|n|}, \qquad |n| \ge 0. \quad (7)$$
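Continuing the filter-based sketch above (again with an arbitrary $a$), the sample autocorrelation of a simulated AR(1) record can be compared against (7):
  % Compare the sample autocorrelation of an AR(1) record with rho(n) = a^|n|
  N = 1e6;  a = 0.7;
  x = filter(1, [1 -a], randn(N,1));
  m = 20;  r = zeros(m+1,1);
  for k = 0:m
      r(k+1) = mean(x(1+k:N).*x(1:N-k));
  end
  stem(0:m, r/r(1)); hold on; plot(0:m, a.^(0:m), 'r');  % estimate vs a^n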
It is instructive to modify the AR(1) model discussed above by superimposing a random component on it, which may be interpreted as an error term associated with observing a first-order AR process $X(n)$. Thus
$$Y(n) = X(n) + V(n), \qquad X(n) \sim \text{AR}(1), \quad V(n) \sim \text{white noise}. \quad (8)$$
With $V(n)$ uncorrelated with $X(n)$,
$$R_{YY}(n) = R_{XX}(n) + R_{VV}(n) = R_{XX}(n) + \sigma_V^2\,\delta(n) = \frac{\sigma_W^2\, a^{|n|}}{1 - a^2} + \sigma_V^2\,\delta(n).$$
$$\rho_Y(n) = \frac{R_{YY}(n)}{R_{YY}(0)} = \begin{cases} 1, & n = 0 \\ c\,a^{|n|}, & n = \pm 1, \pm 2, \cdots \end{cases} \quad (9)$$
$$c = \frac{\sigma_W^2}{\sigma_W^2 + \sigma_V^2(1 - a^2)} < 1.$$
Eqs. (7) and (9) demonstrate the effect of superimposing an error sequence on an AR(1) model. For non-zero lags, the autocorrelation of the observed sequence $\{Y(n)\}$ is reduced by the constant factor $c$ compared to that of the original process $\{X(n)\}$. From (8), the superimposed error sequence $V(n)$ only affects the corresponding term in $Y(n)$ (term by term), whereas a particular term in the "input sequence" $W(n)$ affects $X(n)$ and $Y(n)$ as well as all subsequent observations. A numeric illustration follows.
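For concreteness (a worked example with numbers of my own choosing): with $a = 0.8$, $\sigma_W^2 = 1$ and $\sigma_V^2 = 0.5$, the attenuation factor is $c = 1/(1 + 0.5\times 0.36) \approx 0.847$, which a short simulation reproduces:
  % Observation noise scales the AR(1) autocorrelation by c at non-zero lags
  N = 1e6;  a = 0.8;  sv = sqrt(0.5);
  x = filter(1, [1 -a], randn(N,1));     % AR(1) process, sigma_W^2 = 1
  y = x + sv*randn(N,1);                 % noisy observations, Eq. (8)
  r1 = mean(y(2:N).*y(1:N-1));           % R_YY(1) estimate
  r0 = mean(y.^2);                       % R_YY(0) estimate
  rho1 = r1/r0                           % compare with c*a ~ 0.847*0.8 = 0.678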
AR(2) process: an AR(2) process has the form
$$X(n) = a_1 X(n-1) + a_2 X(n-2) + W(n),$$
where $W(n)$ is a zero-mean white process, and
$$H(z) = \sum_{n=0}^{\infty} h(n) z^{-n} = \frac{1}{1 - a_1 z^{-1} - a_2 z^{-2}} = \frac{b_1}{1 - \lambda_1 z^{-1}} + \frac{b_2}{1 - \lambda_2 z^{-1}},$$
$$h(0) = 1, \quad h(1) = a_1, \quad h(n) = a_1 h(n-1) + a_2 h(n-2), \ n \ge 2.$$
$$h(n) = b_1\lambda_1^n + b_2\lambda_2^n, \ n \ge 0, \qquad b_1 + b_2 = 1, \quad b_1\lambda_1 + b_2\lambda_2 = a_1,$$
$$\lambda_1 + \lambda_2 = a_1, \qquad \lambda_1\lambda_2 = -a_2.$$
Stability condition: $|\lambda_1| < 1$, $|\lambda_2| < 1$.
$$R_{XX}(n) = E\{X(n+m)X^*(m)\} = E\{[a_1 X(n+m-1) + a_2 X(n+m-2)]X^*(m)\} + E\{W(n+m)X^*(m)\}$$
$$= a_1 R_{XX}(n-1) + a_2 R_{XX}(n-2), \qquad n \ge 1,$$
because, for $n \ge 1$, $W(n+m)$ is uncorrelated with the past output $X(m)$, so $E\{W(n+m)X^*(m)\} = 0$. Dividing by $R_{XX}(0)$,
$$\rho_X(n) = \frac{R_{XX}(n)}{R_{XX}(0)} = a_1\rho_X(n-1) + a_2\rho_X(n-2).$$
  • 95. RXX(n) = RWW (n) ∗ h∗ (−n) ∗ h(n) = σ2 W h∗ (−n) ∗ h(n) = σ2 W ∞ X k=0 h∗ (n + k)h(k) = σ2 W |b1|2 (λ∗ 1)n 1 − |λ1|2 + b∗ 1b2(λ∗ 1)n 1 − λ∗ 1λ2 + b1b∗ 2(λ∗ 2)n 1 − λ1λ∗ 2 + |b2|2 (λ∗ 2)n 1 − |λ2|2 AKU-EE/Stochastic/HA, 1st Semester, 85-86 – p.85/105
The normalized autocorrelation function:
$$\rho_X(n) = \frac{R_{XX}(n)}{R_{XX}(0)} = c_1\lambda_1^{*n} + c_2\lambda_2^{*n},$$
where $c_1$ and $c_2$ are the appropriate constants.
An ARMA$(p,q)$ system has only $p + q + 1$ independent coefficients, and hence its impulse response sequence $\{h_k\}$ must exhibit a corresponding dependence among its terms. An old result due to Kronecker (1881) states that the necessary and sufficient condition for $H(z) = \sum_{k=0}^{\infty} h_k z^{-k}$ to represent a rational (ARMA) system is that
$$\det H_n = 0, \quad n \ge N \ \text{(for all sufficiently large } n\text{)}, \ \text{where}$$
$$H_n = \begin{bmatrix} h_0 & h_1 & h_2 & \cdots & h_n \\ h_1 & h_2 & h_3 & \cdots & h_{n+1} \\ \vdots & & & & \vdots \\ h_n & h_{n+1} & h_{n+2} & \cdots & h_{2n} \end{bmatrix}. \quad (10)$$
In the case of rational systems, for all sufficiently large $n$ the Hankel matrices $H_n$ in (10) all have the same rank, as illustrated below.
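This rank behavior is easy to observe numerically (an illustrative sketch of mine; the ARMA(2,1) coefficients are arbitrary): the Hankel matrices built from the impulse response of a rational system have rank equal to the system order, here 2.
  % Hankel matrices of a rational impulse response have bounded rank
  b = [1 0.5];  a = [1 -0.9 0.2];            % an ARMA(2,1) system
  h = filter(b, a, [1 zeros(1,20)]);         % impulse response h_0 ... h_20
  for n = 1:9
      Hn = hankel(h(1:n+1), h(n+1:2*n+1));   % (n+1)x(n+1) Hankel matrix of (10)
      fprintf('n = %d, rank(H_n) = %d\n', n, rank(Hn));
  end
  % prints rank 2 for every n, i.e. det(H_n) = 0 for n >= 2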
The necessary part easily follows from
$$\sum_{k=0}^{\infty} h_k z^{-k} = \frac{\sum_{k=0}^{q} b_k z^{-k}}{\sum_{k=0}^{p} a_k z^{-k}}$$
by cross-multiplying and equating coefficients of like powers of $z^{-n}$ for $n = 0, 1, 2, \ldots$. We have
$$b_0 = h_0$$
$$b_1 = h_0 a_1 + h_1$$
$$\vdots$$
$$b_q = h_0 a_q + h_1 a_{q-1} + \cdots + h_q$$
$$0 = h_0 a_{q+i} + h_1 a_{q+i-1} + \cdots + h_{q+i-1}a_1 + h_{q+i}, \quad i \ge 1.$$
For systems with $q \le p-1$, letting $i = p-q, p-q+1, \ldots, 2p-q$ in the last equation we get
$$h_0 a_p + h_1 a_{p-1} + \cdots + h_{p-1}a_1 + h_p = 0$$
$$\vdots$$
$$h_p a_p + h_{p+1} a_{p-1} + \cdots + h_{2p-1}a_1 + h_{2p} = 0,$$
which gives $\det H_p = 0$, and similarly for $i = p - q + 1, \ldots\,$:
$$h_0 a_{p+1} + h_1 a_p + \cdots + h_{p+1} = 0$$
$$h_1 a_{p+1} + h_2 a_p + \cdots + h_{p+2} = 0$$
$$\vdots$$
$$h_{p+1}a_{p+1} + h_{p+2}a_p + \cdots + h_{2p+2} = 0. \quad (\dagger)$$
In the ARMA$(p,q)$ model, the input white noise process $W(n)$ is uncorrelated with its own past sample values as well as with the past values of the system output. This gives
$$E\{W(n)W^*(n-k)\} = 0, \quad k \ge 1,$$
$$E\{W(n)X^*(n-k)\} = 0, \quad k \ge 1.$$
$$r_i = E\{x(n)x^*(n-i)\} = -\sum_{k=1}^{p} a_k E\{x(n-k)x^*(n-i)\} + \sum_{k=0}^{q} b_k E\{w(n-k)x^*(n-i)\}$$
$$= -\sum_{k=1}^{p} a_k r_{i-k} + \sum_{k=0}^{q} b_k E\{w(n-k)x^*(n-i)\}.$$
$$\sum_{k=1}^{p} a_k r_{i-k} + r_i \ne 0, \quad i \le q; \qquad \sum_{k=1}^{p} a_k r_{i-k} + r_i = 0, \quad i \ge q+1. \quad (\ddagger)$$
We note that equation $(\ddagger)$ has the same form as $(\dagger)$ with $\{h_k\}$ replaced by $\{r_k\}$. Hence, the Kronecker conditions for rational systems can be expressed in terms of the output autocorrelations as well.
If $X(n) \sim$ ARMA$(p,q)$ represents a wide-sense stationary stochastic process, then its output autocorrelation sequence $\{r_k\}$ satisfies
$$\operatorname{rank} D_{p-1} = \operatorname{rank} D_{p+k} = p, \quad k \ge 0,$$
where
$$D_k = \begin{bmatrix} r_0 & r_1 & r_2 & \cdots & r_k \\ r_1 & r_2 & r_3 & \cdots & r_{k+1} \\ \vdots & & & & \vdots \\ r_k & r_{k+1} & r_{k+2} & \cdots & r_{2k} \end{bmatrix}$$
represents the $(k+1)\times(k+1)$ Hankel matrix generated from $r_0, r_1, \ldots, r_{2k}$. It follows that for ARMA$(p,q)$ systems we have $\det D_n = 0$ for all sufficiently large $n$.
Cyclostationarity: in the deterministic case, a signal is called periodic if it repeats after a period of time. In this section we look at the statistical equivalent of periodicity. A random process $X(t)$ is said to be $n$th-order cyclostationary if the $n$th-order PDF is periodic:
$$f_X(\vec{x};\,\vec{t}\,) = f_X(\vec{x};\,\vec{t} + T_o^{(n)}\vec{1}\,), \qquad T_o^{(n)} \in \mathbb{R}.$$
The periodicity of the statistics, $T_o^{(n)}$, is often referred to as the cyclostationarity parameter. As in the case of stationarity, $n = 1$ cyclostationarity implies that
$$\mu_X(t) = \mu_X(t + T_o^{(1)}), \qquad \sigma_X(t) = \sigma_X(t + T_o^{(1)}).$$
Second-order cyclostationarity specifically implies that
$$R_{XX}(t_1, t_2) = R_{XX}(t_1 + T_o^{(2)},\, t_2 + T_o^{(2)}).$$
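As a small illustration (my own sketch with arbitrary parameters): amplitude-modulating stationary white noise with a sinusoid produces a second-order cyclostationary process whose instantaneous power $E\{X^2(n)\} = \cos^2(2\pi f_0 n)$ is periodic. Ensemble averaging over independent realizations exposes the periodicity:
  % Second-order cyclostationarity: x(n) = s(n) cos(2*pi*f0*n), s stationary white
  f0 = 0.02;  N = 500;  M = 2000;       % cycle frequency, record length, realizations
  n  = 0:N-1;
  P  = zeros(1,N);                      % ensemble estimate of E{X^2(n)}
  for m = 1:M
      x = randn(1,N).*cos(2*pi*f0*n);   % one realization
      P = P + x.^2/M;
  end
  plot(n, P, n, cos(2*pi*f0*n).^2, '--');  % periodic with period 1/(2*f0) = 25 samples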
Since the ACF of the process depends on both $t_1$ and $t_2$, these processes are non-stationary. Furthermore, the ensemble ACF is periodic, and as a consequence it has a Fourier series expansion of the form
$$R_{XX}(t, \tau) = E\{X(t)X^*(t-\tau)\} = \sum_{n=-\infty}^{\infty} \tilde{R}^{(n)}_{XX}(\tau)\,\exp\!\left(j\,\frac{2\pi n t}{T_o^{(2)}}\right).$$
The Fourier series coefficients $\tilde{R}^{(n)}_{XX}(\tau)$ are referred to as the cyclic autocorrelation function. The DC Fourier coefficient in particular is called the time-averaged ACF and is determined via
$$\tilde{R}^{(0)}_{XX}(\tau) = \frac{1}{T_o^{(2)}}\int_{-T_o^{(2)}/2}^{T_o^{(2)}/2} R_{XX}(t, \tau)\,dt.$$
The corresponding time-averaged power spectrum is defined via the Fourier transform
$$P^{(0)}_{XX}(\Omega) = \int_{-\infty}^{\infty} \tilde{R}^{(0)}_{XX}(\tau)\, e^{-j\Omega\tau}\,d\tau.$$
Using the cyclic autocorrelation function we can define a corresponding cyclic power spectrum $P^{(n)}_{XX}(\Omega)$ via
$$P^{(n)}_{XX}(\Omega) = \int_{-\infty}^{\infty} \tilde{R}^{(n)}_{XX}(\tau)\, e^{-j\Omega\tau}\,d\tau, \qquad \tilde{R}^{(n)}_{XX}(\tau) = \frac{1}{2\pi}\int_{-\infty}^{\infty} P^{(n)}_{XX}(\Omega)\, e^{j\Omega\tau}\,d\Omega.$$
The motivation behind the definition of the time-averaged ACF is to average out the dependence of the ACF on the $t$ variable, so that we do not have to deal with two-dimensional Fourier transforms.