2. Contents
Definition - Discrete Time Random Process
Bernoulli’s Process
Moments-Ensemble Averages
Stationary Process-WSS
Matrix Forms
Parseval’s Theorem
Wiener-Khinchine relation
PSD
Filtering of Random Process
Spectral factorization
Bias-Consistency
Special types of RP
Yule-Walker Equation
3. Discrete Time Random Process:
A random variable may be thought of as a mapping from the
sample space of an experiment into a set of real or complex
values.
A discrete-time random process may be thought of as a mapping
from the sample space Ω into a set of discrete-time signals.
It is simply an indexed sequence of random variables.
Example: Tossing a coin, Rolling a die
4. Bernoulli’s Process:
If the outcome of one trial does not affect the outcome of
any other trial at any time, the process is called a
Bernoulli process.
The moments are,
Mean: the average of the outcomes,
µ = (1/n) ∑ x(i), i = 1 to n
Variance: how far the random values are from the
central mean,
σ² = (1/n) ∑ (x(i) − µ)², i = 1 to n
5. Skewness: measures the asymmetry of the distribution about the mean,
S = (1/n) ∑ (x(i) − µ)³ / σ³
Kurtosis: measures the flatness or peakedness of the distribution,
K = (1/n) ∑ (x(i) − µ)⁴ / σ⁴
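The four moment formulas above can be sketched on a simulated Bernoulli process; the choices p = 0.5 and n = 100000 are illustrative, not from the slides.

```python
import numpy as np

# Estimate the four moments of a simulated Bernoulli process.
rng = np.random.default_rng(0)
p, n = 0.5, 100_000
x = rng.binomial(1, p, size=n).astype(float)   # independent Bernoulli trials

mu = x.mean()                                  # mean:     (1/n) * sum x(i)
var = np.mean((x - mu) ** 2)                   # variance: (1/n) * sum (x(i)-mu)^2
sigma = np.sqrt(var)
skew = np.mean((x - mu) ** 3) / sigma ** 3     # skewness
kurt = np.mean((x - mu) ** 4) / sigma ** 4     # kurtosis

print(mu, var, skew, kurt)   # mu ~ p = 0.5, var ~ p(1-p) = 0.25
```

For p = 0.5 the distribution is symmetric, so the skewness is near zero and the kurtosis is near 1.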
ERGODICITY:
When the time average of the process is equal to the ensemble average,
the process is said to be "ergodic", i.e., E[x(n)] = ⟨x(n)⟩,
the time average over a single realization.
6. ENSEMBLE AVERAGES:
Mean: Mx(n) = E[x(n)]
Variance: σ²x(n) = E[|x(n) − Mx(n)|²]
Auto-correlation: measures the relationship between the random
variables of the same process,
rx(k,l) = E[x(k) x*(l)]
Auto-covariance: Cx(k,l) = E[(x(k) − Mx(k)) (x(l) − Mx(l))*]
Cross-correlation: rxy(k,l) = E[x(k) y*(l)]
Cross-covariance: Cxy(k,l) = E[(x(k) − Mx(k)) (y(l) − My(l))*]
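The ensemble averages above can be estimated by averaging across many realizations; the process (white Gaussian noise) and the sizes below are illustrative assumptions.

```python
import numpy as np

# Ensemble averages estimated across realizations of a process.
rng = np.random.default_rng(1)
num_real, length = 5_000, 8
X = rng.standard_normal((num_real, length))   # each row is one realization x(n)

m = X.mean(axis=0)                            # mean Mx(n) = E[x(n)], per time index
k, l = 3, 3
r_kl = np.mean(X[:, k] * np.conj(X[:, l]))    # rx(k,l) = E[x(k) x*(l)]
c_kl = np.mean((X[:, k] - m[k]) * np.conj(X[:, l] - m[l]))   # Cx(k,l)

print(m[k], r_kl, c_kl)   # zero-mean noise: rx(3,3) ~ 1 and Cx ~ rx
```

Because the noise is zero-mean, the covariance and correlation estimates agree, matching the relations on the next slide.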
7. RELATIONS
Relation between rx & Cx:
Cx(k,l) = rx(k,l) − Mx(k) Mx*(l)
If the mean is zero,
Cx(k,l) = rx(k,l)
Relation between rxy & Cxy:
Cxy(k,l) = rxy(k,l) − Mx(k) My*(l)
If the means are zero,
Cxy(k,l) = rxy(k,l)
•Two random processes x(n) & y(n) are said to be uncorrelated
if Cxy(k,l) = 0.
•Two random processes x(n) & y(n) are said to be orthogonal
if rxy(k,l) = 0.
8. STATIONARY PROCESS
A process is said to be stationary when all the statistical
averages (mean, variance, etc.) are independent of time,
i.e., for first order, Mx(n) = Mx
σ²x(n) = σ²x
For second order, rx(k,l) = rx(k − l, 0), written rx(k − l)
Example: Quantization Error
9. WIDE SENSE STATIONARY PROCESS:
Case 1: x(n) is WSS if
•the mean of the process is a constant Mx,
•the autocorrelation of the process depends only on the
difference k − l,
•the variance of the process is finite.
Case 2: x(n) and y(n) are said to be jointly WSS if each is
WSS and the cross-correlation depends only on the difference k − l,
rxy(k − l) = E[x(k) y*(l)]
10. PROPERTIES OF WSS & AUTO CORRELATION:
1. Symmetry: rx(k) = rx*(−k)
2. Mean-square value: rx(0) = E[|x(n)|²] ≥ 0
3. Maximum value: |rx(k)| ≤ rx(0)
4. Periodicity: if rx(k0) = rx(0) for some k0, then x(n) is
mean-square periodic, E[|x(n) − x(n − k0)|²] = 0.
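The symmetry and maximum-value properties can be checked on a time-average estimate of rx(k) from one long realization; the MA filter [1, 0.5] and the length are illustrative choices.

```python
import numpy as np

# Check autocorrelation properties on a correlated WSS-like signal.
rng = np.random.default_rng(2)
n = 50_000
x = np.convolve(rng.standard_normal(n), [1.0, 0.5], mode="same")

def acorr(x, k):
    """Time-average estimate of rx(k) = E[x(n) x*(n-k)]."""
    n = len(x)
    if k >= 0:
        return np.mean(x[k:] * np.conj(x[:n - k]))
    return np.mean(x[:n + k] * np.conj(x[-k:]))

r0 = acorr(x, 0)   # mean-square value, rx(0) >= 0
r3 = acorr(x, 3)
# Property 1: rx(k) = rx*(-k); property 3: |rx(k)| <= rx(0)
print(r0, r3, acorr(x, -3))
```

For this real-valued signal the symmetry holds exactly, and rx(3) is far below rx(0).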
11. MATRIX AND ITS PROPERTIES
The autocorrelation & autocovariance can be expressed in
the form of a matrix.
PROPERTIES:
The autocorrelation matrix of a WSS process x(n) is a Hermitian
Toeplitz matrix.
It is nonnegative definite (positive semidefinite).
The eigenvalues λk are real and nonnegative.
12. IMPORTANT MATRIX FORMS
Orthogonal Matrix: A^T = A^-1
Hermitian Matrix: A = A^H, where A^H = [A*]^T = [A^T]*
Skew-Hermitian Matrix: A = -A^H
Toeplitz Matrix => all the elements along each diagonal are the
same.
Hankel Matrix => all the elements along each anti-diagonal are the
same; an M×N Hankel matrix has M + N − 1 distinct entries.
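The Hermitian Toeplitz structure of the autocorrelation matrix, and its real nonnegative eigenvalues, can be sketched numerically; the sequence rx(k) = 2·(0.5)^k is an illustrative AR(1)-style choice.

```python
import numpy as np

# Build the autocorrelation matrix Rx of a WSS process from rx(0..3)
# and verify the properties listed above.
r = np.array([2.0, 1.0, 0.5, 0.25])   # rx(0), rx(1), rx(2), rx(3)
p = len(r)
R = np.empty((p, p))
for i in range(p):
    for j in range(p):
        R[i, j] = r[abs(i - j)]       # constant along each diagonal (Toeplitz)

eig = np.linalg.eigvalsh(R)           # Hermitian eigensolver: real eigenvalues
print(R)
print(eig)                            # all nonnegative for a valid rx sequence
```

For real data the matrix is symmetric Toeplitz; in the complex case `R[i, j]` would use `np.conj(r[j - i])` for j > i.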
13. PARSEVAL’S THEOREM (OR) RAYLEIGH ENERGY
FORMULA
The sum (or integral) of the square of a function is equal to the
sum (or integral) of the square of its transform.
That is, E = ⟨x, x⟩ = ∑ |x(n)|² = (1/2π) ∫ |X(e^jw)|² dw
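Parseval's relation can be verified numerically for a finite sequence using the DFT, where it reads ∑|x(n)|² = (1/N) ∑|X(k)|²; the sequence below is arbitrary.

```python
import numpy as np

# Verify Parseval's theorem for a length-16 sequence via the DFT.
rng = np.random.default_rng(3)
x = rng.standard_normal(16)
X = np.fft.fft(x)

time_energy = np.sum(np.abs(x) ** 2)            # energy in the time domain
freq_energy = np.sum(np.abs(X) ** 2) / len(x)   # energy in the transform domain
print(time_energy, freq_energy)                 # the two agree
```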
14. WIENER-KHINCHINE RELATION
For a well-behaved stationary random process, the power
spectrum is equal to the Fourier transform of the
autocorrelation function.
15. POWER SPECTRAL DENSITY
The PSD of the process is given by
Px(e^jw) = ∑ rx(k) e^(-jwk), k = -∞ to ∞
The power spectrum of x(n) in terms of z,
Px(z) = ∑ rx(k) z^(-k), k = -∞ to ∞
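The PSD sum can be evaluated directly for a truncated autocorrelation sequence; rx(k) = a^|k| with a = 0.5 is an illustrative AR(1)-style choice whose PSD has the closed form (1 − a²)/(1 − 2a·cos w + a²).

```python
import numpy as np

# Evaluate Px(e^jw) = sum_k rx(k) e^(-jwk) for a truncated rx(k).
a = 0.5
K = 50                                   # truncation: k = -K..K
k = np.arange(-K, K + 1)
r = a ** np.abs(k)                       # rx(k) = a^|k|

w = np.linspace(-np.pi, np.pi, 201)
Px = np.real(np.exp(-1j * np.outer(w, k)) @ r)   # real, since rx is real and even

# Closed form of the same PSD for comparison.
Px_exact = (1 - a**2) / (1 - 2 * a * np.cos(w) + a**2)
print(np.max(np.abs(Px - Px_exact)))     # truncation error is tiny
```

Note the PSD comes out real and nonnegative, as a power spectrum must.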
16. FILTERING OF RANDOM PROCESS
Consider a linear shift-invariant (LSI) system (or filter) with unit
sample response h(n), now driven by a WSS random input rather than a
deterministic signal. The input is x(n) and the output is y(n); the
power spectra are related by
Py(z) = Px(z) H(z) H*(1/z*)
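One consequence of Py = Px·|H|² can be checked numerically: for unit-variance white noise, the output power ry(0) equals ∑|h(n)|². The FIR filter h = [1, 0.5] and the length are illustrative choices.

```python
import numpy as np

# Filter white noise and compare the output power against theory.
rng = np.random.default_rng(4)
h = np.array([1.0, 0.5])
x = rng.standard_normal(200_000)             # white input: Px(e^jw) = 1
y = np.convolve(x, h, mode="full")[: len(x)]

power_measured = np.mean(y ** 2)             # ry(0), the output power
power_theory = np.sum(h ** 2)                # (1/2pi) ∫ |H|^2 dw = sum h(n)^2
print(power_measured, power_theory)          # both ~ 1.25
```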
18. Wold Decomposition Theorem:
A general random process can be written as the sum of a
regular random process xr(n) and a predictable process xp(n),
x(n) = xr(n) + xp(n)
19. Bias-Consistency
The difference between the actual value and the expected value
of the estimate is called the 'Bias' B,
B = ϴ − E[ϴ^N]
ϴ - actual value
ϴ^N - estimate from N samples
An estimate is asymptotically unbiased if
Lt E[ϴ^N] = ϴ
N->∞
An estimate is consistent if it converges in the mean-square sense,
Lt E[|ϴ − ϴ^N|²] = 0
N->∞
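The bias definition can be illustrated with the 1/N sample-variance estimator, which is biased (E[ϴ^N] = (N−1)/N · σ²) yet asymptotically unbiased and consistent; the Monte Carlo settings below are illustrative.

```python
import numpy as np

# Measure the bias of the 1/N sample-variance estimator by Monte Carlo.
rng = np.random.default_rng(5)
sigma2 = 4.0                               # actual value of the parameter
N, trials = 10, 200_000
X = rng.normal(0.0, np.sqrt(sigma2), size=(trials, N))

# ϴ^N = (1/N) sum (x - xbar)^2, computed for each trial
est = np.mean((X - X.mean(axis=1, keepdims=True)) ** 2, axis=1)

bias = sigma2 - est.mean()                 # B = ϴ - E[ϴ^N]
print(est.mean(), bias)                    # E[ϴ^N] ~ (N-1)/N * sigma2 = 3.6
```

As N grows the factor (N−1)/N tends to 1, so the bias vanishes, matching the limit above.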
20. SPECIAL TYPES OF RP
The types are,
ARMA Process - ARMA(p,q)
AR Process (Auto Regressive) - ARMA(p,0)
MA Process (Moving Average) - ARMA(0,q)
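An AR process ties together several ideas from these slides: for an AR(1) process x(n) = a·x(n−1) + w(n), the Yule-Walker relation gives a = rx(1)/rx(0), so the coefficient can be recovered from autocorrelation estimates. The value a = 0.7 and the length are illustrative choices.

```python
import numpy as np

# Simulate an AR(1) process and recover its coefficient via Yule-Walker.
rng = np.random.default_rng(6)
a, n = 0.7, 200_000
w = rng.standard_normal(n)
x = np.empty(n)
x[0] = w[0]
for t in range(1, n):
    x[t] = a * x[t - 1] + w[t]           # AR(1) recursion

r0 = np.mean(x * x)                      # time-average estimate of rx(0)
r1 = np.mean(x[1:] * x[:-1])             # time-average estimate of rx(1)
a_hat = r1 / r0                          # Yule-Walker estimate of a
print(a_hat)                             # close to 0.7
```

For higher orders the same idea becomes a linear system R a = r, solved with the Toeplitz autocorrelation matrix from slide 11.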