This document discusses key concepts in probability and statistics for signal processing and communications, including:
- Definitions of expectation, linearity, moments, variance, and correlation for random variables
- Properties of two random variables including expectation, correlation, orthogonality, covariance, and independence
- Characteristics of Gaussian random variables including their probability density function and properties under transformations
- Representation of jointly Gaussian random variables using their covariance matrix
1. EENGM0014 Mathematics for Signal Processing and Communications
Tutorial
Soon Yau Cheong
University of Bristol
24 Jan 2017
Soon Yau Cheong (University of Bristol) · EENGM0014 Mathematics for Signal Processing and Communications · 24 Jan 2017 · 1 / 10
2. Expectation of Random Variable
Definition
For X ∼ f_X(x),
E(X) = \int_{-\infty}^{\infty} x f_X(x) \, dx
Linearity
If a and b are constants and g_1, g_2 are functions of X:
E(a) = a
E[a g_1(X) + b g_2(X)] = a E[g_1(X)] + b E[g_2(X)]
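As a quick numerical sketch of the linearity property (my addition, not from the slides; the uniform distribution and the choices g_1(X) = X^2, g_2(X) = X^3 are arbitrary illustrations):

```python
import random

random.seed(0)
N = 100_000
xs = [random.random() for _ in range(N)]  # samples of X ~ Uniform(0, 1)

def mean(vals):
    return sum(vals) / len(vals)

a, b = 3.0, -2.0
g1 = [x ** 2 for x in xs]  # g1(X) = X^2
g2 = [x ** 3 for x in xs]  # g2(X) = X^3

# E[a g1(X) + b g2(X)] versus a E[g1(X)] + b E[g2(X)]
lhs = mean([a * u + b * v for u, v in zip(g1, g2)])
rhs = a * mean(g1) + b * mean(g2)
# The two agree up to floating-point rounding, exactly as linearity predicts.
```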
3. Moment
k-th moment
E(X^k) = \int_{-\infty}^{\infty} x^k f_X(x) \, dx
Second moment (average power)
E(X^2) = \int_{-\infty}^{\infty} x^2 f_X(x) \, dx
Variance
Var(X) = E[(X - E(X))^2]
       = E[X^2 + (E(X))^2 - 2X E(X)]
       = E(X^2) + (E(X))^2 - 2(E(X))^2
       = E(X^2) - (E(X))^2
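The identity Var(X) = E(X^2) - (E(X))^2 can be sanity-checked numerically; this sketch (my addition, stdlib only) uses an arbitrary N(2, 9) example:

```python
import random

random.seed(1)
N = 100_000
xs = [random.gauss(2.0, 3.0) for _ in range(N)]  # X ~ N(2, 9): mean 2, std 3

m1 = sum(xs) / N                 # sample estimate of E(X)
m2 = sum(x * x for x in xs) / N  # sample estimate of E(X^2)

var_via_moments = m2 - m1 ** 2                    # E(X^2) - (E(X))^2
var_direct = sum((x - m1) ** 2 for x in xs) / N   # E[(X - E(X))^2]
# Both estimates agree with each other and are close to the true variance 9.
```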
4. Two Random Variables
Expectation
E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) f_{X,Y}(x, y) \, dx \, dy
The function g(X, Y) may be X, Y, X^2, Y^2, etc.
Correlation
E(XY)
Orthogonality
X and Y are orthogonal if E(XY) = 0
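A numerical illustration (my addition, not from the slides): for X ~ Uniform(-1, 1), the correlation of X with itself is E(X^2) = 1/3, while E(XZ) ≈ 0 for an independent zero-mean Z, so X and Z are orthogonal.

```python
import random

random.seed(2)
N = 100_000
xs = [random.uniform(-1, 1) for _ in range(N)]
zs = [random.uniform(-1, 1) for _ in range(N)]  # independent of X, zero mean

corr_xx = sum(x * x for x in xs) / N              # E(X*X) = E(X^2) = 1/3
corr_xz = sum(x * z for x, z in zip(xs, zs)) / N  # E(XZ) ~ 0: orthogonal
```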
5. Covariance
COV(X, Y) = E[(X - E(X))(Y - E(Y))]
          = E(XY) - E(X)E(Y)
Uncorrelated
X and Y are uncorrelated if COV(X, Y) = 0
Independence implies uncorrelatedness
If X and Y are independent, then E(XY) = E(X)E(Y), and therefore COV(X, Y) = 0.
The converse does not hold: uncorrelated random variables are not necessarily independent.
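The classic counterexample is Y = X^2 with X symmetric about zero: Y is completely determined by X, yet COV(X, Y) = E(X^3) - E(X)E(X^2) = 0. A quick numerical check (my addition, stdlib only):

```python
import random

random.seed(3)
N = 200_000
xs = [random.uniform(-1, 1) for _ in range(N)]  # symmetric about 0
ys = [x * x for x in xs]                        # Y = X^2: fully dependent on X

ex = sum(xs) / N
ey = sum(ys) / N
cov = sum(x * y for x, y in zip(xs, ys)) / N - ex * ey
# cov is close to 0 even though Y is a deterministic function of X
```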
6. Correlation coefficient
\rho_{X,Y} = \frac{COV(X, Y)}{\sqrt{VAR(X)\,VAR(Y)}}
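As a sketch (my addition, not from the slides): for Y = X + N with X and N independent standard Gaussians, COV(X, Y) = 1 and VAR(Y) = 2, so ρ = 1/√2 ≈ 0.707.

```python
import math
import random

random.seed(4)
N = 100_000
xs = [random.gauss(0, 1) for _ in range(N)]
ns = [random.gauss(0, 1) for _ in range(N)]  # independent noise
ys = [x + n for x, n in zip(xs, ns)]         # Y = X + N

def mean(vals):
    return sum(vals) / len(vals)

mx, my = mean(xs), mean(ys)
cov = mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])
vx = mean([(x - mx) ** 2 for x in xs])
vy = mean([(y - my) ** 2 for y in ys])
rho = cov / math.sqrt(vx * vy)  # close to 1/sqrt(2) ~ 0.707
```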
7. Gaussian Random Variable
pdf
For X ∼ N(µ, σ^2), where µ is the mean and σ^2 is the variance,
f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x - \mu)^2}{2\sigma^2}\right)
Affine transformation
If Y = aX + b, where a and b are constants, then Y ∼ N(aµ + b, a^2σ^2).
This is how samples from an arbitrary Gaussian are generated from the standard Gaussian distribution: if Z ∼ N(0, 1), then σZ + µ ∼ N(µ, σ^2).
Summation of independent GRVs
If Y = X_1 + X_2, then
Y ∼ N(µ_{X_1} + µ_{X_2}, σ^2_{X_1} + σ^2_{X_2})
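Both closure properties can be checked by simulation; this sketch (my addition, stdlib only) uses arbitrary parameters X1 ~ N(1, 4), X2 ~ N(3, 1), a = 2, b = 5:

```python
import random
import statistics

random.seed(5)
N = 200_000
x1 = [random.gauss(1.0, 2.0) for _ in range(N)]  # X1 ~ N(1, 4)
x2 = [random.gauss(3.0, 1.0) for _ in range(N)]  # X2 ~ N(3, 1)

# Affine transform: Y = a*X1 + b  =>  Y ~ N(a*mu + b, a^2 * sigma^2) = N(7, 16)
a, b = 2.0, 5.0
y = [a * u + b for u in x1]
my, vy = statistics.mean(y), statistics.variance(y)

# Sum of independent GRVs: means and variances add  =>  S ~ N(4, 5)
s = [u + v for u, v in zip(x1, x2)]
ms, vs = statistics.mean(s), statistics.variance(s)
```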
8. Jointly GRVs
10. Vector Form
f_X(x_1, ..., x_k) = \frac{1}{\sqrt{(2\pi)^k |\Sigma|}} \exp\left[-\frac{1}{2}(x - \mu)^T \Sigma^{-1} (x - \mu)\right]
Σ is the covariance matrix; for k = 2,
\Sigma = \begin{pmatrix} \sigma_1^2 & \rho_{12}\sigma_1\sigma_2 \\ \rho_{12}\sigma_1\sigma_2 & \sigma_2^2 \end{pmatrix}
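For k = 2, correlated samples with this Σ can be generated from independent standard Gaussians via the 2×2 Cholesky factor of Σ; the sketch below (my addition, stdlib only, with arbitrary σ1 = 2, σ2 = 1, ρ12 = 0.6) checks that the sample covariance matches ρ12 σ1 σ2:

```python
import math
import random

random.seed(6)
sigma1, sigma2, rho = 2.0, 1.0, 0.6
N = 200_000

# Cholesky factor of Sigma = [[s1^2, rho*s1*s2], [rho*s1*s2, s2^2]]:
# X1 = s1*Z1,  X2 = s2*(rho*Z1 + sqrt(1 - rho^2)*Z2),  with Z1, Z2 ~ N(0,1) iid
x1s, x2s = [], []
for _ in range(N):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    x1s.append(sigma1 * z1)
    x2s.append(sigma2 * (rho * z1 + math.sqrt(1 - rho ** 2) * z2))

m1 = sum(x1s) / N
m2 = sum(x2s) / N
cov12 = sum((u - m1) * (v - m2) for u, v in zip(x1s, x2s)) / N
# cov12 is close to rho * sigma1 * sigma2 = 1.2
```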