Estimating Space-Time Covariance
from Finite Sample Sets
Stephan Weiss
Centre for Signal & Image Processing
Department of Electronic & Electrical Engineering
University of Strathclyde, Glasgow, Scotland, UK
TeWi Seminar, Alpen Adria University, 22 May 2019
Thanks to: I.K. Proudler, J. Pestana, F. Coutts, C. Delaosa
This work is supported by the Engineering and Physical Sciences Research Council (EPSRC) Grant number EP/S000631/1 and the MOD University Defence Research Collaboration in Signal Processing.
Presentation Overview
1. Overview;
2. a reminder of statistics background;
3. a reminder on auto- and cross-correlation sequences;
4. mid-talk exam;
5. sample space-time covariance matrix;
6. cross-correlation estimation;
7. some results and comparisons;
8. applications: support estimation and eigenvalue perturbation;
9. summary; and
10. a shameless last slide.
Random Signals/ Stochastic Processes
A stochastic process x[n] is characterised by deterministic measures:
◮ the probability density function (PDF), or normalised histogram, p(x):
p(x) ≥ 0 ∀ x and ∫_{−∞}^{∞} p(x) dx = 1
◮ the PDF's moments of order l:
∫_{−∞}^{∞} x^l p(x) dx
◮ specifically, note that the first moment l = 1 is the mean µ, and
that the second moment l = 2 is the variance σ² if µ = 0;
◮ the autocorrelation function of the process x[n].
Probability Density Function
◮ Random data can be characterised by its distribution of
amplitude values:
[figure: sketch of the amplitude PDF p(x), and a realisation x[n] plotted over the time index n]
◮ the PDF describes the probability P with which amplitude values of
x[n] fall within a specific interval [x1 ; x2]:
P(x ∈ [x1 ; x2]) = ∫_{x1}^{x2} p(x) dx
◮ a histogram of the data can be used to estimate the PDF . . .
Probability Density Function Estimation
◮ Histogram estimation based on 10³ samples:
[histogram: relative frequency over sample values x]
◮ histogram based on 10⁴ samples:
[histogram: relative frequency over sample values x]
◮ histogram based on 10⁵ samples:
[histogram: relative frequency over sample values x]
◮ for consistent estimates, we need as much data as possible!
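As a rough numerical illustration (my own sketch, not part of the slides), the following NumPy code forms such a histogram-based PDF estimate for growing sample sizes; the Gaussian test signal and the bin edges are arbitrary choices:

    import numpy as np

    rng = np.random.default_rng(0)
    edges = np.linspace(-4, 4, 41)                 # 40 bins over [-4, 4]

    for N in (10**3, 10**4, 10**5):
        x = rng.standard_normal(N)                 # zero-mean, unit-variance test data
        counts, _ = np.histogram(x, bins=edges)
        p_hat = counts / (N * np.diff(edges))      # normalise so the estimate integrates to 1
        # the centre bin should approach the true peak 1/sqrt(2*pi) ~ 0.399 as N grows
        print(N, p_hat[len(p_hat) // 2])

Repeating the loop with different seeds shows the estimate fluctuating less and less as N grows, which is the consistency argument made above.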
Gaussian or Normal Distribution
◮ For the Gaussian or normal PDF, x ∈ N(µ, σ²):
p(x) = 1/(√(2π) σ) · e^{−(x−µ)²/(2σ²)}   (1)
◮ mean is µ, variance is σ²;
◮ sketch for x ∈ N(0, 1):
[plot: p(x) over x for the standard normal PDF]
◮ central limit theorem: the sum of arbitrarily distributed processes
converges to a Gaussian PDF;
Uniform Distribution
◮ A uniform distribution has equal probability of amplitude values
within a specified interval;
◮ e.g. x= rand() in Matlab produces samples x ∈ [0 ; 1] with the
following PDF:
[plot: the uniform PDF, p(x) = 1 for x ∈ [0 ; 1] and zero elsewhere]
◮ mean and variance are
µ = ∫_{−∞}^{∞} x p(x) dx = ∫_{0}^{1} x dx = [x²/2]_{0}^{1} = 1/2   (2)
σ² = ∫_{0}^{1} x² dx − µ² = [x³/3]_{0}^{1} − 1/4 = 1/12   (3)
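As a quick numerical cross-check of (2) and (3) (my own sketch, not from the slides), a time average over uniform samples should approach µ = 1/2 and σ² = 1/12:

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.uniform(0.0, 1.0, size=100_000)      # NumPy counterpart of Matlab's rand()

    mu_hat = x.mean()                            # should approach 1/2
    var_hat = np.mean(x**2) - mu_hat**2          # second moment minus squared mean
    print(mu_hat, var_hat, 1/12)                 # 1/12 ~ 0.0833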
Other PDFs
◮ PDF of a binary phase shift keying (BPSK) symbol sequence,
which is a type of Bernoulli distribution:
[PDF sketch: two impulses of weight 1/2 at x = −1 and x = +1]
◮ PDFs for complex valued signals also exist;
◮ example for the PDF of a quaternary phase shift keying (QPSK)
sequence:
[PDF sketch over the complex plane: four impulses of weight 1/4 at x = +1, −1, +j, −j]
Complex Gaussian Distribution
◮ PDF of a complex Gaussian process with independent and
identically distributed (IID) real and imaginary parts:
[3-d plot: p(x) over the real part ℜ{x} and the imaginary part ℑ{x}]
◮ this leads to a circularly-symmetric PDF.
Central Limit Theorem
◮ Theorem: adding arbitrarily distributed but independent signals
will, in the limit, tend towards a Gaussian distribution;
◮ example: y[n] = h[n] ∗ x[n], with x[n] a sequence of independent
BPSK symbols:
[figure: histogram of the BPSK input x[n], the filter impulse response h[n] over index n, and the bell-shaped histogram of the output y[n]]
◮ the filter sums differently weighted independent random processes,
and it does not take many to make the output look Gaussian!
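A minimal sketch of this experiment (the 21-tap window below is an arbitrary stand-in for the slide's h[n]): filtering independent BPSK symbols already produces an output whose histogram looks Gaussian.

    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.choice([-1.0, 1.0], size=200_000)    # independent BPSK symbols
    h = np.hanning(21)                           # arbitrary lowpass impulse response
    h /= np.sqrt(np.sum(h**2))                   # unit-energy scaling
    y = np.convolve(x, h, mode='valid')          # y[n] = h[n] * x[n]

    counts, edges = np.histogram(y, bins=50)
    print(y.var().round(3), counts.argmax())     # output variance ~ 1, single central peak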
Stationarity and Ergodicity
◮ Stationarity means that the statistical moments of a random
process do not change over time;
◮ a weaker condition is wide-sense stationarity (WSS), i.e. moments
up to second order (mean and variance) are constant over time;
this is sufficient unless higher order statistics (HOS) algorithms
are deployed;
◮ a stochastic process is ergodic if the expectation operation can be
replaced by a temporal average,
σ²_xx = ∫_{−∞}^{∞} x² p(x) dx = E{x[n] x*[n]} = lim_{N→∞} (1/N) Σ_{n=0}^{N−1} |x[n]|²   (4)
◮ remember: expectation is an average over an ensemble; a
temporal average is performed over a single realisation of the ensemble!
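A small sketch of (4) under the ergodicity assumption (sizes and the variance value are arbitrary): the time average over one long realisation and an ensemble average at one fixed time index should both approach σ²_xx.

    import numpy as np

    rng = np.random.default_rng(3)
    sigma2 = 2.0

    # one long realisation: time average of |x[n]|^2
    x_single = np.sqrt(sigma2) * rng.standard_normal(100_000)
    time_avg = np.mean(np.abs(x_single)**2)

    # many short realisations: ensemble average at a fixed time index
    ensemble = np.sqrt(sigma2) * rng.standard_normal((10_000, 50))
    ens_avg = np.mean(np.abs(ensemble[:, 25])**2)

    print(time_avg, ens_avg, sigma2)             # all three values should be close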
Sample Size Matters!
◮ When estimating quantities such as PDF, mean or variance, the
estimator should be bias-free, i.e. converge towards the desired
value;
◮ consistency refers to the variability of the estimator around the
asymptotic value;
◮ the more samples, the better the consistency of the estimate;
◮ mean µ̂ and variance σ̂² of a uniformly distributed signal:
[plots: the estimates µ̂ and σ̂² over the sample size, N = 10¹ . . . 10⁵]
Moving Average (MA) Model / Signal
◮ The PDF does not contain any information on how “correlated”
successive samples are;
◮ consider the following scenario with x[n] ∈ N(0, σ²_xx) being
uncorrelated (successive samples are entirely random):
[block diagram: x[n] with N(0, σ²_xx) → filter b[n] → y[n] = x[n] ∗ b[n] with N(0, σ²_yy)]
◮ y[n] is called a moving average process (and b[n] an MA model)
of order N − 1 if y[n] = Σ_{ν=0}^{N−1} b[ν] x[n − ν] is a weighted average
over a window of N input samples.
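A short sketch of such an MA signal (the coefficients b[ν] are an arbitrary example): for white input, the output variance equals σ²_xx · Σ_ν b²[ν].

    import numpy as np

    rng = np.random.default_rng(4)
    b = np.array([0.5, 1.0, 0.5])                # arbitrary MA model of order 2
    x = rng.standard_normal(50_000)              # uncorrelated Gaussian input

    # y[n] = sum_nu b[nu] * x[n - nu]
    y = np.convolve(x, b, mode='valid')

    print(x.var(), y.var(), np.sum(b**2))        # y.var() ~ x.var() * sum(b^2)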
Filtering a Random Signal
◮ Consider lowpass filtering an uncorrelated Gaussian signal x[n]:
[block diagram: x[n] with N(0, σ²_x) → lowpass filter h[n] → y[n] = x[n] ∗ h[n] with N(0, σ²_y)]
[figure: realisation x[n] over time n, the magnitude response |H(e^{jΩ})| over the normalised angular frequency Ω/π, and the smoothly varying output y[n] over time n]
◮ the output will have Gaussian distribution, but the signal only
changes smoothly: neighbouring samples are correlated. We need
a measure!
Auto-Correlation Function I
◮ The correlation between a sample x[n] and a neighbouring value
x[n − τ] is given by
r_xx[τ] = E{x[n] · x*[n − τ]} = lim_{N→∞} (1/N) Σ_{n=0}^{N−1} x[n] · x*[n − τ]   (5)
◮ For two specific lags τ = −3 (left) and τ = −50 (right),
consider:
[two plots over time n: x[n] overlaid with x[n+3] (left), and x[n] overlaid with x[n+50] (right)]
◮ the curves on the left look “similar”, the ones on the right
“dissimilar”.
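The limit in (5) can only be approximated from finite data; the sketch below (my own, with an arbitrary 5-tap averaging filter to create correlation) evaluates the time average for a few lags.

    import numpy as np

    def acf_est(x, max_lag):
        """Time-average estimate of r_xx[tau] = E{x[n] x*[n - tau]}, cf. (5)."""
        N = len(x)
        r = np.zeros(2 * max_lag + 1, dtype=complex)
        for i, tau in enumerate(range(-max_lag, max_lag + 1)):
            if tau >= 0:
                r[i] = np.sum(x[tau:] * np.conj(x[:N - tau])) / N
            else:
                r[i] = np.sum(x[:N + tau] * np.conj(x[-tau:])) / N
        return r

    rng = np.random.default_rng(5)
    x = np.convolve(rng.standard_normal(100_000), np.ones(5) / 5, mode='valid')
    r = acf_est(x, max_lag=8)
    print(r.real.round(3))        # peak at tau = 0, decaying to ~0 for |tau| >= 5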
Auto-Correlation Function II
◮ For lag zero, note:
r_xx[0] = lim_{N→∞} (1/N) Σ_{n=0}^{N−1} x[n] · x*[n] = σ²_x + µ²_x   (6)
◮ This value for τ = 0 is the maximum of the auto-correlation
function rxx[τ];
[sketch: r_xx[τ] over the lag τ, peaking at τ = 0 and decaying for larger |τ|]
◮ large values in the ACF indicate strong correlation, small values
weak correlation;
Auto-Correlation Function III
◮ If a signal has no self-similarity, i.e. it is “completely random”, the
ACF takes the following form:
[sketch: r_xx[τ] over the lag τ, a single impulse at τ = 0]
◮ If we take the Fourier transform of r_xx[τ], we obtain a flat
spectrum (or a lowpass spectrum for the ACF on the previous slide);
◮ due to the presence of all frequency components in a flat
spectrum, a completely random signal is often referred to as
“white noise”.
Power Spectral Density
◮ The power spectral density (PSD), R_xx(e^{jΩ}), defines the
spectrum of a random signal:
R_xx(e^{jΩ}) = Σ_{τ=−∞}^{∞} r_xx[τ] e^{−jΩτ}   (7)
◮ PSD and ACF form a Fourier pair, r_xx[τ] ◦—• R_xx(e^{jΩ}),
therefore
r_xx[τ] = (1/2π) ∫_{−π}^{π} R_xx(e^{jΩ}) e^{jΩτ} dΩ   (8)
◮ note that the power of x[n] is (similar to Parseval)
r_xx[0] = (1/2π) ∫_{−π}^{π} R_xx(e^{jΩ}) dΩ (= scaled area under the PSD)   (9)
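A small sketch of the Fourier pair (7) to (9), assuming a simple correlated test signal and a truncated lag range: the DTFT of an estimated ACF is evaluated on a frequency grid, and its mean over the grid recovers r_xx[0] as in (9).

    import numpy as np

    rng = np.random.default_rng(6)
    x = np.convolve(rng.standard_normal(50_000), [0.5, 1.0, 0.5], mode='valid')
    N = len(x)

    max_lag = 20
    taus = np.arange(-max_lag, max_lag + 1)
    r_full = np.correlate(x, x, mode='full') / N           # biased ACF estimate, all lags
    r = r_full[N - 1 - max_lag : N + max_lag]               # keep |tau| <= max_lag

    # eq. (7): R(e^{j Omega}) = sum_tau r[tau] e^{-j Omega tau}, on a frequency grid
    Omega = np.linspace(-np.pi, np.pi, 1024, endpoint=False)
    R = np.array([np.sum(r * np.exp(-1j * w * taus)) for w in Omega])

    # eq. (9): r[0] equals the scaled area under the PSD (here: the mean over the grid)
    print(r[max_lag], np.real(np.mean(R)))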
Mid-Talk “Exam”
◮ We are given a unit variance, zero mean (µ = 0) signal x[n];
◮ we want to estimate the mean, ˆµ;
◮ Question 1: how does the sample size affect the estimation error
|µ − µ̂|²?
◮ Question 2: does it matter whether x[n] has a lowpass or
highpass characteristic?
Mean Estimation
◮ Our estimator is simple:
µ̂ = (1/N) Σ_{n=0}^{N−1} x[n] ;
◮ the mean of this estimator:
mean{µ̂} = E{µ̂} = (1/N) Σ_{n=0}^{N−1} E{x[n]} = (1/N) Σ_{n=0}^{N−1} µ = µ
◮ hurray — the estimator is unbiased;
◮ for the error, we look towards the variance of the estimator:
var{µ̂} = E{|µ̂ − µ|²}
◮ this is going to be a bit trickier . . .
Variance of Mean Estimator
◮ tedious but hopefully rewarding:
var{µ̂} = E{(µ̂ − µ)(µ̂ − µ)*}   (10)
= E{µ̂ µ̂*} − E{µ̂} µ* − µ E{µ̂*} + µ µ*   (11)
= E{ (1/N²) Σ_{n=0}^{N−1} x[n] Σ_{ν=0}^{N−1} x*[ν] } − µ µ*   (12)
= (1/N²) Σ_{n=0}^{N−1} Σ_{m=n−N+1}^{n} E{x[n] x*[n − m]} − µ µ*   (with m = n − ν)   (13)
= (1/N²) Σ_{n=0}^{N−1} Σ_{m=n−N+1}^{n} r_xx[m] − µ µ*   (14)
= (1/N²) Σ_{τ=−N+1}^{N−1} (N − |τ|) r_xx[τ] − µ µ*   (15)
◮ so, here are the answers: by (15), the error variance shrinks roughly as 1/N; but through r_xx[τ] it also depends on the spectral shape: positively correlated (lowpass) samples slow the convergence down, while a highpass x[n] with negative correlation can even speed it up. A numerical check follows below.
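Here is a hedged numerical check of (15) (my own example, not from the slides): for x[n] = w[n] + a·w[n−1] with white, unit-variance w[n], the ACF is r_xx[0] = 1 + a², r_xx[±1] = a, and the prediction is compared against an ensemble of sample means.

    import numpy as np

    rng = np.random.default_rng(7)
    a, N, trials = 0.9, 200, 20_000

    # eq. (15) for this MA(1) model: r_xx[0] = 1 + a^2, r_xx[+-1] = a, mu = 0
    var_pred = (N * (1 + a**2) + 2 * (N - 1) * a) / N**2

    mu_hat = np.empty(trials)
    for t in range(trials):
        w = rng.standard_normal(N + 1)
        x = w[1:] + a * w[:-1]                   # lowpass-correlated test signal
        mu_hat[t] = x.mean()

    print(var_pred, mu_hat.var())                # the two values should agree closely

Setting a negative (highpass correlation) lowers both the predicted and the measured variance below the white-noise value, which answers Question 2.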
Space-Time Covariance Matrix
◮ Measurements obtained from M sensors are collected in a
vector x[n] ∈ ℂ^M:
x^T[n] = [x1[n] x2[n] . . . xM[n]] ;   (16)
◮ with the expectation operator E{·}, the spatial correlation is
captured by R = E{x[n] x^H[n]};
◮ for spatial and temporal correlation, we require a space-time
covariance matrix
R[τ] = E{x[n] x^H[n − τ]}   (17)
◮ this space-time covariance matrix contains auto- and
cross-correlation terms, e.g. for M = 2
R[τ] = [ E{x1[n] x1*[n − τ]}   E{x1[n] x2*[n − τ]} ;
         E{x2[n] x1*[n − τ]}   E{x2[n] x2*[n − τ]} ]   (18)
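A compact sketch of a sample-based estimate of R[τ] for M = 2 (the mixing filters below are arbitrary illustrations, and a biased 1/N normalisation is used):

    import numpy as np

    def st_cov_est(X, tau):
        """Biased estimate of R[tau] = E{x[n] x^H[n - tau]} from data X of shape (M, N)."""
        M, N = X.shape
        if tau >= 0:
            return X[:, tau:] @ X[:, :N - tau].conj().T / N
        return X[:, :N + tau] @ X[:, -tau:].conj().T / N

    rng = np.random.default_rng(8)
    u = rng.standard_normal(100_000)                        # a single common source
    x1 = np.convolve(u, [1.0, 0.5], mode='full')[:len(u)]   # arbitrary channel filters
    x2 = np.convolve(u, [0.0, 1.0, -0.3], mode='full')[:len(u)]
    X = np.vstack([x1, x2])

    for tau in (-1, 0, 1):
        print(tau, st_cov_est(X, tau).round(3))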
Cross-Spectral Density Matrix
◮ example for a space-time covariance matrix R[τ] ∈ ℝ^{2×2}:
[2 × 2 grid of plots: the four entries of R[τ] over the lag τ]
◮ the cross-spectral density (CSD) matrix: R(z) ◦—• R[τ].
Exact Space-Time Covariance Matrix
◮ We assume knowledge of a source model that ties the
measurement vector x[n] to mutually independent, uncorrelated
unit variance signals uℓ[n]:
[block diagram: mutually independent unit-variance sources u1[n] . . . uL[n] drive a MIMO system H[n] with outputs x1[n] . . . xM[n]]
◮ then the space-time covariance matrix is
R[τ] = Σ_n H[n] H^H[n − τ] ,
◮ or for the CSD matrix:
R(z) = H(z) H^P(z) .
Biased Estimator
◮ To estimate from finite data, e.g.
r̂^(biased)_mµ[τ] = (1/N) Σ_{n=0}^{N−τ−1} x_m[n + τ] x_µ*[n] ,  τ ≥ 0 ;
r̂^(biased)_mµ[τ] = (1/N) Σ_{n=0}^{N+τ−1} x_m[n] x_µ*[n − τ] ,  τ < 0 .   (19)
◮ or R̂^(biased)_mµ(z) = (1/N) X_m(z) X_µ*(z^{−1}) = (1/N) X_m(z) X_µ^P(z);
◮ for the CSD matrix:
R̂^(biased)(z) = (1/N) x(z) x^P(z) .   (20)
◮ this is a rank one matrix by definition!
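To make the rank-one property concrete, here is a sketch (my own construction, with arbitrary random data): the DTFT of the biased lag estimates (19) at any frequency equals (1/N) X(Ω) X^H(Ω), so one of the two eigenvalues of the estimated CSD matrix is numerically zero.

    import numpy as np

    rng = np.random.default_rng(9)
    N = 512
    X = rng.standard_normal((2, N))                     # two arbitrary real channels

    def biased_xcorr(xm, xmu):
        """Biased estimate (19) for all lags -(N-1) ... N-1; index N-1 is tau = 0."""
        return np.correlate(xm, xmu, mode='full') / N

    taus = np.arange(-(N - 1), N)
    for w in (0.3, 1.1, 2.5):                           # a few arbitrary frequencies Omega
        R_hat = np.empty((2, 2), dtype=complex)
        for m in range(2):
            for mu in range(2):
                r = biased_xcorr(X[m], X[mu])
                R_hat[m, mu] = np.sum(r * np.exp(-1j * w * taus))   # CSD entry at Omega = w
        evals = np.linalg.eigvalsh((R_hat + R_hat.conj().T) / 2)
        print(np.round(evals, 6))                       # one eigenvalue ~ 0: rank one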
Unbiased Estimator
◮ True cross-correlation sequence:
r_mµ[τ] = E{x_m[n] x_µ*[n − τ]} .   (21)
◮ estimation over a window of N samples:
r̂_mµ[τ] = (1/(N − |τ|)) Σ_{n=0}^{N−|τ|−1} x_m[n + τ] x_µ*[n] ,  τ ≥ 0
r̂_mµ[τ] = (1/(N − |τ|)) Σ_{n=0}^{N−|τ|−1} x_m[n] x_µ*[n − τ] ,  τ < 0   (22)
◮ check on bias:
mean{r̂_mµ[τ]} = E{r̂_mµ[τ]}
= (1/(N − |τ|)) Σ_{n=0}^{N−|τ|−1} E{x_m[n + τ] x_µ*[n]}
= (1/(N − |τ|)) Σ_{n=0}^{N−|τ|−1} r_mµ[τ] = r_mµ[τ] .
(written here for τ ≥ 0; the case τ < 0 follows in the same way)
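A sketch of the unbiased estimator (22) with a small Monte-Carlo check of its mean; the two-channel source model below is an arbitrary example for which r_12[−1] = 0.85 can be worked out by hand.

    import numpy as np

    def xcorr_unbiased(xm, xmu, tau):
        """Unbiased estimate of r_mmu[tau] = E{x_m[n] x_mu*[n - tau]}, eq. (22)."""
        N = len(xm)
        if tau >= 0:
            return np.sum(xm[tau:] * np.conj(xmu[:N - tau])) / (N - abs(tau))
        return np.sum(xm[:N + tau] * np.conj(xmu[-tau:])) / (N - abs(tau))

    rng = np.random.default_rng(10)
    N, trials, tau = 100, 5000, -1
    est = np.empty(trials)
    for t in range(trials):
        u = rng.standard_normal(N + 2)               # common white source
        x1 = u[2:] + 0.5 * u[1:-1]                   # arbitrary MA mixing
        x2 = u[1:-1] - 0.3 * u[:-2]
        est[t] = xcorr_unbiased(x1, x2, tau)

    print(est.mean(), est.std())                     # mean ~ 0.85 = true r_12[-1]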
Variance of Estimate I
◮ The variance is given by
var{r̂_mµ[τ]} = E{(r̂_mµ[τ] − r_mµ[τ])(r̂_mµ[τ] − r_mµ[τ])*}
= E{r̂_mµ[τ] r̂_mµ*[τ]} − E{r̂_mµ[τ]} r_mµ*[τ] − r_mµ[τ] E{r̂_mµ*[τ]} + r_mµ[τ] r_mµ*[τ]
= E{r̂_mµ[τ] r̂_mµ*[τ]} − r_mµ[τ] r_mµ*[τ] ;   (23)
◮ awkward: fourth order cumulants;
◮ lucky: for real and complex Gaussian signals, the cumulants of
order three and above are zero (Mendel'91, Schreier'10); example:
E{x_m[n] x_µ*[n − τ] x_m*[n] x_µ[n − τ]} =
E{x_m[n] x_µ*[n − τ]} · E{x_m*[n] x_µ[n − τ]}
+ E{x_m[n] x_m*[n]} · E{x_µ*[n − τ] x_µ[n − τ]}
+ E{x_m[n] x_µ[n − τ]} · E{x_µ*[n − τ] x_m*[n]} .
Variance of Estimate II
◮ Inserting for τ > 0:
var{r̂_mµ[τ]} = (1/(N − |τ|)²) Σ_{n,ν=0}^{N−|τ|−1} [ E{x_m[n+τ] x_µ*[n]} · E{x_m*[ν+τ] x_µ[ν]}
+ E{x_m[n+τ] x_m*[ν+τ]} · E{x_µ*[n] x_µ[ν]}
+ E{x_m[n+τ] x_µ[ν]} · E{x_µ*[n] x_m*[ν+τ]} ] − r_mµ[τ] r_mµ*[τ]
(the first term equals r_mµ[τ] r_mµ*[τ] for every (n, ν) and therefore cancels with the subtracted product)
= (1/(N − |τ|)²) Σ_{n,ν=0}^{N−|τ|−1} [ E{x_m[n] x_m*[ν]} · E{x_µ*[n] x_µ[ν]}
+ E{x_m[n] x_µ[ν − τ]} · E{x_m*[ν] x_µ*[n − τ]} ]   (24)
◮ the same result can be obtained for τ < 0.
Variance of Estimate III
◮ The first term in (24) can be simplified as
Σ_{n,ν=0}^{N−|τ|−1} E{x_m[n] x_m*[ν]} · E{x_µ*[n] x_µ[ν]}
= Σ_{n,ν=0}^{N−|τ|−1} E{x_m[n] x_m*[n − (n − ν)]} · E{x_µ*[n] x_µ[n − (n − ν)]}
= Σ_{n,ν=0}^{N−|τ|−1} r_mm[n − ν] r_µµ*[n − ν]
= Σ_{t=−N+|τ|+1}^{N−|τ|−1} (N − |τ| − |t|) r_mm[t] r_µµ*[t] .
◮ in the last step, the double sum is resolved to a single one.
Variance of Estimate IV
◮ using the complementary cross-correlation sequence
r̄_mµ[τ] = E{x_m[n] x_µ[n − τ]} , the variance of the sample
cross-correlation sequence becomes
var{r̂_mµ[τ]} = (1/(N − |τ|)²) Σ_{t=−N+|τ|+1}^{N−|τ|−1} (N − |τ| − |t|) ·
( r_mm[t] r_µµ*[t] + r̄_mµ[τ + t] r̄_mµ*[τ − t] ) ;   (25)
◮ is this any good? (1) Particularisation to the auto-correlation
sequence matches Kay'91.
◮ (2) If the data is temporally uncorrelated, then for the instantaneous
and real-valued case, (25) simplifies to
var{r̂_mµ[0]} = (1/N) ( r_mm[0] r_µµ[0] + |r_mµ[0]|² ) ,
◮ this is the variance of the Wishart distribution.
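A hedged end-to-end sketch of (25) for the real-valued case (source model, filter taps and sizes are my own arbitrary choices): the predicted variance of the unbiased estimate r̂_12[τ] is compared with a Monte-Carlo ensemble. The "true" correlations are obtained from the source-model relation R[τ] = Σ_n H[n] H^H[n − τ] used earlier.

    import numpy as np

    rng = np.random.default_rng(11)
    h1 = np.array([1.0, 0.5, -0.2])      # arbitrary impulse responses from one white source
    h2 = np.array([0.3, -1.0, 0.4])
    N, tau, trials = 100, 2, 20_000

    # true correlations of the model outputs: r_{m mu}[t] = sum_k h_m[k] h_mu[k - t]
    def corr_seq(a, b):
        c = np.correlate(a, b, mode='full')            # lags -(len(b)-1) ... len(a)-1
        offset = len(b) - 1
        return lambda t: c[t + offset] if -offset <= t <= len(a) - 1 else 0.0

    r11, r22, r12 = corr_seq(h1, h1), corr_seq(h2, h2), corr_seq(h1, h2)

    # eq. (25), real-valued case (complementary correlation = ordinary correlation)
    L = N - abs(tau)
    var_pred = sum((L - abs(t)) * (r11(t) * r22(t) + r12(tau + t) * r12(tau - t))
                   for t in range(-L + 1, L)) / L**2

    # Monte-Carlo ensemble of unbiased estimates r_hat_12[tau]
    P = len(h1)
    est = np.empty(trials)
    for k in range(trials):
        u = rng.standard_normal(N + P)
        x1 = np.convolve(u, h1, mode='full')[P:P + N]  # steady-state output segments
        x2 = np.convolve(u, h2, mode='full')[P:P + N]
        est[k] = np.sum(x1[tau:] * x2[:N - tau]) / L   # unbiased estimate, tau >= 0

    print(var_pred, est.var())                         # prediction vs. measured variance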
Testing of Result – Real Valued Case
◮ Check for N = 100, results over an ensemble of 10⁴
random data instantiations using a fixed source model:
[figure: ensemble results over lags τ = −50 . . . 50, compared against the theoretical prediction (25)]
Testing of Result – Complex Valued Case
[figure: the analogous test for complex-valued data, with results over lags τ = −50 . . . 50]
Application 1: Optimum Support
◮ When estimating R[τ], we have to trade off between
truncation and estimation errors:
[plot (logarithmic scale): error contributions over the support of the estimate]
Loss of Positive Semi-Definiteness
◮ Example for an auto-correlation sequence:
R(z) = A(z) A^P(z) with A(z) = 1 − e^{jπ/4} z^{−1} + j z^{−2}
◮ R(z) is of order 4; assume R̂(z) is truncated to order 2;
◮ evaluation on the unit circle (power spectral density):
[plot: PSD over Ω ∈ [0 ; 2π); the truncated estimate dips below zero for some Ω]
◮ a negative PSD is awkward, but this effect was already noted by Kay & Marple'81.
Application 2: Perturbation of Eigenvalues
◮ CSD matrix R(z) is analytic in z, and we know that
there exists an analytic factorisation R(z) = Q(z) Λ(z) Q^P(z);
◮ the estimate R̂(z, ε) is analytic in z and differentiable in ε, where
ε = 1/N is assumed continuous for N ≫ 1;
◮ on the unit circle, Λ̂(e^{jΩ}, ε) is differentiable for a fixed Ω;
◮ however, Λ̂(e^{jΩ}, ε) is not totally differentiable (Kato'80);
example:
[two plots: eigenvalues over the normalised angular frequency Ω]
Perturbation of Eigenvalues II
◮ The estimation error can be used to check on the binwise
perturbation of eigenvalues of the CSD matrix:
[plot: binwise perturbation of the CSD matrix eigenvalues over Ω ∈ [0 ; 2π)]
Perturbation of Eigenspaces
◮ Binwise subspace correlation mismatch between ground truth and
estimate:
[plot (logarithmic scale): binwise subspace correlation mismatch over Ω ∈ [0 ; 2π)]
Summary
◮ We have considered the estimation of a space-time covariance
matrix;
◮ the variance of the estimator agrees with known results for
auto-correlation sequences (1-d, correlated) and instantaneous
MIMO systems (M-d, uncorrelated);
◮ awkward, and almost forgotten: ˆR[τ] and the estimated PSD are
no longer guaranteed to be positive semi-definite;
◮ the variance of the estimate can be used to predict the
perturbation of eigenvalues (and eigenspaces);
◮ this however only works bin-wise: the eigenvalues are not totally
differentiable in both Ω and 1/N.
Engagement
◮ If interested, please feel free
to try the polynomial matrix
toolbox for Matlab:
pevd-toolbox.eee.strath.ac.uk
◮ I have a 2.5 year postdoc position as part of UDRC3:
dimensionality reduction and processing of high-dim.,
heterogeneous and non-traditional signals; see vacancies at the
University of Strathclyde.

More Related Content

What's hot

Cours continuité et limites
Cours continuité et limitesCours continuité et limites
Cours continuité et limitesYessin Abdelhedi
 
【材料力学】座屈 (II-08 2018)
【材料力学】座屈  (II-08 2018)【材料力学】座屈  (II-08 2018)
【材料力学】座屈 (II-08 2018)Kazuhiro Suga
 
Média e Variância da Distribuição Beta de Probabilidades
Média e Variância da Distribuição Beta de ProbabilidadesMédia e Variância da Distribuição Beta de Probabilidades
Média e Variância da Distribuição Beta de ProbabilidadesAnselmo Alves de Sousa
 
Note on fourier transform of unit step function
Note on fourier transform of unit step functionNote on fourier transform of unit step function
Note on fourier transform of unit step functionAnand Krishnamoorthy
 
80386 microprocessor system instruction
80386 microprocessor system instruction80386 microprocessor system instruction
80386 microprocessor system instructionUmesh Talware
 
20130716 はじパタ3章前半 ベイズの識別規則
20130716 はじパタ3章前半 ベイズの識別規則20130716 はじパタ3章前半 ベイズの識別規則
20130716 はじパタ3章前半 ベイズの識別規則koba cky
 
Bca 2nd sem-u-3.2-basic computer programming and micro programmed control
Bca 2nd sem-u-3.2-basic computer programming and micro programmed controlBca 2nd sem-u-3.2-basic computer programming and micro programmed control
Bca 2nd sem-u-3.2-basic computer programming and micro programmed controlRai University
 
Computer organization memory
Computer organization memoryComputer organization memory
Computer organization memoryDeepak John
 
8086 architecture and pin description
8086 architecture and pin description 8086 architecture and pin description
8086 architecture and pin description Aswini Dharmaraj
 
Arm cm3 architecture_and_programmer_model
Arm cm3 architecture_and_programmer_modelArm cm3 architecture_and_programmer_model
Arm cm3 architecture_and_programmer_modelGanesh Naik
 
I/O system in intel 80386 microcomputer architecture
I/O system in intel 80386 microcomputer architectureI/O system in intel 80386 microcomputer architecture
I/O system in intel 80386 microcomputer architecturekavitha muneeshwaran
 
Kernel Recipes 2019 - Marvels of Memory Auto-configuration (SPD)
Kernel Recipes 2019 - Marvels of Memory Auto-configuration (SPD)Kernel Recipes 2019 - Marvels of Memory Auto-configuration (SPD)
Kernel Recipes 2019 - Marvels of Memory Auto-configuration (SPD)Anne Nicolas
 

What's hot (20)

Cours continuité et limites
Cours continuité et limitesCours continuité et limites
Cours continuité et limites
 
【材料力学】座屈 (II-08 2018)
【材料力学】座屈  (II-08 2018)【材料力学】座屈  (II-08 2018)
【材料力学】座屈 (II-08 2018)
 
Integral table
Integral tableIntegral table
Integral table
 
Lagrange multiplier
 Lagrange multiplier Lagrange multiplier
Lagrange multiplier
 
Addressing Modes
Addressing ModesAddressing Modes
Addressing Modes
 
Média e Variância da Distribuição Beta de Probabilidades
Média e Variância da Distribuição Beta de ProbabilidadesMédia e Variância da Distribuição Beta de Probabilidades
Média e Variância da Distribuição Beta de Probabilidades
 
Usart 8251
Usart 8251Usart 8251
Usart 8251
 
Note on fourier transform of unit step function
Note on fourier transform of unit step functionNote on fourier transform of unit step function
Note on fourier transform of unit step function
 
80386 microprocessor system instruction
80386 microprocessor system instruction80386 microprocessor system instruction
80386 microprocessor system instruction
 
20130716 はじパタ3章前半 ベイズの識別規則
20130716 はじパタ3章前半 ベイズの識別規則20130716 はじパタ3章前半 ベイズの識別規則
20130716 はじパタ3章前半 ベイズの識別規則
 
Bca 2nd sem-u-3.2-basic computer programming and micro programmed control
Bca 2nd sem-u-3.2-basic computer programming and micro programmed controlBca 2nd sem-u-3.2-basic computer programming and micro programmed control
Bca 2nd sem-u-3.2-basic computer programming and micro programmed control
 
Computer organization memory
Computer organization memoryComputer organization memory
Computer organization memory
 
Hw1 solution
Hw1 solutionHw1 solution
Hw1 solution
 
8086 architecture and pin description
8086 architecture and pin description 8086 architecture and pin description
8086 architecture and pin description
 
Memory Organization.pdf
Memory Organization.pdfMemory Organization.pdf
Memory Organization.pdf
 
Arm cm3 architecture_and_programmer_model
Arm cm3 architecture_and_programmer_modelArm cm3 architecture_and_programmer_model
Arm cm3 architecture_and_programmer_model
 
量子情報復習
量子情報復習量子情報復習
量子情報復習
 
Math coprocessor 8087
Math coprocessor 8087Math coprocessor 8087
Math coprocessor 8087
 
I/O system in intel 80386 microcomputer architecture
I/O system in intel 80386 microcomputer architectureI/O system in intel 80386 microcomputer architecture
I/O system in intel 80386 microcomputer architecture
 
Kernel Recipes 2019 - Marvels of Memory Auto-configuration (SPD)
Kernel Recipes 2019 - Marvels of Memory Auto-configuration (SPD)Kernel Recipes 2019 - Marvels of Memory Auto-configuration (SPD)
Kernel Recipes 2019 - Marvels of Memory Auto-configuration (SPD)
 

Similar to Estimating Space-Time Covariance from Finite Sample Sets

Mining group correlations over data streams
Mining group correlations over data streamsMining group correlations over data streams
Mining group correlations over data streamsyuanchung
 
Data Driven Choice of Threshold in Cepstrum Based Spectrum Estimate
Data Driven Choice of Threshold in Cepstrum Based Spectrum EstimateData Driven Choice of Threshold in Cepstrum Based Spectrum Estimate
Data Driven Choice of Threshold in Cepstrum Based Spectrum Estimatesipij
 
Linear regression [Theory and Application (In physics point of view) using py...
Linear regression [Theory and Application (In physics point of view) using py...Linear regression [Theory and Application (In physics point of view) using py...
Linear regression [Theory and Application (In physics point of view) using py...ANIRBANMAJUMDAR18
 
Getting started with chemometric classification
Getting started with chemometric classificationGetting started with chemometric classification
Getting started with chemometric classificationAlex Henderson
 
MLHEP Lectures - day 1, basic track
MLHEP Lectures - day 1, basic trackMLHEP Lectures - day 1, basic track
MLHEP Lectures - day 1, basic trackarogozhnikov
 
Dynamics of structures with uncertainties
Dynamics of structures with uncertaintiesDynamics of structures with uncertainties
Dynamics of structures with uncertaintiesUniversity of Glasgow
 
Tensor Spectral Clustering
Tensor Spectral ClusteringTensor Spectral Clustering
Tensor Spectral ClusteringAustin Benson
 
Intelligent fault diagnosis for power distribution systemcomparative studies
Intelligent fault diagnosis for power distribution systemcomparative studiesIntelligent fault diagnosis for power distribution systemcomparative studies
Intelligent fault diagnosis for power distribution systemcomparative studiesnooriasukmaningtyas
 
Performance of Spiked Population Models for Spectrum Sensing
Performance of Spiked Population Models for Spectrum SensingPerformance of Spiked Population Models for Spectrum Sensing
Performance of Spiked Population Models for Spectrum SensingPolytechnique Montreal
 
Neural Networks: Principal Component Analysis (PCA)
Neural Networks: Principal Component Analysis (PCA)Neural Networks: Principal Component Analysis (PCA)
Neural Networks: Principal Component Analysis (PCA)Mostafa G. M. Mostafa
 
A Non Parametric Estimation Based Underwater Target Classifier
A Non Parametric Estimation Based Underwater Target ClassifierA Non Parametric Estimation Based Underwater Target Classifier
A Non Parametric Estimation Based Underwater Target ClassifierCSCJournals
 
DBSCAN (2014_11_25 06_21_12 UTC)
DBSCAN (2014_11_25 06_21_12 UTC)DBSCAN (2014_11_25 06_21_12 UTC)
DBSCAN (2014_11_25 06_21_12 UTC)Cory Cook
 
The Sample Average Approximation Method for Stochastic Programs with Integer ...
The Sample Average Approximation Method for Stochastic Programs with Integer ...The Sample Average Approximation Method for Stochastic Programs with Integer ...
The Sample Average Approximation Method for Stochastic Programs with Integer ...SSA KPI
 
. An introduction to machine learning and probabilistic ...
. An introduction to machine learning and probabilistic .... An introduction to machine learning and probabilistic ...
. An introduction to machine learning and probabilistic ...butest
 
Consistent Nonparametric Spectrum Estimation Via Cepstrum Thresholding
Consistent Nonparametric Spectrum Estimation Via Cepstrum ThresholdingConsistent Nonparametric Spectrum Estimation Via Cepstrum Thresholding
Consistent Nonparametric Spectrum Estimation Via Cepstrum ThresholdingCSCJournals
 
MVPA with SpaceNet: sparse structured priors
MVPA with SpaceNet: sparse structured priorsMVPA with SpaceNet: sparse structured priors
MVPA with SpaceNet: sparse structured priorsElvis DOHMATOB
 

Similar to Estimating Space-Time Covariance from Finite Sample Sets (20)

Mining group correlations over data streams
Mining group correlations over data streamsMining group correlations over data streams
Mining group correlations over data streams
 
Data Driven Choice of Threshold in Cepstrum Based Spectrum Estimate
Data Driven Choice of Threshold in Cepstrum Based Spectrum EstimateData Driven Choice of Threshold in Cepstrum Based Spectrum Estimate
Data Driven Choice of Threshold in Cepstrum Based Spectrum Estimate
 
Linear regression [Theory and Application (In physics point of view) using py...
Linear regression [Theory and Application (In physics point of view) using py...Linear regression [Theory and Application (In physics point of view) using py...
Linear regression [Theory and Application (In physics point of view) using py...
 
Getting started with chemometric classification
Getting started with chemometric classificationGetting started with chemometric classification
Getting started with chemometric classification
 
MLHEP Lectures - day 1, basic track
MLHEP Lectures - day 1, basic trackMLHEP Lectures - day 1, basic track
MLHEP Lectures - day 1, basic track
 
Dynamics of structures with uncertainties
Dynamics of structures with uncertaintiesDynamics of structures with uncertainties
Dynamics of structures with uncertainties
 
Input analysis
Input analysisInput analysis
Input analysis
 
Tensor Spectral Clustering
Tensor Spectral ClusteringTensor Spectral Clustering
Tensor Spectral Clustering
 
Intelligent fault diagnosis for power distribution systemcomparative studies
Intelligent fault diagnosis for power distribution systemcomparative studiesIntelligent fault diagnosis for power distribution systemcomparative studies
Intelligent fault diagnosis for power distribution systemcomparative studies
 
Performance of Spiked Population Models for Spectrum Sensing
Performance of Spiked Population Models for Spectrum SensingPerformance of Spiked Population Models for Spectrum Sensing
Performance of Spiked Population Models for Spectrum Sensing
 
Neural Networks: Principal Component Analysis (PCA)
Neural Networks: Principal Component Analysis (PCA)Neural Networks: Principal Component Analysis (PCA)
Neural Networks: Principal Component Analysis (PCA)
 
A Non Parametric Estimation Based Underwater Target Classifier
A Non Parametric Estimation Based Underwater Target ClassifierA Non Parametric Estimation Based Underwater Target Classifier
A Non Parametric Estimation Based Underwater Target Classifier
 
DBSCAN (2014_11_25 06_21_12 UTC)
DBSCAN (2014_11_25 06_21_12 UTC)DBSCAN (2014_11_25 06_21_12 UTC)
DBSCAN (2014_11_25 06_21_12 UTC)
 
The Sample Average Approximation Method for Stochastic Programs with Integer ...
The Sample Average Approximation Method for Stochastic Programs with Integer ...The Sample Average Approximation Method for Stochastic Programs with Integer ...
The Sample Average Approximation Method for Stochastic Programs with Integer ...
 
. An introduction to machine learning and probabilistic ...
. An introduction to machine learning and probabilistic .... An introduction to machine learning and probabilistic ...
. An introduction to machine learning and probabilistic ...
 
Consistent Nonparametric Spectrum Estimation Via Cepstrum Thresholding
Consistent Nonparametric Spectrum Estimation Via Cepstrum ThresholdingConsistent Nonparametric Spectrum Estimation Via Cepstrum Thresholding
Consistent Nonparametric Spectrum Estimation Via Cepstrum Thresholding
 
MVPA with SpaceNet: sparse structured priors
MVPA with SpaceNet: sparse structured priorsMVPA with SpaceNet: sparse structured priors
MVPA with SpaceNet: sparse structured priors
 
PhD_defense_Alla
PhD_defense_AllaPhD_defense_Alla
PhD_defense_Alla
 
HPC_NIST_SHA3
HPC_NIST_SHA3HPC_NIST_SHA3
HPC_NIST_SHA3
 
Dycops2019
Dycops2019 Dycops2019
Dycops2019
 

More from Förderverein Technische Fakultät

The Digital Transformation of Education: A Hyper-Disruptive Era through Block...
The Digital Transformation of Education: A Hyper-Disruptive Era through Block...The Digital Transformation of Education: A Hyper-Disruptive Era through Block...
The Digital Transformation of Education: A Hyper-Disruptive Era through Block...Förderverein Technische Fakultät
 
Engineering Serverless Workflow Applications in Federated FaaS.pdf
Engineering Serverless Workflow Applications in Federated FaaS.pdfEngineering Serverless Workflow Applications in Federated FaaS.pdf
Engineering Serverless Workflow Applications in Federated FaaS.pdfFörderverein Technische Fakultät
 
The Role of Machine Learning in Fluid Network Control and Data Planes.pdf
The Role of Machine Learning in Fluid Network Control and Data Planes.pdfThe Role of Machine Learning in Fluid Network Control and Data Planes.pdf
The Role of Machine Learning in Fluid Network Control and Data Planes.pdfFörderverein Technische Fakultät
 
Nonequilibrium Network Dynamics_Inference, Fluctuation-Respones & Tipping Poi...
Nonequilibrium Network Dynamics_Inference, Fluctuation-Respones & Tipping Poi...Nonequilibrium Network Dynamics_Inference, Fluctuation-Respones & Tipping Poi...
Nonequilibrium Network Dynamics_Inference, Fluctuation-Respones & Tipping Poi...Förderverein Technische Fakultät
 
East-west oriented photovoltaic power systems: model, benefits and technical ...
East-west oriented photovoltaic power systems: model, benefits and technical ...East-west oriented photovoltaic power systems: model, benefits and technical ...
East-west oriented photovoltaic power systems: model, benefits and technical ...Förderverein Technische Fakultät
 
Advances in Visual Quality Restoration with Generative Adversarial Networks
Advances in Visual Quality Restoration with Generative Adversarial NetworksAdvances in Visual Quality Restoration with Generative Adversarial Networks
Advances in Visual Quality Restoration with Generative Adversarial NetworksFörderverein Technische Fakultät
 
Industriepraktikum_ Unterstützung bei Projekten in der Automatisierung.pdf
Industriepraktikum_ Unterstützung bei Projekten in der Automatisierung.pdfIndustriepraktikum_ Unterstützung bei Projekten in der Automatisierung.pdf
Industriepraktikum_ Unterstützung bei Projekten in der Automatisierung.pdfFörderverein Technische Fakultät
 

More from Förderverein Technische Fakultät (20)

Supervisory control of business processes
Supervisory control of business processesSupervisory control of business processes
Supervisory control of business processes
 
The Digital Transformation of Education: A Hyper-Disruptive Era through Block...
The Digital Transformation of Education: A Hyper-Disruptive Era through Block...The Digital Transformation of Education: A Hyper-Disruptive Era through Block...
The Digital Transformation of Education: A Hyper-Disruptive Era through Block...
 
A Game of Chess is Like a Swordfight.pdf
A Game of Chess is Like a Swordfight.pdfA Game of Chess is Like a Swordfight.pdf
A Game of Chess is Like a Swordfight.pdf
 
From Mind to Meta.pdf
From Mind to Meta.pdfFrom Mind to Meta.pdf
From Mind to Meta.pdf
 
Miniatures Design for Tabletop Games.pdf
Miniatures Design for Tabletop Games.pdfMiniatures Design for Tabletop Games.pdf
Miniatures Design for Tabletop Games.pdf
 
Distributed Systems in the Post-Moore Era.pptx
Distributed Systems in the Post-Moore Era.pptxDistributed Systems in the Post-Moore Era.pptx
Distributed Systems in the Post-Moore Era.pptx
 
Don't Treat the Symptom, Find the Cause!.pptx
Don't Treat the Symptom, Find the Cause!.pptxDon't Treat the Symptom, Find the Cause!.pptx
Don't Treat the Symptom, Find the Cause!.pptx
 
Engineering Serverless Workflow Applications in Federated FaaS.pdf
Engineering Serverless Workflow Applications in Federated FaaS.pdfEngineering Serverless Workflow Applications in Federated FaaS.pdf
Engineering Serverless Workflow Applications in Federated FaaS.pdf
 
The Role of Machine Learning in Fluid Network Control and Data Planes.pdf
The Role of Machine Learning in Fluid Network Control and Data Planes.pdfThe Role of Machine Learning in Fluid Network Control and Data Planes.pdf
The Role of Machine Learning in Fluid Network Control and Data Planes.pdf
 
Nonequilibrium Network Dynamics_Inference, Fluctuation-Respones & Tipping Poi...
Nonequilibrium Network Dynamics_Inference, Fluctuation-Respones & Tipping Poi...Nonequilibrium Network Dynamics_Inference, Fluctuation-Respones & Tipping Poi...
Nonequilibrium Network Dynamics_Inference, Fluctuation-Respones & Tipping Poi...
 
Towards a data driven identification of teaching patterns.pdf
Towards a data driven identification of teaching patterns.pdfTowards a data driven identification of teaching patterns.pdf
Towards a data driven identification of teaching patterns.pdf
 
Förderverein Technische Fakultät.pptx
Förderverein Technische Fakultät.pptxFörderverein Technische Fakultät.pptx
Förderverein Technische Fakultät.pptx
 
The Computing Continuum.pdf
The Computing Continuum.pdfThe Computing Continuum.pdf
The Computing Continuum.pdf
 
East-west oriented photovoltaic power systems: model, benefits and technical ...
East-west oriented photovoltaic power systems: model, benefits and technical ...East-west oriented photovoltaic power systems: model, benefits and technical ...
East-west oriented photovoltaic power systems: model, benefits and technical ...
 
Machine Learning in Finance via Randomization
Machine Learning in Finance via RandomizationMachine Learning in Finance via Randomization
Machine Learning in Finance via Randomization
 
IT does not stop
IT does not stopIT does not stop
IT does not stop
 
Advances in Visual Quality Restoration with Generative Adversarial Networks
Advances in Visual Quality Restoration with Generative Adversarial NetworksAdvances in Visual Quality Restoration with Generative Adversarial Networks
Advances in Visual Quality Restoration with Generative Adversarial Networks
 
Recent Trends in Personalization at Netflix
Recent Trends in Personalization at NetflixRecent Trends in Personalization at Netflix
Recent Trends in Personalization at Netflix
 
Industriepraktikum_ Unterstützung bei Projekten in der Automatisierung.pdf
Industriepraktikum_ Unterstützung bei Projekten in der Automatisierung.pdfIndustriepraktikum_ Unterstützung bei Projekten in der Automatisierung.pdf
Industriepraktikum_ Unterstützung bei Projekten in der Automatisierung.pdf
 
Introduction to 5G from radio perspective
Introduction to 5G from radio perspectiveIntroduction to 5G from radio perspective
Introduction to 5G from radio perspective
 

Recently uploaded

Collecting & Temporal Analysis of Behavioral Web Data - Tales From The Inside
Collecting & Temporal Analysis of Behavioral Web Data - Tales From The InsideCollecting & Temporal Analysis of Behavioral Web Data - Tales From The Inside
Collecting & Temporal Analysis of Behavioral Web Data - Tales From The InsideStefan Dietze
 
UiPath manufacturing technology benefits and AI overview
UiPath manufacturing technology benefits and AI overviewUiPath manufacturing technology benefits and AI overview
UiPath manufacturing technology benefits and AI overviewDianaGray10
 
TrustArc Webinar - Unified Trust Center for Privacy, Security, Compliance, an...
TrustArc Webinar - Unified Trust Center for Privacy, Security, Compliance, an...TrustArc Webinar - Unified Trust Center for Privacy, Security, Compliance, an...
TrustArc Webinar - Unified Trust Center for Privacy, Security, Compliance, an...TrustArc
 
Where to Learn More About FDO _ Richard at FIDO Alliance.pdf
Where to Learn More About FDO _ Richard at FIDO Alliance.pdfWhere to Learn More About FDO _ Richard at FIDO Alliance.pdf
Where to Learn More About FDO _ Richard at FIDO Alliance.pdfFIDO Alliance
 
The Value of Certifying Products for FDO _ Paul at FIDO Alliance.pdf
The Value of Certifying Products for FDO _ Paul at FIDO Alliance.pdfThe Value of Certifying Products for FDO _ Paul at FIDO Alliance.pdf
The Value of Certifying Products for FDO _ Paul at FIDO Alliance.pdfFIDO Alliance
 
Google I/O Extended 2024 Warsaw
Google I/O Extended 2024 WarsawGoogle I/O Extended 2024 Warsaw
Google I/O Extended 2024 WarsawGDSC PJATK
 
2024 May Patch Tuesday
2024 May Patch Tuesday2024 May Patch Tuesday
2024 May Patch TuesdayIvanti
 
Linux Foundation Edge _ Overview of FDO Software Components _ Randy at Intel.pdf
Linux Foundation Edge _ Overview of FDO Software Components _ Randy at Intel.pdfLinux Foundation Edge _ Overview of FDO Software Components _ Randy at Intel.pdf
Linux Foundation Edge _ Overview of FDO Software Components _ Randy at Intel.pdfFIDO Alliance
 
AI mind or machine power point presentation
AI mind or machine power point presentationAI mind or machine power point presentation
AI mind or machine power point presentationyogeshlabana357357
 
JavaScript Usage Statistics 2024 - The Ultimate Guide
JavaScript Usage Statistics 2024 - The Ultimate GuideJavaScript Usage Statistics 2024 - The Ultimate Guide
JavaScript Usage Statistics 2024 - The Ultimate GuidePixlogix Infotech
 
Secure Zero Touch enabled Edge compute with Dell NativeEdge via FDO _ Brad at...
Secure Zero Touch enabled Edge compute with Dell NativeEdge via FDO _ Brad at...Secure Zero Touch enabled Edge compute with Dell NativeEdge via FDO _ Brad at...
Secure Zero Touch enabled Edge compute with Dell NativeEdge via FDO _ Brad at...FIDO Alliance
 
Observability Concepts EVERY Developer Should Know (DevOpsDays Seattle)
Observability Concepts EVERY Developer Should Know (DevOpsDays Seattle)Observability Concepts EVERY Developer Should Know (DevOpsDays Seattle)
Observability Concepts EVERY Developer Should Know (DevOpsDays Seattle)Paige Cruz
 
The Zero-ETL Approach: Enhancing Data Agility and Insight
The Zero-ETL Approach: Enhancing Data Agility and InsightThe Zero-ETL Approach: Enhancing Data Agility and Insight
The Zero-ETL Approach: Enhancing Data Agility and InsightSafe Software
 
Event-Driven Architecture Masterclass: Engineering a Robust, High-performance...
Event-Driven Architecture Masterclass: Engineering a Robust, High-performance...Event-Driven Architecture Masterclass: Engineering a Robust, High-performance...
Event-Driven Architecture Masterclass: Engineering a Robust, High-performance...ScyllaDB
 
Working together SRE & Platform Engineering
Working together SRE & Platform EngineeringWorking together SRE & Platform Engineering
Working together SRE & Platform EngineeringMarcus Vechiato
 
Hyatt driving innovation and exceptional customer experiences with FIDO passw...
Hyatt driving innovation and exceptional customer experiences with FIDO passw...Hyatt driving innovation and exceptional customer experiences with FIDO passw...
Hyatt driving innovation and exceptional customer experiences with FIDO passw...FIDO Alliance
 
Generative AI Use Cases and Applications.pdf
Generative AI Use Cases and Applications.pdfGenerative AI Use Cases and Applications.pdf
Generative AI Use Cases and Applications.pdfalexjohnson7307
 
Design and Development of a Provenance Capture Platform for Data Science
Design and Development of a Provenance Capture Platform for Data ScienceDesign and Development of a Provenance Capture Platform for Data Science
Design and Development of a Provenance Capture Platform for Data SciencePaolo Missier
 
Introduction to FIDO Authentication and Passkeys.pptx
Introduction to FIDO Authentication and Passkeys.pptxIntroduction to FIDO Authentication and Passkeys.pptx
Introduction to FIDO Authentication and Passkeys.pptxFIDO Alliance
 
“Iamnobody89757” Understanding the Mysterious of Digital Identity.pdf
“Iamnobody89757” Understanding the Mysterious of Digital Identity.pdf“Iamnobody89757” Understanding the Mysterious of Digital Identity.pdf
“Iamnobody89757” Understanding the Mysterious of Digital Identity.pdfMuhammad Subhan
 

Recently uploaded (20)

Collecting & Temporal Analysis of Behavioral Web Data - Tales From The Inside
Collecting & Temporal Analysis of Behavioral Web Data - Tales From The InsideCollecting & Temporal Analysis of Behavioral Web Data - Tales From The Inside
Collecting & Temporal Analysis of Behavioral Web Data - Tales From The Inside
 
UiPath manufacturing technology benefits and AI overview
UiPath manufacturing technology benefits and AI overviewUiPath manufacturing technology benefits and AI overview
UiPath manufacturing technology benefits and AI overview
 
TrustArc Webinar - Unified Trust Center for Privacy, Security, Compliance, an...
TrustArc Webinar - Unified Trust Center for Privacy, Security, Compliance, an...TrustArc Webinar - Unified Trust Center for Privacy, Security, Compliance, an...
TrustArc Webinar - Unified Trust Center for Privacy, Security, Compliance, an...
 
Where to Learn More About FDO _ Richard at FIDO Alliance.pdf
Where to Learn More About FDO _ Richard at FIDO Alliance.pdfWhere to Learn More About FDO _ Richard at FIDO Alliance.pdf
Where to Learn More About FDO _ Richard at FIDO Alliance.pdf
 
The Value of Certifying Products for FDO _ Paul at FIDO Alliance.pdf
The Value of Certifying Products for FDO _ Paul at FIDO Alliance.pdfThe Value of Certifying Products for FDO _ Paul at FIDO Alliance.pdf
The Value of Certifying Products for FDO _ Paul at FIDO Alliance.pdf
 
Google I/O Extended 2024 Warsaw
Google I/O Extended 2024 WarsawGoogle I/O Extended 2024 Warsaw
Google I/O Extended 2024 Warsaw
 
2024 May Patch Tuesday
2024 May Patch Tuesday2024 May Patch Tuesday
2024 May Patch Tuesday
 
Linux Foundation Edge _ Overview of FDO Software Components _ Randy at Intel.pdf
Linux Foundation Edge _ Overview of FDO Software Components _ Randy at Intel.pdfLinux Foundation Edge _ Overview of FDO Software Components _ Randy at Intel.pdf
Linux Foundation Edge _ Overview of FDO Software Components _ Randy at Intel.pdf
 
AI mind or machine power point presentation
AI mind or machine power point presentationAI mind or machine power point presentation
AI mind or machine power point presentation
 
JavaScript Usage Statistics 2024 - The Ultimate Guide
JavaScript Usage Statistics 2024 - The Ultimate GuideJavaScript Usage Statistics 2024 - The Ultimate Guide
JavaScript Usage Statistics 2024 - The Ultimate Guide
 
Secure Zero Touch enabled Edge compute with Dell NativeEdge via FDO _ Brad at...
Secure Zero Touch enabled Edge compute with Dell NativeEdge via FDO _ Brad at...Secure Zero Touch enabled Edge compute with Dell NativeEdge via FDO _ Brad at...
Secure Zero Touch enabled Edge compute with Dell NativeEdge via FDO _ Brad at...
 
Observability Concepts EVERY Developer Should Know (DevOpsDays Seattle)
Observability Concepts EVERY Developer Should Know (DevOpsDays Seattle)Observability Concepts EVERY Developer Should Know (DevOpsDays Seattle)
Observability Concepts EVERY Developer Should Know (DevOpsDays Seattle)
 
The Zero-ETL Approach: Enhancing Data Agility and Insight
The Zero-ETL Approach: Enhancing Data Agility and InsightThe Zero-ETL Approach: Enhancing Data Agility and Insight
The Zero-ETL Approach: Enhancing Data Agility and Insight
 
Event-Driven Architecture Masterclass: Engineering a Robust, High-performance...
Event-Driven Architecture Masterclass: Engineering a Robust, High-performance...Event-Driven Architecture Masterclass: Engineering a Robust, High-performance...
Event-Driven Architecture Masterclass: Engineering a Robust, High-performance...
 
Working together SRE & Platform Engineering
Working together SRE & Platform EngineeringWorking together SRE & Platform Engineering
Working together SRE & Platform Engineering
 
Hyatt driving innovation and exceptional customer experiences with FIDO passw...
Hyatt driving innovation and exceptional customer experiences with FIDO passw...Hyatt driving innovation and exceptional customer experiences with FIDO passw...
Hyatt driving innovation and exceptional customer experiences with FIDO passw...
 
Generative AI Use Cases and Applications.pdf
Generative AI Use Cases and Applications.pdfGenerative AI Use Cases and Applications.pdf
Generative AI Use Cases and Applications.pdf
 
Design and Development of a Provenance Capture Platform for Data Science
Design and Development of a Provenance Capture Platform for Data ScienceDesign and Development of a Provenance Capture Platform for Data Science
Design and Development of a Provenance Capture Platform for Data Science
 
Introduction to FIDO Authentication and Passkeys.pptx
Introduction to FIDO Authentication and Passkeys.pptxIntroduction to FIDO Authentication and Passkeys.pptx
Introduction to FIDO Authentication and Passkeys.pptx
 
“Iamnobody89757” Understanding the Mysterious of Digital Identity.pdf
“Iamnobody89757” Understanding the Mysterious of Digital Identity.pdf“Iamnobody89757” Understanding the Mysterious of Digital Identity.pdf
“Iamnobody89757” Understanding the Mysterious of Digital Identity.pdf
 

Estimating Space-Time Covariance from Finite Sample Sets

  • 1. Estimating Space-Time Covariance from Finite Sample Sets Stephan Weiss Centre for Signal & Image Processing Department of Electonic & Electrical Engineering University of Strathclyde, Glasgow, Scotland, UK TeWi Seminar, Alpen Adria University, 22 May 2019 Thanks to: I.K. Proudler, J. Pestana, F. Coutts, C. Delaosa This work is supported by the Physical Sciences Research Council (EPSRC) Grant num- ber EP/S000631/1 and the MOD University Defence Research Collaboration in Signal Processing. 1 / 39
  • 2. Overview Stats ACS Exam ST Sample Cross-Correlation Apps Concl Engage Presentation Overview 1. Overview; 2. a reminder of statistics background; 3. a reminder on auto- and cross-correlation sequences; 4. mid-talk exam; 5. sample sapce-time covariance matrix; 6. cross-correlation estimation; 7. some results and comparisons; 8. applications: support estimation and eigenvalue perturbation; 9. summary; and 10. a shameless last slide. 2 / 39
  • 3. Overview Stats ACS Exam ST Sample Cross-Correlation Apps Concl Engage Random Signals/ Stochastic Processes A stochastic process x[n] is characterised by deterministic measures: ◮ the probability density function (PDF), or normalised histogram, p(x): p(x) ≥ 0 ∀ x and ∞ −∞ p(x)dx = 1 ◮ the PDF’s moments of order l: ∞ −∞ xl p(x)dx ◮ specifically, note that the first moment l = 1 is the mean µ, and that the second moment l = 2 is variance σ2 if µ = 0; ◮ the autocorrelation function of the process x[n]. 3 / 39
  • 4. Overview Stats ACS Exam ST Sample Cross-Correlation Apps Concl Engage Probability Density Function ◮ Random data can be characterised by its distribution of amplitude values: −3−2−10123 0 0.2 0.4 0.6 0.8 (x)d x 0 10 20 30 40 50 60 70 80 90 100 −3 −2 −1 0 1 2 3 time index n x[n] ◮ the PDF describes with which probability P amplitude values of x[n] will fall within a specific interval [x1 ; x2]: P(x ∈ [x1 ; x2]) = x2 x1 p(x)dx ◮ a histogram of the data can be used to estimate the PDF . . . 4 / 39
  • 5. Overview Stats ACS Exam ST Sample Cross-Correlation Apps Concl Engage Probability Density Function Estimation ◮ Histogram estimation based on 103 samples: −4 −3 −2 −1 0 1 2 3 4 0 50 rel.freq. sample values x ◮ histogram based on 104 samples: −4 −3 −2 −1 0 1 2 3 4 0 500 rel.freq. sample values x ◮ histogram based on 105 samples: −4 −3 −2 −1 0 1 2 3 4 0 500 rel.freq. sample values x ◮ for consistent estimates, we need as much data as possible! 5 / 39
  • 6. Overview Stats ACS Exam ST Sample Cross-Correlation Apps Concl Engage Gaussian or Normal Distribution ◮ For the Gaussian or normal PDF, x ∈ N(µ, σ2): p(x) = 1 √ 2πσ e −(x−µ)2 2σ2 (1) ◮ mean is µ, variance is σ2; ◮ sketch for x ∈ N(0, 1): −3 −2 −1 0 1 2 3 0 0.2 0.4 0.6 0.8 p(x) x ◮ central limit theorem: the sum of arbitrarily distributed processes converges to a Gaussian PDF; 6 / 39
  • 7. Overview Stats ACS Exam ST Sample Cross-Correlation Apps Concl Engage Uniform Distribution ◮ A uniform distribution has equal probability of amplitude values within a specified interval; ◮ e.g. x= rand() in Matlab produces samples x ∈ [0 ; 1] with the following PDF: −1 −0.5 0 0.5 1 1.5 2 0 0.5 1 p(x) x ◮ mean and variance are µ = ∞ −∞ xp(x)dx = 1 0 xdx = 1 2 x2 1 0 = 1 2 (2) σ2 = 1 x2 dx − µ2 = 1 x3 1 − 1 = 1 (3) 7 / 39
  • 8. Overview Stats ACS Exam ST Sample Cross-Correlation Apps Concl Engage Other PDFs ◮ PDF of a binary phase shift keying (BPSK) symbol sequence, which is a type of Bernoulli distribution: x p(x) -1 1 1 2 1 2 ◮ PDFs for complex valued signals also exist; ◮ example for the PDF of a quaternary phase shift keying (QPSK) sequence: ℜ{x} p(x) -1 1 1 4 1 4 1 4 1 4 −j ℑ{x} j 8 / 39
  • 9. Overview Stats ACS Exam ST Sample Cross-Correlation Apps Concl Engage Complex Gaussian Distribution ◮ PDF of a complex Gaussian process with independent and identically distributed (IID) real and imaginary parts: −3 −2 −1 0 1 2 3 −3 −2 −1 0 1 2 3 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 ℜ{x} ℑ{x} p(x) ◮ this leads to a circularly-symmetric PDF. 9 / 39
  • 10. Overview Stats ACS Exam ST Sample Cross-Correlation Apps Concl Engage Central Limit Theorem ◮ Theorem: adding arbitarily distributed but independent signals will, in the limit, tend towards a Gaussian distribution; ◮ example: y[n] = h[n] ∗ x[n], with x[n] a sequence of independent BPSK symbols: −1 −0.5 0 0.5 1 0 2 4 6 x 10 4 x rel.freq. −1 −0.5 0 0.5 1 0 2000 4000 6000 8000 y rel.freq. h[n] x[n] y[n] 0 5 10 15 20 −0.1 −0.05 0 0.05 0.1 index n h[n] ◮ the filter sums differently weighted independent random processes, and it does not take many to make the output look Gaussian! 10 / 39
  • 11. Overview Stats ACS Exam ST Sample Cross-Correlation Apps Concl Engage Stationarity and Ergodicity ◮ Stationarity means that the statistical moments of a random process do not change over time; ◮ a weaker condition is wide-sense stationarity (WSS), i.e. moments up to second order (mean and variance) are constant over time; this is sufficient unless higher order statistics (HOS) algorithms are deployed; ◮ a stochastic process is ergodic if the expectation operation can be replaced by a temporal average, σ2 xx = ∞ −∞ x2 p(x)dx = E{x[n]x∗ [n]} = lim N→∞ 1 N N−1 n=0 |x[n]|2 (4) ◮ remember: expectation is an average over an ensemble; a temporal average is performed over a single ensemble probe! 11 / 39
  • 12. Overview Stats ACS Exam ST Sample Cross-Correlation Apps Concl Engage Sample Size Matters! ◮ When estimating quantities such as PDF, mean or variance, the estimator should be bias-free, i.e. converge towards the desired value; ◮ consistency refers to the variability of the estimator around the asymptotic value; ◮ the more samples, the better the consistency of the estimate; ◮ mean ˆµ and variance ˆσ2 of a uniformly distributed signal: ˆσ2ˆµ 10 1 10 2 10 3 10 4 10 5 0.2 0.4 0.6 0.8 10 1 10 2 10 3 10 4 10 5 0.04 0.06 0.08 0.1 0.12 0.14 12 / 39
  • 13. Overview Stats ACS Exam ST Sample Cross-Correlation Apps Concl Engage Moving Average (MA) Model / Signal ◮ The PDF does not contain any information on how “correlated” successive samples are; ◮ consider the following scenario with x[n] ∈ N(0, σ2 xx) being uncorrelated (successive samples are entirely random): ✲ b[n] ✲ x[n] y[n] = x[n] ∗ b[n] N(0, σ2 xx) N(0, σ2 yy) ◮ y[n] is called a moving average process (and b[n] an MA model) of order N − 1 if y[n] = N−1 ν=0 b[ν]x[n − ν] is a weighted average over a window of N input samples. 13 / 39
  • 14. Overview Stats ACS Exam ST Sample Cross-Correlation Apps Concl Engage Filtering a Random Signal ◮ Consider lowpass filtering an uncorrelated Gaussian signal x[n]: ✲ h[n] ✲ x[n] y[n] = x[n] ∗ h[n] N(0, σ2 x) N(0, σ2 y) 0 50 100 150 −2 −1.5 −1 −0.5 0 0.5 1 1.5 2 2.5 3 time n x[n] 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1 0 0.2 0.4 0.6 0.8 1 1.2 1.4 norm. angular freq. Ω/π |H(ejΩ )| 0 50 100 150 −0.5 −0.4 −0.3 −0.2 −0.1 0 0.1 0.2 0.3 0.4 0.5 time n y[n] ◮ the output will have Gaussian distribution, but the signal only changes smoothly: neighbouring samples are correlated. We need a measure! 14 / 39
  • 15. Overview Stats ACS Exam ST Sample Cross-Correlation Apps Concl Engage Auto-Correlation Function I ◮ The correlation between a sample x[n] and a neighbouring value x[n − τ] is given by rxx[τ] = E{x[n] · x∗ [n − τ]} = lim N→∞ 1 N N−1 n=0 x[n] · x∗ [n − τ] (5) ◮ For two specific specific lags τ = −3 (left) and τ = −50 (right), consider: 0 50 100 150 −0.6 −0.4 −0.2 0 0.2 0.4 0.6 time n x[n],x[n+3] 0 50 100 150 −0.6 −0.4 −0.2 0 0.2 0.4 0.6 time n x[n],x[n+50] ◮ the curves on the left look “similar”, the ones on the right “dissimilar”. 15 / 39
  • 16. Overview Stats ACS Exam ST Sample Cross-Correlation Apps Concl Engage Auto-Correlation Function II ◮ For lag zero, note: rxx[0] = lim N→∞ 1 N N−1 n=0 x[n] · x∗ [n] = σ2 x + µ2 x (6) ◮ This value for τ = 0 is the maximum of the auto-correlation function rxx[τ]; xxr [τ] τ ◮ large values in the ACF indicate strong correlation, small values weak correlation; 16 / 39
  • 17. Overview Stats ACS Exam ST Sample Cross-Correlation Apps Concl Engage Auto-Correlation Function III ◮ If a signal has no self-similarity, i.e. it is “completely random”, the ACF takes the following form: xxr [τ] τ ◮ If we take the Fourier transform of rxx[τ], we obtain a flat spectrum (or a lowpass spectrum for the ACF on slide 16); ◮ due to the presence of all frequency components in a flat spectrum, a completely random signal is often referred to as “white noise”. 17 / 39
  • 18. Overview Stats ACS Exam ST Sample Cross-Correlation Apps Concl Engage Power Spectral Density ◮ The power spectral density (PSD), Rxx(ejΩ), defines the spectrum of a random signal: Rxx(ejΩ ) = ∞ τ=−∞ rxx[τ] e−jΩτ (7) ◮ PSD and ACF form a Fourier pair, rxx[τ] ◦—• Rxx(ejΩ), therefore rxx[τ] = 1 2π π −π Rxx(ejΩ ) ejΩτ dΩ (8) ◮ note that the power of x[n] is (similar to Parseval) rxx[0] = 1 2π π −π Rxx(ejΩ ) dΩ (= scaled area under PSD) (9) 18 / 39
  • 19. Overview Stats ACS Exam ST Sample Cross-Correlation Apps Concl Engage Mid-Talk “Exam” ◮ We are given a unit variance, zero mean (µ = 0) signal x[n]; ◮ we want to estimate the mean, ˆµ; ◮ Question 1: how does the sample size affect the estimation error |µ − ˆµ|2? ◮ Question 2: does it matter whether x[n] has a lowpass or highpass characteristic? 19 / 39
  • 20. Overview Stats ACS Exam ST Sample Cross-Correlation Apps Concl Engage Mean Estimation ◮ Our estimator is simple: ˆµ = 1 N N−1 n=0 x[n] ; ◮ the mean of this estimator: mean{ˆµ} = E{ˆµ} = 1 N N−1 n=0 E{x[n]} = 1 N N−1 n=0 µ = µ ◮ hurray — the estimator is unbiased; ◮ for the error, we look towards the variance of the estimator: var{ˆµ} = E |ˆµ − µ|2 ◮ this is going to be a bit trickier . . . 20 / 39
• 21. Variance of Mean Estimator
◮ tedious but hopefully rewarding:
  $\mathrm{var}\{\hat\mu\} = E\{(\hat\mu - \mu)(\hat\mu - \mu)^*\}$   (10)
  $= E\{\hat\mu\hat\mu^*\} - E\{\hat\mu\}\,\mu^* - \mu\, E\{\hat\mu^*\} + \mu\mu^*$   (11)
  $= E\Big\{\frac{1}{N^2} \sum_{n=0}^{N-1} x[n] \sum_{\nu=0}^{N-1} x^*[\nu]\Big\} - \mu\mu^*$   (12)
  $= \frac{1}{N^2} \sum_{n=0}^{N-1} \sum_{m=n-N+1}^{n} E\{x[n]\, x^*[n-m]\} - \mu\mu^*$   (13)
  $= \frac{1}{N^2} \sum_{n=0}^{N-1} \sum_{m=n-N+1}^{n} r_{xx}[m] - \mu\mu^*$   (14)
  $= \frac{1}{N^2} \sum_{\tau=-N+1}^{N-1} (N - |\tau|)\, r_{xx}[\tau] - \mu\mu^*$   (15)
◮ so, here are the answers!
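As a check, not from the slides: a small NumPy sketch that compares the closed form (15) with a Monte Carlo estimate of var{µ̂}, using an assumed MA colouring filter b (normalised so the signal has unit variance, as in the "exam" setup). Since this filter is lowpass, the resulting variance exceeds the white-noise value 1/N, which answers Question 2.

```python
import numpy as np

rng = np.random.default_rng(3)
N, trials = 64, 20_000
b = np.array([1.0, 0.9, 0.7, 0.4])            # assumed MA colouring filter (lowpass-like)
b = b / np.linalg.norm(b)                     # unit output variance
K = len(b)

r_full = np.correlate(b, b, mode="full")      # ACF r_yy[t] of y = b * x, t = -(K-1) .. K-1

taus = np.arange(-N + 1, N)
r_yy = np.array([r_full[K - 1 + t] if abs(t) < K else 0.0 for t in taus])
var_theory = np.sum((N - np.abs(taus)) * r_yy) / N**2    # eq. (15) with mu = 0

# Monte Carlo: sample mean of N correlated samples, repeated over many trials
mu_hat = np.array([np.mean(np.convolve(rng.standard_normal(N + K - 1), b, mode="valid"))
                   for _ in range(trials)])
print("eq. (15)   :", var_theory)
print("Monte Carlo:", mu_hat.var())
print("white-noise reference 1/N:", 1 / N)
```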
• 22. Space-Time Covariance Matrix
◮ Measurements obtained from $M$ sensors are collected in a vector $\mathbf{x}[n] \in \mathbb{C}^M$:
  $\mathbf{x}^{\mathrm T}[n] = [x_1[n]\; x_2[n]\; \ldots\; x_M[n]]$ ;   (16)
◮ with the expectation operator $E\{\cdot\}$, the spatial correlation is captured by $\mathbf{R} = E\{\mathbf{x}[n]\,\mathbf{x}^{\mathrm H}[n]\}$ ;
◮ for spatial and temporal correlation, we require a space-time covariance matrix
  $\mathbf{R}[\tau] = E\{\mathbf{x}[n]\,\mathbf{x}^{\mathrm H}[n-\tau]\}$   (17)
◮ this space-time covariance matrix contains auto- and cross-correlation terms, e.g. for $M = 2$
  $\mathbf{R}[\tau] = \begin{bmatrix} E\{x_1[n]x_1^*[n-\tau]\} & E\{x_1[n]x_2^*[n-\tau]\} \\ E\{x_2[n]x_1^*[n-\tau]\} & E\{x_2[n]x_2^*[n-\tau]\} \end{bmatrix}$   (18)
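An illustrative sketch only (not from the talk) of a sample version of (17) for an M-channel record; the two toy channels and the 3-sample delay are assumed values.

```python
import numpy as np

def space_time_cov(X, tau):
    """Sample estimate of R[tau] = E{ x[n] x^H[n - tau] } from X of shape (M, N)."""
    M, N = X.shape
    if tau >= 0:
        A, B = X[:, tau:], X[:, :N - tau]      # pairs (x[n], x[n - tau])
    else:
        A, B = X[:, :N + tau], X[:, -tau:]
    return A @ B.conj().T / A.shape[1]         # average over the N - |tau| available products

# toy data: M = 2 channels, the second a 3-sample delayed copy of the first plus noise
rng = np.random.default_rng(4)
N = 5_000
s = rng.standard_normal(N)
X = np.vstack([s, np.roll(s, 3) + 0.1 * rng.standard_normal(N)])

print(np.round(space_time_cov(X, 0), 2))       # instantaneous covariance R[0]
print(np.round(space_time_cov(X, 3), 2))       # the (2,1) entry picks up the 3-sample delay
```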
• 23. Cross-Spectral Density Matrix
◮ example for a space-time covariance matrix $\mathbf{R}[\tau] \in \mathbb{R}^{2\times 2}$:
◮ [figure: the four entries of $\mathbf{R}[\tau]$, each plotted over the lag $\tau$, arranged as a $2\times 2$ grid]
◮ the cross-spectral density (CSD) matrix: $\mathbf{R}(z)$ ◦—• $\mathbf{R}[\tau]$.
• 24. Exact Space-Time Covariance Matrix
◮ We assume knowledge of a source model that ties the measurement vector $\mathbf{x}[n]$ to mutually independent, uncorrelated unit variance signals $u_\ell[n]$:
◮ [diagram: the innovation signals $u_1[n], \ldots, u_L[n]$ drive a convolutive mixing system $\mathbf{H}[n]$, whose outputs are the measurements $x_1[n], \ldots, x_M[n]$]
◮ then the space-time covariance matrix is
  $\mathbf{R}[\tau] = \sum_n \mathbf{H}[n]\,\mathbf{H}^{\mathrm H}[n-\tau]$ ,
◮ or for the CSD matrix: $\mathbf{R}(z) = \mathbf{H}(z)\,\mathbf{H}^{\mathrm P}(z)$ .
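Not from the slides: a sketch of the ground-truth R[τ] computed directly from the taps of a source model; the 2×2, three-tap system H[n] below is an assumed toy example, not the model used in the talk.

```python
import numpy as np

def ground_truth_cov(H):
    """R[tau] = sum_n H[n] H^H[n - tau] for H of shape (T, M, L) holding taps H[0..T-1].
       Returns an array of shape (2*T - 1, M, M) for lags tau = -(T-1) .. (T-1)."""
    T, M, L = H.shape
    R = np.zeros((2 * T - 1, M, M), dtype=complex)
    for tau in range(-(T - 1), T):
        for n in range(T):
            if 0 <= n - tau < T:
                R[tau + T - 1] += H[n] @ H[n - tau].conj().T
    return R

# toy convolutive mixing system: M = 2 sensors, L = 2 sources, 3 taps (assumed values)
H = np.array([[[1.0, 0.5], [0.0, 1.0]],
              [[0.3, 0.0], [0.7, 0.2]],
              [[0.0, 0.1], [0.2, 0.0]]])
R = ground_truth_cov(H)
print(R[len(H) - 1].real)       # R[0], the instantaneous covariance
print(R[len(H)].real)           # R[1]
```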
• 25. Biased Estimator
◮ To estimate from finite data, e.g.
  $\hat r^{\mathrm{(biased)}}_{m\mu}[\tau] = \begin{cases} \frac{1}{N} \sum_{n=0}^{N-\tau-1} x_m[n+\tau]\, x_\mu^*[n] , & \tau \ge 0 ; \\ \frac{1}{N} \sum_{n=0}^{N+\tau-1} x_m[n]\, x_\mu^*[n-\tau] , & \tau < 0 . \end{cases}$   (19)
◮ or $\hat R^{\mathrm{(biased)}}_{m\mu}(z) = \frac{1}{N}\, X_m(z)\, X_\mu^*(z^{-1}) = \frac{1}{N}\, X_m(z)\, X_\mu^{\mathrm P}(z)$ ;
◮ for the CSD matrix:
  $\hat{\mathbf R}^{\mathrm{(biased)}}(z) = \frac{1}{N}\, \mathbf{x}(z)\, \mathbf{x}^{\mathrm P}(z)$ .   (20)
◮ this is a rank-one matrix by definition!
• 26. Unbiased Estimator
◮ True cross-correlation sequence:
  $r_{m\mu}[\tau] = E\{x_m[n]\, x_\mu^*[n-\tau]\}$ .   (21)
◮ estimation over a window of $N$ samples:
  $\hat r_{m\mu}[\tau] = \begin{cases} \frac{1}{N-|\tau|} \sum_{n=0}^{N-|\tau|-1} x_m[n+\tau]\, x_\mu^*[n] , & \tau \ge 0 \\ \frac{1}{N-|\tau|} \sum_{n=0}^{N-|\tau|-1} x_m[n]\, x_\mu^*[n-\tau] , & \tau < 0 \end{cases}$   (22)
◮ check on bias:
  $\mathrm{mean}\{\hat r_{m\mu}[\tau]\} = E\{\hat r_{m\mu}[\tau]\} = \frac{1}{N-|\tau|} \sum_{n=0}^{N-|\tau|-1} E\{x_m[n]\, x_\mu^*[n-\tau]\} = \frac{1}{N-|\tau|} \sum_{n=0}^{N-|\tau|-1} r_{m\mu}[\tau] = r_{m\mu}[\tau]$ .
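As an illustration only, not part of the slides: a NumPy sketch of the unbiased estimator (22), with the biased variant (19) selectable for comparison; the toy channel pair and the 5-sample delay are assumed values.

```python
import numpy as np

def xcorr_est(xm, xmu, tau, biased=False):
    """Sample cross-correlation estimate of r_m,mu[tau] = E{ x_m[n] x_mu*[n - tau] }:
       eq. (22) if biased=False, eq. (19) if biased=True."""
    N = len(xm)
    if tau >= 0:
        prods = xm[tau:] * np.conj(xmu[:N - tau])
    else:
        prods = xm[:N + tau] * np.conj(xmu[-tau:])
    return prods.sum() / (N if biased else N - abs(tau))

# toy check: x2 is x1 delayed by 5 samples, so r_12[tau] peaks at tau = -5
rng = np.random.default_rng(5)
x1 = rng.standard_normal(2_000)
x2 = np.roll(x1, 5)
print("unbiased r_12[-5] ~", xcorr_est(x1, x2, -5))
print("biased   r_12[-5] ~", xcorr_est(x1, x2, -5, biased=True))   # smaller by a factor (N-5)/N
```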
• 27. Variance of Estimate I
◮ The variance is given by
  $\mathrm{var}\{\hat r_{m\mu}[\tau]\} = E\{(\hat r_{m\mu}[\tau] - r_{m\mu}[\tau])(\hat r_{m\mu}[\tau] - r_{m\mu}[\tau])^*\}$
  $= E\{\hat r_{m\mu}[\tau]\, \hat r_{m\mu}^*[\tau]\} - E\{\hat r_{m\mu}[\tau]\}\, r_{m\mu}^*[\tau] - r_{m\mu}[\tau]\, E\{\hat r_{m\mu}^*[\tau]\} + r_{m\mu}[\tau]\, r_{m\mu}^*[\tau]$
  $= E\{\hat r_{m\mu}[\tau]\, \hat r_{m\mu}^*[\tau]\} - r_{m\mu}[\tau]\, r_{m\mu}^*[\tau]$ ;   (23)
◮ awkward: fourth order cumulants;
◮ lucky: for real and complex Gaussian signals, the cumulants of order three and above are zero (Mendel'91, Schreier'10); example:
  $E\{x_m[n]\, x_\mu^*[n-\tau]\, x_m^*[n]\, x_\mu[n-\tau]\} = E\{x_m[n] x_\mu^*[n-\tau]\} \cdot E\{x_m^*[n] x_\mu[n-\tau]\} + E\{x_m[n] x_m^*[n]\} \cdot E\{x_\mu^*[n-\tau] x_\mu[n-\tau]\} + E\{x_m[n] x_\mu[n-\tau]\} \cdot E\{x_\mu^*[n-\tau] x_m^*[n]\}$ .
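Not from the slides: a quick Monte Carlo sanity check of this moment-factoring step, with a zero-mean, jointly Gaussian (and deliberately improper) pair (a, b) standing in for x_m[n] and x_µ[n−τ]; the mixing coefficients are assumed values.

```python
import numpy as np

rng = np.random.default_rng(6)
K = 1_000_000
g1 = (rng.standard_normal(K) + 1j * rng.standard_normal(K)) / np.sqrt(2)
g2 = (rng.standard_normal(K) + 1j * rng.standard_normal(K)) / np.sqrt(2)

# jointly Gaussian, correlated and deliberately improper pair (a, b)
a = g1
b = 0.8 * g1 + 0.5 * g2 + 0.3 * np.conj(g1)

E = np.mean
lhs = E(a * np.conj(b) * np.conj(a) * b)
rhs = (E(a * np.conj(b)) * E(np.conj(a) * b)
       + E(a * np.conj(a)) * E(np.conj(b) * b)
       + E(a * b) * E(np.conj(a) * np.conj(b)))
print("4th moment (sample)      :", lhs.real)
print("sum of pairwise products :", rhs.real)   # agree up to Monte Carlo error
```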
• 28. Variance of Estimate II
◮ Inserting for $\tau > 0$:
  $\mathrm{var}\{\hat r_{m\mu}[\tau]\} = \frac{1}{(N-|\tau|)^2} \sum_{n,\nu=0}^{N-|\tau|-1} \Big( E\{x_m[n+\tau] x_\mu^*[n]\} \cdot E\{x_m^*[\nu+\tau] x_\mu[\nu]\} + E\{x_m[n+\tau] x_m^*[\nu+\tau]\}\, E\{x_\mu^*[n] x_\mu[\nu]\} + E\{x_m[n+\tau] x_\mu[\nu]\}\, E\{x_\mu^*[n] x_m^*[\nu+\tau]\} \Big) - r_{m\mu}[\tau]\, r_{m\mu}^*[\tau]$
  $= \frac{1}{(N-|\tau|)^2} \sum_{n,\nu=0}^{N-|\tau|-1} \Big( E\{x_m[n] x_m^*[\nu]\} \cdot E\{x_\mu^*[n] x_\mu[\nu]\} + E\{x_m[n] x_\mu[\nu-\tau]\}\, E\{x_m^*[\nu] x_\mu^*[n-\tau]\} \Big)$   (24)
◮ the same result can be obtained for $\tau < 0$.
• 29. Variance of Estimate III
◮ The first term in (24) can be simplified as
  $\sum_{n,\nu=0}^{N-|\tau|-1} E\{x_m[n] x_m^*[\nu]\}\, E\{x_\mu^*[n] x_\mu[\nu]\} = \sum_{n,\nu=0}^{N-|\tau|-1} E\{x_m[n]\, x_m^*[n-(n-\nu)]\} \cdot E\{x_\mu^*[n]\, x_\mu[n-(n-\nu)]\}$
  $= \sum_{n,\nu=0}^{N-|\tau|-1} r_{mm}[n-\nu]\, r_{\mu\mu}^*[n-\nu] = \sum_{t=-N+|\tau|+1}^{N-|\tau|-1} (N-|\tau|-|t|)\, r_{mm}[t]\, r_{\mu\mu}^*[t]$ .
◮ in the last step, the double sum is resolved into a single one.
• 30. Variance of Estimate IV
◮ Using the complementary cross-correlation sequence $\bar r_{m\mu}[\tau] = E\{x_m[n]\, x_\mu[n-\tau]\}$, the variance of the sample cross-correlation sequence becomes
  $\mathrm{var}\{\hat r_{m\mu}[\tau]\} = \frac{1}{(N-|\tau|)^2} \sum_{t=-N+|\tau|+1}^{N-|\tau|-1} (N-|\tau|-|t|) \big( r_{mm}[t]\, r_{\mu\mu}^*[t] + \bar r_{m\mu}[\tau+t]\, \bar r_{m\mu}^*[\tau-t] \big)$ ;   (25)
◮ is this any good? (1) particularisation to the auto-correlation sequences matches Kay'91;
◮ (2) if the data is temporally uncorrelated, then for the instantaneous and real case, (25) simplifies to
  $\mathrm{var}\{\hat r_{m\mu}[0]\} = \frac{1}{N} \big( r_{mm}[0]\, r_{\mu\mu}[0] + |r_{m\mu}[0]|^2 \big)$ ,
◮ which is the variance of the Wishart distribution.
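Again only a sketch, not part of the talk: eq. (25) written as a small function, with the auto- and (complementary) cross-correlation sequences supplied as callables. The two print-outs reproduce the white-data special cases that follow from the simplification above: 1/N for two uncorrelated real unit-variance channels, and 2/N when m = µ for a real white signal.

```python
import numpy as np

def xcorr_var(r_mm, r_uu, rbar_mu, tau, N):
    """Predicted variance of the unbiased sample cross-correlation, eq. (25).
       r_mm, r_uu, rbar_mu are callables returning r_mm[t], r_mumu[t] and rbar_mmu[t]."""
    t = np.arange(-(N - abs(tau) - 1), N - abs(tau))
    w = N - abs(tau) - np.abs(t)
    terms = np.array([r_mm(ti) * np.conj(r_uu(ti))
                      + rbar_mu(tau + ti) * np.conj(rbar_mu(tau - ti)) for ti in t])
    return np.real(np.sum(w * terms)) / (N - abs(tau)) ** 2

# special cases for real, white, unit-variance data and tau = 0:
delta = lambda t: 1.0 if t == 0 else 0.0       # ACF of a unit-variance white signal
zero  = lambda t: 0.0                          # no (complementary) cross-correlation

print(xcorr_var(delta, delta, zero,  tau=0, N=100))   # two uncorrelated channels: 1/N = 0.01
print(xcorr_var(delta, delta, delta, tau=0, N=100))   # m = mu, real white signal: 2/N = 0.02
```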
• 31. Testing of Result – Real Valued Case
◮ Check for $N = 100$, with results over an ensemble of $10^4$ random data instantiations using a fixed source model:
◮ [figure: ensemble results plotted over lags $-50 \le \tau \le 50$ (two panels)]
• 32. Testing of Result – Complex Valued Case
◮ [figure: ensemble results plotted over lags $-50 \le \tau \le 50$ (three panels)]
• 33. Application 1: Optimum Support
◮ When estimating $\mathbf{R}[\tau]$, we have to trade off between truncation and estimation errors:
◮ [figure: error on a logarithmic scale versus the estimated support, 0 … 100]
• 34. Loss of Positive Semi-Definiteness
◮ Example for an auto-correlation sequence: $R(z) = A(z)\, A^{\mathrm P}(z)$ with $A(z) = 1 - e^{j\pi/4} z^{-1} + j z^{-2}$;
◮ $R(z)$ is of order 4; assume $\hat R(z)$ is truncated to order 2;
◮ evaluation on the unit circle (power spectral density):
◮ [figure: PSD evaluated over $0 \le \Omega < 2\pi$; the truncated estimate dips below zero]
◮ a negative PSD is awkward, but was already noted by Kay & Marple'81.
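A minimal NumPy sketch, not from the talk, that reproduces this effect with the A(z) given above; the 512-point frequency grid is an assumed choice.

```python
import numpy as np

# A(z) = 1 - e^{j pi/4} z^{-1} + j z^{-2}  (from the slide)
a = np.array([1.0, -np.exp(1j * np.pi / 4), 1j])

# r[tau] = sum_n a[n] a*[n - tau] for tau = -2..2, i.e. R(z) = A(z) A^P(z)
r = np.correlate(a, a, mode="full")          # [r[-2], r[-1], r[0], r[1], r[2]]
lags = np.arange(-2, 3)

Omega = np.linspace(0, 2 * np.pi, 512, endpoint=False)

def psd(r_vals, taus, Om):
    """R(e^{jOmega}) = sum_tau r[tau] e^{-j Omega tau}; real for a Hermitian r[tau]."""
    return np.real(sum(rv * np.exp(-1j * Om * t) for rv, t in zip(r_vals, taus)))

full = psd(r, lags, Omega)                   # true PSD |A(e^{jOmega})|^2 >= 0
trunc = psd(r[1:4], lags[1:4], Omega)        # truncated to lags -1..1

print("min of full PSD     :", full.min())   # non-negative (up to grid/rounding)
print("min of truncated PSD:", trunc.min())  # dips clearly below zero (about -1)
```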
• 35. Application 2: Perturbation of Eigenvalues
◮ The CSD matrix $\mathbf{R}(z)$ is analytic in $z$ — we know that there exists an analytic factorisation $\mathbf{R}(z) = \mathbf{Q}(z)\, \boldsymbol{\Lambda}(z)\, \mathbf{Q}^{\mathrm P}(z)$;
◮ the estimate $\hat{\mathbf R}(z, \epsilon)$ is analytic in $z$ and differentiable in $\epsilon$, where $\epsilon = 1/N$ is assumed continuous for $N \gg 1$;
◮ on the unit circle, $\hat{\boldsymbol\Lambda}(e^{j\Omega}, \epsilon)$ is differentiable for a fixed $\Omega$;
◮ however, $\hat{\boldsymbol\Lambda}(e^{j\Omega}, \epsilon)$ is not totally differentiable (Kato'80); example:
◮ [figure: two panels of eigenvalues plotted over the normalised angular frequency $\Omega \in [0, 2\pi)$]
• 36. Perturbation of Eigenvalues II
◮ The estimation error can be used to check on the binwise perturbation of the eigenvalues of the CSD matrix:
◮ [figure: binwise eigenvalue perturbation over the normalised angular frequency $0 \le \Omega < 2\pi$]
• 37. Perturbation of Eigenspaces
◮ Binwise subspace correlation mismatch between ground truth and estimate:
◮ [figure: subspace mismatch on a logarithmic scale ($10^{-4}$ … $10^0$) over the normalised angular frequency $0 \le \Omega < 2\pi$]
• 38. Summary
◮ We have considered the estimation of a space-time covariance matrix;
◮ the variance of the estimator agrees with known results for auto-correlation sequences (1-d, correlated) and instantaneous MIMO systems (M-d, uncorrelated);
◮ awkward, and almost forgotten: $\hat{\mathbf R}[\tau]$ and the estimated PSD are no longer guaranteed to be positive semi-definite;
◮ the variance of the estimate can be used to predict the perturbation of eigenvalues (and eigenspaces);
◮ this however only works bin-wise: the eigenvalues are not totally differentiable in both $\Omega$ and $1/N$.
• 39. Engagement
◮ If interested, please feel free to try the polynomial matrix toolbox for Matlab: pevd-toolbox.eee.strath.ac.uk
◮ I have a 2.5 year postdoc position as part of UDRC3: dimensionality reduction and processing of high-dim., heterogeneous and non-traditional signals; see vacancies at the University of Strathclyde.