SlideShare a Scribd company logo
1 of 84
Download to read offline
Current limitations of sequential inference in
general hidden Markov models
Pierre Jacob
Department of Statistics, University of Oxford
March 5th
Pierre Jacob Sequential inference in HMM 1/ 60
Outline
1 Setting: online inference in time series
Hidden Markov Models
Implicit models
Exact / sequential / online methods
2 Plug and play methods
Approximate Bayesian Computation
Particle Filters
3 SMC2 for sequential inference
A sequential method for HMM
Not online
4 Numerical experiments
5 Discussion
Pierre Jacob Sequential inference in HMM 2/ 60
Outline
1 Setting: online inference in time series
Hidden Markov Models
Implicit models
Exact / sequential / online methods
2 Plug and play methods
Approximate Bayesian Computation
Particle Filters
3 SMC2 for sequential inference
A sequential method for HMM
Not online
4 Numerical experiments
5 Discussion
Pierre Jacob Sequential inference in HMM 2/ 60
Outline
1 Setting: online inference in time series
Hidden Markov Models
Implicit models
Exact / sequential / online methods
2 Plug and play methods
Approximate Bayesian Computation
Particle Filters
3 SMC2 for sequential inference
A sequential method for HMM
Not online
4 Numerical experiments
5 Discussion
Pierre Jacob Sequential inference in HMM 2/ 60
Hidden Markov Models
y2y1 yT
X2X0 X1 XT
y0
Figure : Graph representation of a general HMM.
(Xt): initial µθ, transition fθ. (Yt) given (Xt): measurement gθ.
Prior on the parameter θ ∈ Θ.
Pierre Jacob Sequential inference in HMM 3/ 60
Phytoplankton–Zooplankton
qqq
q
qq
q
qq
qqq
qq
q
qq
q
q
q
q
q
q
q
qq
q
q
q
q
q
q
q
q
q
qq
q
q
q
q
q
q
q
q
qq
qq
q
q
q
q
q
q
q
qq
qq
q
q
q
q
q
q
qqq
q
q
q
q
q
q
qq
q
qq
q
q
q
q
q
q
q
qq
qq
q
q
qq
q
q
q
q
q
q
q
q
q
q
qq
q
qq
q
q
q
qq
q
qq
q
q
q
q
q
qqq
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
qq
q
qq
q
q
q
q
q
q
q
q
q
q
q
q
q
q
qq
qq
q
q
qq
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
qqqqqqq
q
q
q
q
q
q
q
qq
q
q
q
q
q
q
qq
q
q
q
q
qqqq
q
q
qqq
q
q
q
q
qqq
q
qq
q
q
q
q
q
q
q
qq
q
qq
q
qqq
q
q
q
qqqqqq
q
q
qq
qq
q
qq
q
q
q
q
q
q
q
q
q
q
q
q
qqqq
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
qq
q
q
q
q
q
q
q
q
q
q
q
qq
q
q
q
q
q
qqq
q
q
q
q
q
q
q
q
q
q
q
q
qq
q
q
q
q
q
q
q
qqq
qqq
qq
q
q
q
q
q
qq
q
q
q
q
0
10
20
30
40
0 100 200 300
time
observations
Figure : A time series of 365 observations generated according to a
phytoplankton–zooplankton model.
Pierre Jacob Sequential inference in HMM 4/ 60
General questions
For each model, how much do the data inform the
parameters?
For each model, how much do the data inform the latent
Markov process?
How much do the data inform the choice of a model?
How to predict future observations?
Pierre Jacob Sequential inference in HMM 5/ 60
Questions translated into integrals
Filtering question:
∫
X
φ(xt) p(dxt | y0:t, θ)
=
1
Zt(θ)
∫
Xt+1
φ(xt)p(dx0:t | θ)
t∏
s=0
p(ys | xs, θ).
Prediction question:
∫
Y
φ(yt+k) p(dyt+k | y0:t, θ)
=
∫
Y
∫
X
φ(yt+k) p(dxt+k | y0:t, θ) p(dyt+k | xt+k, θ).
Pierre Jacob Sequential inference in HMM 6/ 60
Questions translated into integrals
Parameter estimation:
p(y0:t | θ) =
∫
Xt+1
p(dx0 | θ)
t∏
s=1
p(dxs | xs−1, θ)
t∏
s=0
p(ys | xs, θ),
and eventually
∫
Θ
φ(θ)πθ,t(dθ) =
1
Zt
∫
Θ
φ(θ)p(y0:t | θ)πθ(dθ).
If we acknowledge parameter uncertainty, then more questions:
∫
X
φ(xt) p(dxt | y0:t) =
∫
Θ
∫
X
φ(xt)p(dxt | y0:t, θ)πθ,t(dθ).
Pierre Jacob Sequential inference in HMM 7/ 60
Questions translated into integrals
Model choice:
P
(
M = M(m)
| y0:t
)
=
P
(
M = M(m)
)
Z
(m)
t
∑M
m′=1 P
(
M = M(m′)
)
Z
(m′)
t
.
If we acknowledge model uncertainty, then more questions:
∫
Y
φ(yt+k) P(dyt+k | y0:t)
=
M∑
m=1
∫
Θ(m)
∫
Y
φ(yt+k) p(dyt+k | y0:t, θ, M(m)
)
× πθ(m),t(dθ)P
(
M = M(m)
| y0:t
)
.
Pierre Jacob Sequential inference in HMM 8/ 60
Outline
1 Setting: online inference in time series
Hidden Markov Models
Implicit models
Exact / sequential / online methods
2 Plug and play methods
Approximate Bayesian Computation
Particle Filters
3 SMC2 for sequential inference
A sequential method for HMM
Not online
4 Numerical experiments
5 Discussion
Pierre Jacob Sequential inference in HMM 8/ 60
Phytoplankton–Zooplankton model
Hidden process (xt) = (αt, pt, zt).
At each (integer) time, αt ∼ N(µα, σ2
α).
Given αt,
dpt
dt
= αpt − cptzt,
dzt
dt
= ecptzt − mlzt − mqz2
t .
Observations: log yt ∼ N(log pt, σ2
y).
Set c = 0.25 and e = 0.3, and (log p0, log z0) ∼ N(log 2, 0.2).
Unknown parameters: θ = (µα, σα, σy, ml, mq).
Pierre Jacob Sequential inference in HMM 9/ 60
Implicit models
Even simple, standard scientific models are such that the
implied probability distribution p(dx0:t | θ) admits a density
function that cannot be computed pointwise.
To cover as many models as possible, we can only assume
that the hidden process can be simulated.
This covers cases where xt = ψ(xt−1, k, v1:k), for some integer
k, vector v1:k ∈ Rk, and deterministic function ψ.
Calls for “plug and play” methods.
Time series analysis via mechanistic models,
Bret´o, He, Ionides and King, 2009.
Pierre Jacob Sequential inference in HMM 10/ 60
Outline
1 Setting: online inference in time series
Hidden Markov Models
Implicit models
Exact / sequential / online methods
2 Plug and play methods
Approximate Bayesian Computation
Particle Filters
3 SMC2 for sequential inference
A sequential method for HMM
Not online
4 Numerical experiments
5 Discussion
Pierre Jacob Sequential inference in HMM 10/ 60
Exact methods
Consider the problem of estimating some quantity It.
Consider an estimator IN
t where N is a tuning parameter.
Hopefully N is such that IN
t
some sense
−−−−−−→
N→∞
It.
For instance E[(IN
t − It)2] goes to zero when N → ∞.
Variational methods / Ensemble Kalman Filters are not exact.
Consider the estimator that always returns 29.5. . .
Pierre Jacob Sequential inference in HMM 11/ 60
Sequential methods
Consider the problem of estimating some quantity It, for all
t ≥ 0, e.g. upon the arrival of new data.
Assume the quantities It for all t ≥ 0 are related one to the
other.
A sequential method “updates” the estimate IN
t into IN
t+1.
MCMC methods are not sequential: they have to be re-run
from scratch whenever a new observation arrives.
Therefore, sequential methods are not to be confused with
iterative methods.
Pierre Jacob Sequential inference in HMM 12/ 60
Online methods
Consider the problem of estimating some quantity It, for all
t ≥ 0, e.g. upon the arrival of new data.
A method is online if it provides estimates IN
t of It for all
t ≥ 0, such that. . .
. . . the computational cost of obtaining each IN
t given IN
t−1 is
independent of t,
. . . the precision of the estimate does not explode over time:
r(IN
t ) =
(
E
[
(IN
t − It)2
])1/2
|It|
can be uniformly bounded over t.
Consider the estimator that always returns 29.5. . .
Pierre Jacob Sequential inference in HMM 13/ 60
Outline
1 Setting: online inference in time series
Hidden Markov Models
Implicit models
Exact / sequential / online methods
2 Plug and play methods
Approximate Bayesian Computation
Particle Filters
3 SMC2 for sequential inference
A sequential method for HMM
Not online
4 Numerical experiments
5 Discussion
Pierre Jacob Sequential inference in HMM 13/ 60
Outline
1 Setting: online inference in time series
Hidden Markov Models
Implicit models
Exact / sequential / online methods
2 Plug and play methods
Approximate Bayesian Computation
Particle Filters
3 SMC2 for sequential inference
A sequential method for HMM
Not online
4 Numerical experiments
5 Discussion
Pierre Jacob Sequential inference in HMM 13/ 60
Approximate Bayesian Computation
1 Draw θ from the prior distribution πθ.
2 Draw x0:t, a realisation of the hidden Markov chain given θ.
3 Draw ˆy0:t, a realisation of the observations given x0:t and θ.
4 If D(ˆy0:t, y0:t) ≤ ε, keep (θ, x0:t).
Pierre Jacob Sequential inference in HMM 14/ 60
Approximate Bayesian Computation
Plug and play: only requires simulations from the model.
Exact if D is a distance and ε is zero.
In practice, D is typically not a distance.
The tolerance ε is often chosen implicitely.
E.g., ε is chosen so that 1% of the generated samples is kept.
Better than the 29.5 estimator?
Pierre Jacob Sequential inference in HMM 15/ 60
Outline
1 Setting: online inference in time series
Hidden Markov Models
Implicit models
Exact / sequential / online methods
2 Plug and play methods
Approximate Bayesian Computation
Particle Filters
3 SMC2 for sequential inference
A sequential method for HMM
Not online
4 Numerical experiments
5 Discussion
Pierre Jacob Sequential inference in HMM 15/ 60
Sequential Monte Carlo for filtering
Objects of interest:
filtering distributions: p(xt|y0:t, θ), for all t, for a given θ,
likelihood: p(y0:t | θ) =
∫
p(y0:t | x0:t, θ)p(x0:t | θ)dx0:t.
Particle filters:
propagate recursively Nx particles approximating p(xt | y0:t, θ)
for all t,
give likelihood estimates ˆpNx (y0:t | θ) of p(y0:t | θ) for all t.
Pierre Jacob Sequential inference in HMM 16/ 60
Plug and play requirement
Particle filters can be implemented if
the hidden process can be simulated forward, given any θ:
x0 ∼ µθ and xt ∼ fθ(· | xt−1),
the measurement density gθ(y | x) can be evaluated
point-wise, for any x, y, θ.
A bit less “plug and play” than ABC.
Pierre Jacob Sequential inference in HMM 17/ 60
Sequential Monte Carlo for filtering
y2
X2X0
y1
X1
...
... yT
XT
θ
Pierre Jacob Sequential inference in HMM 18/ 60
Sequential Monte Carlo for filtering
y2
X2X0
y1
X1
...
... yT
XT
θ
Pierre Jacob Sequential inference in HMM 18/ 60
Sequential Monte Carlo for filtering
y2
X2X0
y1
X1
...
... yT
XT
θ
Pierre Jacob Sequential inference in HMM 18/ 60
Sequential Monte Carlo for filtering
y2
X2X0
y1
X1
...
... yT
XT
θ
Pierre Jacob Sequential inference in HMM 18/ 60
Sequential Monte Carlo for filtering
y2
X2X0
y1
X1
...
... yT
XT
θ
Pierre Jacob Sequential inference in HMM 18/ 60
Sequential Monte Carlo for filtering
y2
X2X0
y1
X1
...
... yT
XT
θ
Pierre Jacob Sequential inference in HMM 18/ 60
Sequential Monte Carlo for filtering
y2
X2X0
y1
X1
...
... yT
XT
θ
Pierre Jacob Sequential inference in HMM 18/ 60
Sequential Monte Carlo for filtering
Consider I(φt) =
∫
φt(xt)p(xt | y0:t)dxt.
Lp-bound:
E
[
IN
(φt) − I (φt)
p]1/p
≤
c(p) ||φt||∞√
N
.
Central limit theorem:
√
N
(
IN
(φt) − I (φt)
)
D
−−−−→
N→∞
N
(
0, σ2
t
)
.
where σ2
t < σ2
max for all t.
Particle filters are fully online, plug and play, and exact. . . for
filtering.
Pierre Jacob Sequential inference in HMM 19/ 60
Sequential Monte Carlo for filtering
Properties of the likelihood estimator
The likelihood estimator is unbiased,
E
[
ˆpNx
(y0:t | θ)
]
= E
[ t∏
s=0
1
Nx
Nx∑
k=1
wk
s
]
= p(y0:t | θ)
and the relative variance is bounded linearly in time,
V
[
ˆpNx (y0:t | θ)
p(y0:t | θ)
]
≤ C
t
Nx
for some constant C (under some conditions!).
Particle filters are not online for likelihood estimation.
Pierre Jacob Sequential inference in HMM 20/ 60
Outline
1 Setting: online inference in time series
Hidden Markov Models
Implicit models
Exact / sequential / online methods
2 Plug and play methods
Approximate Bayesian Computation
Particle Filters
3 SMC2 for sequential inference
A sequential method for HMM
Not online
4 Numerical experiments
5 Discussion
Pierre Jacob Sequential inference in HMM 20/ 60
Outline
1 Setting: online inference in time series
Hidden Markov Models
Implicit models
Exact / sequential / online methods
2 Plug and play methods
Approximate Bayesian Computation
Particle Filters
3 SMC2 for sequential inference
A sequential method for HMM
Not online
4 Numerical experiments
5 Discussion
Pierre Jacob Sequential inference in HMM 20/ 60
SMC samplers
The goal is now to approximate sequentially
p(θ), p(θ|y0), . . . , p(θ|y0:T ).
Sequential Monte Carlo samplers.
Jarzynski 1997, Neal 2001, Chopin 2002, Del Moral, Doucet
& Jasra 2006. . .
Propagates a number Nθ of θ-particles approximating
p(θ | y0:t) for all t.
Evidence estimates ˆpNθ (y0:t) ≈ p(y0:t) for all t.
Pierre Jacob Sequential inference in HMM 21/ 60
Targets
p(θ|y1)
p(θ|y1, y2)
p(θ|y1, y2, y3)
p(θ)
Θ
density
Figure : Sequence of target distributions.
Pierre Jacob Sequential inference in HMM 22/ 60
First step
p(θ|y1)
p(θ)
q qqq qq q qqq qq qq qq qq qqqqq qq qqqqqq qqq q qqq qq qqq q qq qq qq
Θ
density
Figure : First distribution in black, next distribution in red.
Pierre Jacob Sequential inference in HMM 23/ 60
Importance Sampling
p(θ|y1)
p(θ)
q
qqq
qq q q
qq qq qq qq qq qqqqq qq
qq qqqq q qq q qqq
qq
qq q q q
q qq qq
Θ
density
Figure : Samples θ weighted by p(θ | y1)/p(θ) ∝ p(y1 | θ).
Pierre Jacob Sequential inference in HMM 24/ 60
Resampling and move
p(θ|y1)
p(θ)
q qqqq qq q q qqq qq qqq qqq qqqq qqq qq qq q qqq q qqq q qq qqq qq q qq
Θ
density
Figure : Samples θ after resampling and MCMC move.
Pierre Jacob Sequential inference in HMM 25/ 60
SMC samplers
1: Sample from the prior θ(m) ∼ p(·) for m ∈ [1, Nθ].
2: Set ω(m) ← 1/Nθ.
3: for t = 0 to T do
4: Reweight ω(m) ← ω(m) × p(yt|y0:t−1, θ(m)) for m ∈ [1, Nθ].
5: if some degeneracy criterion is met then
6: Resample the particles, reset the weights ω(m) ← 1/Nθ.
7: MCMC move for each particle, targeting p(θ | y0:t).
8: end if
9: end for
Pierre Jacob Sequential inference in HMM 26/ 60
Proposed method
SMC samplers require
pointwise evaluations of p(yt | y0:t−1, θ),
MCMC moves targeting each intermediate distribution.
For Hidden Markov models, the likelihood is intractable.
Particle filters provide likelihood approximations for a given θ.
Hence, we equip each θ-particle with its own particle filter.
Pierre Jacob Sequential inference in HMM 27/ 60
One step of SMC2
For each θ-particle θ
(m)
t , perform one step of its particle filter:
to obtain pNx (yt+1 | y0:t, θ
(m)
t ) and reweight:
ω
(m)
t+1 = ω
(m)
t × pNx
(yt+1|y0:t, θ
(m)
t ).
Pierre Jacob Sequential inference in HMM 28/ 60
One step of SMC2
Whenever
Effective sample size =
(∑Nθ
m=1 ω
(m)
t+1
)2
∑Nθ
m=1
(
ω
(m)
t+1
)2 < threshold × Nθ
(Kong, Liu & Wong, 1994)
resample the θ-particles and move them by PMCMC, i.e.
Propose θ⋆ ∼ q(·|θ
(m)
t ) and run PF(Nx, θ⋆) for t + 1 steps.
Accept or not based on ˆpNx (y0:t+1 | θ⋆).
Pierre Jacob Sequential inference in HMM 29/ 60
SMC2
y2
X2X0
y1
X1
...
... yT
XT
Θ
Pierre Jacob Sequential inference in HMM 30/ 60
SMC2
y2
X2X0
y1
X1
...
... yT
XT
Θ
Pierre Jacob Sequential inference in HMM 30/ 60
SMC2
y2
X2X0
y1
X1
...
... yT
XT
Θ
Pierre Jacob Sequential inference in HMM 30/ 60
SMC2
y2
X2X0
y1
X1
...
... yT
XT
Θ
Pierre Jacob Sequential inference in HMM 30/ 60
SMC2
y2
X2X0
y1
X1
...
... yT
XT
Θ
Pierre Jacob Sequential inference in HMM 30/ 60
SMC2
y2
X2X0
y1
X1
...
... yT
XT
Θ
Pierre Jacob Sequential inference in HMM 30/ 60
SMC2
y2
X2X0
y1
X1
...
... yT
XT
Θ
Pierre Jacob Sequential inference in HMM 30/ 60
Exact approximation
SMC2 is a standard SMC sampler on an extended space, with
target distribution:
πt(θ, x1:Nx
0:t , a1:Nx
0:t−1) = p(θ|y0:t)
×
1
Nt+1
x
Nx∑
n=1
p(xn
0:t|θ, y0:t)



Nx∏
i=1
i̸=hn
t (1)
q0,θ(xi
0)



×



t∏
s=1
Nx∏
i=1
i̸=hn
t (s)
W
ai
s−1
s−1,θqs,θ(xi
s|x
ai
s−1
s−1 )



.
Related to pseudo-marginal and PMCMC methods.
Pierre Jacob Sequential inference in HMM 31/ 60
Exact approximation
From the extended target representation, we obtain
θ from p(θ | y1:t),
xn
0:t from p(x0:t | θ, y1:t),
thus allowing joint state and parameter inference.
Evidence estimates are obtained by computing the average of
the θ-weights ω
(m)
t .
The “extended target” argument yields consistency for any
fixed Nx, when Nθ goes to infinity.
Exact method, sequential by design, but not online.
Pierre Jacob Sequential inference in HMM 32/ 60
Outline
1 Setting: online inference in time series
Hidden Markov Models
Implicit models
Exact / sequential / online methods
2 Plug and play methods
Approximate Bayesian Computation
Particle Filters
3 SMC2 for sequential inference
A sequential method for HMM
Not online
4 Numerical experiments
5 Discussion
Pierre Jacob Sequential inference in HMM 32/ 60
Scalability in T
Cost if MCMC move at each time step
A single move step at time t costs O (tNxNθ).
If move at every step, the total cost becomes O
(
t2NxNθ
)
.
If Nx = Ct, the total cost becomes O
(
t3Nθ
)
.
With adaptive resampling, the cost is only O
(
t2Nθ
)
. Why?
Pierre Jacob Sequential inference in HMM 33/ 60
Scalability in T
512
1024
0 100 200 300
time
ESS
Figure : Effective Sample Size against time, for the PZ model.
Pierre Jacob Sequential inference in HMM 34/ 60
Scalability in T
0e+00
2e+06
4e+06
6e+06
0 100 200 300
time
Cumulativecostperparticle
Figure : Cumulative cost per θ-particle during one run of SMC2
. The
cost is measured by the number of calls to the transition sampling
function. Nx is fixed.
Pierre Jacob Sequential inference in HMM 35/ 60
Scalability in T
iterations
ESS
0
200
400
600
800
1000
1000 2000 3000 4000 5000
Figure : Effective Sample Size against time, for a linear Gaussian model.
Pierre Jacob Sequential inference in HMM 36/ 60
Scalability in T
iteration
computingtime(squarerootscale)
2500
10000
22500
40000
1000 2000 3000 4000 5000
Figure :
√
computing time against time. Nx is increased to achieve a
fixed acceptance rate in the PMCMC steps.
Pierre Jacob Sequential inference in HMM 37/ 60
Scalability in T
Under Bernstein-Von Mises, the posterior becomes Gaussian.
p(θ|y1:ct)
p(θ|y1:t)
Θ
density
E[ESS] from p(θ | y1:t) to p(θ | y1:ct) becomes independent of t.
Hence resampling times occur geometrically: τk ≈ ck with c > 1.
Pierre Jacob Sequential inference in HMM 38/ 60
Scalability in T
More formally. . .
The expected ESS at time t + k, if the last resampling time was t,
is related to
Vp(θ|y1:t)
[
p(θ | y1:t+k)
p(θ | y1:t)
]
= Vp(θ|y1:t)
[
L(θ; y1:t+k)
L(θ; y1:t)
∫
Θ L(θ; y1:t)p(dθ)
∫
Θ L(θ; y1:t+k)p(dθ)
]
.
Then Laplace expansions of L yield similar results as before, under
regularity conditions.
Pierre Jacob Sequential inference in HMM 39/ 60
Scalability in T
Open problem
Online exact Bayesian inference in linear time?
On one hand dim(X0:t) = dim(X) × (t + 1) which grows . . .
. . . but θ itself is of fixed dimension and p(θ | y1:t) ≈ N(θ⋆, v⋆/t)!
Our specific problem
Move steps at time t imply running a particle filter from time zero.
Attempts have been made at re-starting from t − ∆ but then, bias.
Pierre Jacob Sequential inference in HMM 40/ 60
Outline
1 Setting: online inference in time series
Hidden Markov Models
Implicit models
Exact / sequential / online methods
2 Plug and play methods
Approximate Bayesian Computation
Particle Filters
3 SMC2 for sequential inference
A sequential method for HMM
Not online
4 Numerical experiments
5 Discussion
Pierre Jacob Sequential inference in HMM 40/ 60
Phytoplankton–Zooplankton: model
Hidden process (xt) = (αt, pt, zt).
At each (integer) time, αt ∼ N(µα, σ2
α).
Given αt,
dpt
dt
= αpt − cptzt,
dzt
dt
= ecptzt − mlzt − mqz2
t .
Observations: log yt ∼ N(log pt, σ2
y).
Set c = 0.25 and e = 0.3, and (log p0, log z0) ∼ N(log 2, 0.2).
Unknown parameters: θ = (µα, σα, σy, ml, mq).
Pierre Jacob Sequential inference in HMM 41/ 60
Phytoplankton–Zooplankton: observations
qqq
q
qq
q
qq
qqq
qq
q
qq
q
q
q
q
q
q
q
qq
q
q
q
q
q
q
q
q
q
qq
q
q
q
q
q
q
q
q
qq
qq
q
q
q
q
q
q
q
qq
qq
q
q
q
q
q
q
qqq
q
q
q
q
q
q
qq
q
qq
q
q
q
q
q
q
q
qq
qq
q
q
qq
q
q
q
q
q
q
q
q
q
q
qq
q
qq
q
q
q
qq
q
qq
q
q
q
q
q
qqq
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
qq
q
qq
q
q
q
q
q
q
q
q
q
q
q
q
q
q
qq
qq
q
q
qq
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
qqqqqqq
q
q
q
q
q
q
q
qq
q
q
q
q
q
q
qq
q
q
q
q
qqqq
q
q
qqq
q
q
q
q
qqq
q
qq
q
q
q
q
q
q
q
qq
q
qq
q
qqq
q
q
q
qqqqqq
q
q
qq
qq
q
qq
q
q
q
q
q
q
q
q
q
q
q
q
qqqq
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
q
qq
q
q
q
q
q
q
q
q
q
q
q
qq
q
q
q
q
q
qqq
q
q
q
q
q
q
q
q
q
q
q
q
qq
q
q
q
q
q
q
q
qqq
qqq
qq
q
q
q
q
q
qq
q
q
q
q
0
10
20
30
40
0 100 200 300
time
observations
Figure : A time series of 365 observations generated according to a
phytoplankton–zooplankton model.
Pierre Jacob Sequential inference in HMM 42/ 60
Phytoplankton–Zooplankton: parameters
qqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqq
0.60
0.65
0.70
0.75
0.80
0.45 0.50 0.55 0.60
σα
µα
qqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqq
0.10
0.15
0.20
0.25
0.30
0.45 0.50 0.55 0.60
σα
σy
qqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqq
0.08
0.09
0.10
0.11
0.05 0.10 0.15 0.20
ml
mq
Figure : Posterior distribution of the parameters.
Pierre Jacob Sequential inference in HMM 43/ 60
Phytoplankton–Zooplankton: parameters
0.00
0.25
0.50
0.75
1.00
0 10 20 30 40 50
time
µα
Figure : Evolution, over the first 50 time steps, of the posterior
distribution of µα.
Pierre Jacob Sequential inference in HMM 44/ 60
Phytoplankton–Zooplankton: parameters
0.00
0.25
0.50
0.75
1.00
0 10 20 30 40 50
time
σα
Figure : Evolution, over the first 50 time steps, of the posterior
distribution of σα.
Pierre Jacob Sequential inference in HMM 45/ 60
Phytoplankton–Zooplankton: parameters
0.00
0.25
0.50
0.75
1.00
0 10 20 30 40 50
time
σy
Figure : Evolution, over the first 50 time steps, of the posterior
distribution of σy.
Pierre Jacob Sequential inference in HMM 46/ 60
Phytoplankton–Zooplankton: parameters
0.00
0.25
0.50
0.75
1.00
0 10 20 30 40 50
time
ml
Figure : Evolution, over the first 50 time steps, of the posterior
distribution of ml.
Pierre Jacob Sequential inference in HMM 47/ 60
Phytoplankton–Zooplankton: parameters
0.00
0.25
0.50
0.75
1.00
0 10 20 30 40 50
time
mq
Figure : Evolution, over the first 50 time steps, of the posterior
distribution of mq.
Pierre Jacob Sequential inference in HMM 48/ 60
Phytoplankton–Zooplankton: prediction
q
q
q q
q
q
q q q
q
q q
q
q
q
q
q
q
q q
q
q
q
q
q
q
q
q
q
q q
q q
q
q
q
q
q
q
q q
q
0
10
20
30
0 10 20 30 40 50
time
observations
Figure : One step predictions under parameter uncertainty.
Pierre Jacob Sequential inference in HMM 49/ 60
Phytoplankton–Zooplankton: prediction
q
q
qq
q
q
qqq
q
qq
q
q
q
q
q
q
qq
q
q
q
q
q
q
q
q
q
qq
qq
q
q
q
q
q
q
qq
q
q
q
q
q
q
qqqq
q
q
q
q
q
qq
q
q
q
q
q
qqq
q
q
qqqqq
qq
q
qq
q
q
qq
q
q
qq
q
qq
q
q
qqq
q
q
q
qqq
q
qqq
q
q
q
q
q
q
q
q
qq
q
q
q
qq
q
q
q
q
q
q
q
q
qq
q
q
q
q
qqqq
q
q
q
q
q
q
q
q
q
q
q
qqqqqqqq
q
qq
qq
q
q
q
qq
q
q
q
q
qqqq
q
q
q
qqq
qqq
qq
q
q
q
q
qq
qq
qqq
qq
q
qqqqqq
q
q
qq
qq
q
qq
q
q
q
q
q
q
q
qqq
q
q
q
q
q
q
q
q
q
q
q
q
q
qq
q
q
q
q
q
q
qqq
q
q
q
qqq
q
qq
q
q
q
q
q
qq
q
q
q
q
qq
qqqqq
qq
q
q
q
q
q
qq
q
q
q
0
10
20
30
40
0 100 200 300
time
observations
Figure : One step predictions under parameter uncertainty.
Pierre Jacob Sequential inference in HMM 50/ 60
Phytoplankton–Zooplankton: another model
dpt
dt
= αpt − cptzt
dzt
dt
= ecptzt − mlzt
or
dpt
dt
= αpt − cptzt
dzt
dt
= ecptzt − mlzt−mqz2
t
?
Pierre Jacob Sequential inference in HMM 51/ 60
Phytoplankton–Zooplankton: model choice
1
100
0 25 50 75 100
time
Bayesfactor
Figure : Bayes Factor against time.
Pierre Jacob Sequential inference in HMM 52/ 60
Phytoplankton–Zooplankton: model choice
1e+00
1e+02
1e+06
1e+12
1e+18
0 100 200 300
time
Bayesfactor
Figure : Bayes Factor against time.
Pierre Jacob Sequential inference in HMM 53/ 60
Outline
1 Setting: online inference in time series
Hidden Markov Models
Implicit models
Exact / sequential / online methods
2 Plug and play methods
Approximate Bayesian Computation
Particle Filters
3 SMC2 for sequential inference
A sequential method for HMM
Not online
4 Numerical experiments
5 Discussion
Pierre Jacob Sequential inference in HMM 53/ 60
Forgetting mechanism for hidden states
Forgetting property of a uniformly ergodic Markov chain:
||pν
t − pµ
t ||TV ≤ Cρt
where ν, µ are two initial distributions pν
t is the distribution of
Xt after t steps, ρ < 1, C > 0.
Similarly, the filtering distribution πt(dxt) = p(dxt | y0:t)
forgets its initial condition geometrically fast.
Introduce the operator Φt, taking a measure, applying a
Markov kernel to it, and then a Bayes update using yt.
Under conditions on the data generating process and the
model,
||Φ0:t(µ) − Φ0:t(ν)||TV ≤ Cρt
.
Pierre Jacob Sequential inference in HMM 54/ 60
Forgetting mechanism for parameters
Forgetting mechanism for Bayesian posterior distribution:
||pν
t − pµ
t ||TV ≤
1
√
t
C.
Huge literature on prior robustness.
Posterior forgetting goes much slower than Markov chain
forgetting.
An error in the approximation of p(θ | y1:t) damages the
subsequent approximations of p(θ | y1:t+k), for many k’s.
SMC samplers are stable because of the added MCMC steps,
which costs increase with t.
Pierre Jacob Sequential inference in HMM 55/ 60
Other challenges
Dimensionality: the other big open problem.
Particle filter’s errors grow exponentially fast with dim(X).
Can local particle filters beat the curse of dimensionality?
Rebeschini, van Handel, 2013.
Carefully analyzed biased approximations.
Assumption of a spatial forgetting effect from the model.
Pierre Jacob Sequential inference in HMM 56/ 60
Other challenges
Particle filters provide useful estimates. . .
. . . but no estimates of their associated variance.
Can we estimate the variance without having to run the
algorithm many times?
Pierre Jacob Sequential inference in HMM 57/ 60
Other challenges
Particle methods are more and more commonly used outside
the setting of HMMs.
For instance, in the setting of long memory processes:
probabilistic programming, Bayesian non-parametric
applications.
Are particle methods useful for models that do not satisfy
forgetting properties?
Stability of Feynman-Kac formulae with path-dependent
potentials,
Chopin, Del Moral, Rubenthaler, 2009.
Pierre Jacob Sequential inference in HMM 58/ 60
Discussion
SMC2 allows sequential exact approximation in HMMs, but
not online.
Properties of posterior distributions could help achieving exact
online inference, or prove that it is, in fact, impossible.
Do we want to sample from the posterior as t → ∞?
Importance of plug and play inference for time series.
Implementation in LibBi, with GPU support.
Pierre Jacob Sequential inference in HMM 59/ 60
Links
Particle Markov chain Monte Carlo,
Andrieu, Doucet, Holenstein, 2010 (JRSS B)
Sequential Monte Carlo samplers: error bounds and
insensitivity to initial conditions,
Whiteley, 2011 (Stoch. Analysis and Appl.).
SMC2: an algorithm for sequential analysis of HMM,
Chopin, Jacob, Papaspiliopoulos, 2013 (JRSS B)
www.libbi.org
Pierre Jacob Sequential inference in HMM 60/ 60

More Related Content

What's hot

DSP_FOEHU - MATLAB 04 - The Discrete Fourier Transform (DFT)
DSP_FOEHU - MATLAB 04 - The Discrete Fourier Transform (DFT)DSP_FOEHU - MATLAB 04 - The Discrete Fourier Transform (DFT)
DSP_FOEHU - MATLAB 04 - The Discrete Fourier Transform (DFT)Amr E. Mohamed
 
Lecture1 Intro To Signa
Lecture1 Intro To SignaLecture1 Intro To Signa
Lecture1 Intro To Signababak danyal
 
DSP_2018_FOEHU - Lec 03 - Discrete-Time Signals and Systems
DSP_2018_FOEHU - Lec 03 - Discrete-Time Signals and SystemsDSP_2018_FOEHU - Lec 03 - Discrete-Time Signals and Systems
DSP_2018_FOEHU - Lec 03 - Discrete-Time Signals and SystemsAmr E. Mohamed
 
Rfid presentation in internet
Rfid presentation in internetRfid presentation in internet
Rfid presentation in internetAli Azarnia
 
Matt Purkeypile's Doctoral Dissertation Defense Slides
Matt Purkeypile's Doctoral Dissertation Defense SlidesMatt Purkeypile's Doctoral Dissertation Defense Slides
Matt Purkeypile's Doctoral Dissertation Defense Slidesmpurkeypile
 
Digital Signal Processing[ECEG-3171]-Ch1_L05
Digital Signal Processing[ECEG-3171]-Ch1_L05Digital Signal Processing[ECEG-3171]-Ch1_L05
Digital Signal Processing[ECEG-3171]-Ch1_L05Rediet Moges
 
Computability - Tractable, Intractable and Non-computable Function
Computability - Tractable, Intractable and Non-computable FunctionComputability - Tractable, Intractable and Non-computable Function
Computability - Tractable, Intractable and Non-computable FunctionReggie Niccolo Santos
 
DSP_FOEHU - Lec 09 - Fast Fourier Transform
DSP_FOEHU - Lec 09 - Fast Fourier TransformDSP_FOEHU - Lec 09 - Fast Fourier Transform
DSP_FOEHU - Lec 09 - Fast Fourier TransformAmr E. Mohamed
 
DSP_FOEHU - Lec 08 - The Discrete Fourier Transform
DSP_FOEHU - Lec 08 - The Discrete Fourier TransformDSP_FOEHU - Lec 08 - The Discrete Fourier Transform
DSP_FOEHU - Lec 08 - The Discrete Fourier TransformAmr E. Mohamed
 
Lecture5 Signal and Systems
Lecture5 Signal and SystemsLecture5 Signal and Systems
Lecture5 Signal and Systemsbabak danyal
 

What's hot (19)

Impulse Response ppt
Impulse Response pptImpulse Response ppt
Impulse Response ppt
 
DSP_FOEHU - MATLAB 04 - The Discrete Fourier Transform (DFT)
DSP_FOEHU - MATLAB 04 - The Discrete Fourier Transform (DFT)DSP_FOEHU - MATLAB 04 - The Discrete Fourier Transform (DFT)
DSP_FOEHU - MATLAB 04 - The Discrete Fourier Transform (DFT)
 
Lecture1 Intro To Signa
Lecture1 Intro To SignaLecture1 Intro To Signa
Lecture1 Intro To Signa
 
DSP_2018_FOEHU - Lec 03 - Discrete-Time Signals and Systems
DSP_2018_FOEHU - Lec 03 - Discrete-Time Signals and SystemsDSP_2018_FOEHU - Lec 03 - Discrete-Time Signals and Systems
DSP_2018_FOEHU - Lec 03 - Discrete-Time Signals and Systems
 
Signals and systems assignment help
Signals and systems assignment helpSignals and systems assignment help
Signals and systems assignment help
 
Rfid presentation in internet
Rfid presentation in internetRfid presentation in internet
Rfid presentation in internet
 
1 6
1 61 6
1 6
 
Hastings 1970
Hastings 1970Hastings 1970
Hastings 1970
 
Matt Purkeypile's Doctoral Dissertation Defense Slides
Matt Purkeypile's Doctoral Dissertation Defense SlidesMatt Purkeypile's Doctoral Dissertation Defense Slides
Matt Purkeypile's Doctoral Dissertation Defense Slides
 
Lti system(akept)
Lti system(akept)Lti system(akept)
Lti system(akept)
 
Stochastic matrices
Stochastic matricesStochastic matrices
Stochastic matrices
 
signals and system
signals and systemsignals and system
signals and system
 
Digital Signal Processing[ECEG-3171]-Ch1_L05
Digital Signal Processing[ECEG-3171]-Ch1_L05Digital Signal Processing[ECEG-3171]-Ch1_L05
Digital Signal Processing[ECEG-3171]-Ch1_L05
 
Computability - Tractable, Intractable and Non-computable Function
Computability - Tractable, Intractable and Non-computable FunctionComputability - Tractable, Intractable and Non-computable Function
Computability - Tractable, Intractable and Non-computable Function
 
Introduction to fourier analysis
Introduction to fourier analysisIntroduction to fourier analysis
Introduction to fourier analysis
 
DSP_FOEHU - Lec 09 - Fast Fourier Transform
DSP_FOEHU - Lec 09 - Fast Fourier TransformDSP_FOEHU - Lec 09 - Fast Fourier Transform
DSP_FOEHU - Lec 09 - Fast Fourier Transform
 
By BIRASA FABRICE
By BIRASA FABRICEBy BIRASA FABRICE
By BIRASA FABRICE
 
DSP_FOEHU - Lec 08 - The Discrete Fourier Transform
DSP_FOEHU - Lec 08 - The Discrete Fourier TransformDSP_FOEHU - Lec 08 - The Discrete Fourier Transform
DSP_FOEHU - Lec 08 - The Discrete Fourier Transform
 
Lecture5 Signal and Systems
Lecture5 Signal and SystemsLecture5 Signal and Systems
Lecture5 Signal and Systems
 

Similar to Current limitations of sequential inference in general hidden Markov models

2012 mdsp pr06  hmm
2012 mdsp pr06  hmm2012 mdsp pr06  hmm
2012 mdsp pr06  hmmnozomuhamada
 
SMC^2: an algorithm for sequential analysis of state-space models
SMC^2: an algorithm for sequential analysis of state-space modelsSMC^2: an algorithm for sequential analysis of state-space models
SMC^2: an algorithm for sequential analysis of state-space modelsPierre Jacob
 
MCMC and likelihood-free methods
MCMC and likelihood-free methodsMCMC and likelihood-free methods
MCMC and likelihood-free methodsChristian Robert
 
Firefly exact MCMC for Big Data
Firefly exact MCMC for Big DataFirefly exact MCMC for Big Data
Firefly exact MCMC for Big DataGianvito Siciliano
 
Computational Intelligence for Time Series Prediction
Computational Intelligence for Time Series PredictionComputational Intelligence for Time Series Prediction
Computational Intelligence for Time Series PredictionGianluca Bontempi
 
My data are incomplete and noisy: Information-reduction statistical methods f...
My data are incomplete and noisy: Information-reduction statistical methods f...My data are incomplete and noisy: Information-reduction statistical methods f...
My data are incomplete and noisy: Information-reduction statistical methods f...Umberto Picchini
 
Estimation of the score vector and observed information matrix in intractable...
Estimation of the score vector and observed information matrix in intractable...Estimation of the score vector and observed information matrix in intractable...
Estimation of the score vector and observed information matrix in intractable...Pierre Jacob
 
Sample presentation slides
Sample presentation slidesSample presentation slides
Sample presentation slidesvahid baghi
 
Introduction to MCMC methods
Introduction to MCMC methodsIntroduction to MCMC methods
Introduction to MCMC methodsChristian Robert
 
Bayesian Experimental Design for Stochastic Kinetic Models
Bayesian Experimental Design for Stochastic Kinetic ModelsBayesian Experimental Design for Stochastic Kinetic Models
Bayesian Experimental Design for Stochastic Kinetic ModelsColin Gillespie
 
Sequential Monte Carlo algorithms for agent-based models of disease transmission
Sequential Monte Carlo algorithms for agent-based models of disease transmissionSequential Monte Carlo algorithms for agent-based models of disease transmission
Sequential Monte Carlo algorithms for agent-based models of disease transmissionJeremyHeng10
 
A walk through the intersection between machine learning and mechanistic mode...
A walk through the intersection between machine learning and mechanistic mode...A walk through the intersection between machine learning and mechanistic mode...
A walk through the intersection between machine learning and mechanistic mode...JuanPabloCarbajal3
 
Hierarchical matrix techniques for maximum likelihood covariance estimation
Hierarchical matrix techniques for maximum likelihood covariance estimationHierarchical matrix techniques for maximum likelihood covariance estimation
Hierarchical matrix techniques for maximum likelihood covariance estimationAlexander Litvinenko
 
Controlled sequential Monte Carlo
Controlled sequential Monte Carlo Controlled sequential Monte Carlo
Controlled sequential Monte Carlo JeremyHeng10
 
Monte Carlo Statistical Methods
Monte Carlo Statistical MethodsMonte Carlo Statistical Methods
Monte Carlo Statistical MethodsChristian Robert
 
Ray : modeling dynamic systems
Ray : modeling dynamic systemsRay : modeling dynamic systems
Ray : modeling dynamic systemsHouw Liong The
 

Similar to Current limitations of sequential inference in general hidden Markov models (20)

2012 mdsp pr06  hmm
2012 mdsp pr06  hmm2012 mdsp pr06  hmm
2012 mdsp pr06  hmm
 
SMC^2: an algorithm for sequential analysis of state-space models
SMC^2: an algorithm for sequential analysis of state-space modelsSMC^2: an algorithm for sequential analysis of state-space models
SMC^2: an algorithm for sequential analysis of state-space models
 
MCMC and likelihood-free methods
MCMC and likelihood-free methodsMCMC and likelihood-free methods
MCMC and likelihood-free methods
 
Firefly exact MCMC for Big Data
Firefly exact MCMC for Big DataFirefly exact MCMC for Big Data
Firefly exact MCMC for Big Data
 
intro
introintro
intro
 
Computational Intelligence for Time Series Prediction
Computational Intelligence for Time Series PredictionComputational Intelligence for Time Series Prediction
Computational Intelligence for Time Series Prediction
 
My data are incomplete and noisy: Information-reduction statistical methods f...
My data are incomplete and noisy: Information-reduction statistical methods f...My data are incomplete and noisy: Information-reduction statistical methods f...
My data are incomplete and noisy: Information-reduction statistical methods f...
 
Estimation of the score vector and observed information matrix in intractable...
Estimation of the score vector and observed information matrix in intractable...Estimation of the score vector and observed information matrix in intractable...
Estimation of the score vector and observed information matrix in intractable...
 
Sample presentation slides
Sample presentation slidesSample presentation slides
Sample presentation slides
 
Introduction to MCMC methods
Introduction to MCMC methodsIntroduction to MCMC methods
Introduction to MCMC methods
 
Bayesian Experimental Design for Stochastic Kinetic Models
Bayesian Experimental Design for Stochastic Kinetic ModelsBayesian Experimental Design for Stochastic Kinetic Models
Bayesian Experimental Design for Stochastic Kinetic Models
 
Sequential Monte Carlo algorithms for agent-based models of disease transmission
Sequential Monte Carlo algorithms for agent-based models of disease transmissionSequential Monte Carlo algorithms for agent-based models of disease transmission
Sequential Monte Carlo algorithms for agent-based models of disease transmission
 
A walk through the intersection between machine learning and mechanistic mode...
A walk through the intersection between machine learning and mechanistic mode...A walk through the intersection between machine learning and mechanistic mode...
A walk through the intersection between machine learning and mechanistic mode...
 
Hierarchical matrix techniques for maximum likelihood covariance estimation
Hierarchical matrix techniques for maximum likelihood covariance estimationHierarchical matrix techniques for maximum likelihood covariance estimation
Hierarchical matrix techniques for maximum likelihood covariance estimation
 
Controlled sequential Monte Carlo
Controlled sequential Monte Carlo Controlled sequential Monte Carlo
Controlled sequential Monte Carlo
 
Shors'algorithm simplified.pptx
Shors'algorithm simplified.pptxShors'algorithm simplified.pptx
Shors'algorithm simplified.pptx
 
Monte Carlo Statistical Methods
Monte Carlo Statistical MethodsMonte Carlo Statistical Methods
Monte Carlo Statistical Methods
 
Ray : modeling dynamic systems
Ray : modeling dynamic systemsRay : modeling dynamic systems
Ray : modeling dynamic systems
 
002 ray modeling dynamic systems
002 ray modeling dynamic systems002 ray modeling dynamic systems
002 ray modeling dynamic systems
 
002 ray modeling dynamic systems
002 ray modeling dynamic systems002 ray modeling dynamic systems
002 ray modeling dynamic systems
 

More from Pierre Jacob

Talk at CIRM on Poisson equation and debiasing techniques
Talk at CIRM on Poisson equation and debiasing techniquesTalk at CIRM on Poisson equation and debiasing techniques
Talk at CIRM on Poisson equation and debiasing techniquesPierre Jacob
 
ISBA 2022 Susie Bayarri lecture
ISBA 2022 Susie Bayarri lectureISBA 2022 Susie Bayarri lecture
ISBA 2022 Susie Bayarri lecturePierre Jacob
 
Couplings of Markov chains and the Poisson equation
Couplings of Markov chains and the Poisson equation Couplings of Markov chains and the Poisson equation
Couplings of Markov chains and the Poisson equation Pierre Jacob
 
Monte Carlo methods for some not-quite-but-almost Bayesian problems
Monte Carlo methods for some not-quite-but-almost Bayesian problemsMonte Carlo methods for some not-quite-but-almost Bayesian problems
Monte Carlo methods for some not-quite-but-almost Bayesian problemsPierre Jacob
 
Monte Carlo methods for some not-quite-but-almost Bayesian problems
Monte Carlo methods for some not-quite-but-almost Bayesian problemsMonte Carlo methods for some not-quite-but-almost Bayesian problems
Monte Carlo methods for some not-quite-but-almost Bayesian problemsPierre Jacob
 
Markov chain Monte Carlo methods and some attempts at parallelizing them
Markov chain Monte Carlo methods and some attempts at parallelizing themMarkov chain Monte Carlo methods and some attempts at parallelizing them
Markov chain Monte Carlo methods and some attempts at parallelizing themPierre Jacob
 
Unbiased MCMC with couplings
Unbiased MCMC with couplingsUnbiased MCMC with couplings
Unbiased MCMC with couplingsPierre Jacob
 
Unbiased Markov chain Monte Carlo methods
Unbiased Markov chain Monte Carlo methods Unbiased Markov chain Monte Carlo methods
Unbiased Markov chain Monte Carlo methods Pierre Jacob
 
Recent developments on unbiased MCMC
Recent developments on unbiased MCMCRecent developments on unbiased MCMC
Recent developments on unbiased MCMCPierre Jacob
 
Estimation of the score vector and observed information matrix in intractable...
Estimation of the score vector and observed information matrix in intractable...Estimation of the score vector and observed information matrix in intractable...
Estimation of the score vector and observed information matrix in intractable...Pierre Jacob
 
Estimation of the score vector and observed information matrix in intractable...
Estimation of the score vector and observed information matrix in intractable...Estimation of the score vector and observed information matrix in intractable...
Estimation of the score vector and observed information matrix in intractable...Pierre Jacob
 
Estimation of the score vector and observed information matrix in intractable...
Estimation of the score vector and observed information matrix in intractable...Estimation of the score vector and observed information matrix in intractable...
Estimation of the score vector and observed information matrix in intractable...Pierre Jacob
 
On non-negative unbiased estimators
On non-negative unbiased estimatorsOn non-negative unbiased estimators
On non-negative unbiased estimatorsPierre Jacob
 
Path storage in the particle filter
Path storage in the particle filterPath storage in the particle filter
Path storage in the particle filterPierre Jacob
 
Density exploration methods
Density exploration methodsDensity exploration methods
Density exploration methodsPierre Jacob
 
PAWL - GPU meeting @ Warwick
PAWL - GPU meeting @ WarwickPAWL - GPU meeting @ Warwick
PAWL - GPU meeting @ WarwickPierre Jacob
 
Presentation of SMC^2 at BISP7
Presentation of SMC^2 at BISP7Presentation of SMC^2 at BISP7
Presentation of SMC^2 at BISP7Pierre Jacob
 
Presentation MCB seminar 09032011
Presentation MCB seminar 09032011Presentation MCB seminar 09032011
Presentation MCB seminar 09032011Pierre Jacob
 

More from Pierre Jacob (18)

Talk at CIRM on Poisson equation and debiasing techniques
Talk at CIRM on Poisson equation and debiasing techniquesTalk at CIRM on Poisson equation and debiasing techniques
Talk at CIRM on Poisson equation and debiasing techniques
 
ISBA 2022 Susie Bayarri lecture
ISBA 2022 Susie Bayarri lectureISBA 2022 Susie Bayarri lecture
ISBA 2022 Susie Bayarri lecture
 
Couplings of Markov chains and the Poisson equation
Couplings of Markov chains and the Poisson equation Couplings of Markov chains and the Poisson equation
Couplings of Markov chains and the Poisson equation
 
Monte Carlo methods for some not-quite-but-almost Bayesian problems
Monte Carlo methods for some not-quite-but-almost Bayesian problemsMonte Carlo methods for some not-quite-but-almost Bayesian problems
Monte Carlo methods for some not-quite-but-almost Bayesian problems
 
Monte Carlo methods for some not-quite-but-almost Bayesian problems
Monte Carlo methods for some not-quite-but-almost Bayesian problemsMonte Carlo methods for some not-quite-but-almost Bayesian problems
Monte Carlo methods for some not-quite-but-almost Bayesian problems
 
Markov chain Monte Carlo methods and some attempts at parallelizing them
Markov chain Monte Carlo methods and some attempts at parallelizing themMarkov chain Monte Carlo methods and some attempts at parallelizing them
Markov chain Monte Carlo methods and some attempts at parallelizing them
 
Unbiased MCMC with couplings
Unbiased MCMC with couplingsUnbiased MCMC with couplings
Unbiased MCMC with couplings
 
Unbiased Markov chain Monte Carlo methods
Unbiased Markov chain Monte Carlo methods Unbiased Markov chain Monte Carlo methods
Unbiased Markov chain Monte Carlo methods
 
Recent developments on unbiased MCMC
Recent developments on unbiased MCMCRecent developments on unbiased MCMC
Recent developments on unbiased MCMC
 
Estimation of the score vector and observed information matrix in intractable...
Estimation of the score vector and observed information matrix in intractable...Estimation of the score vector and observed information matrix in intractable...
Estimation of the score vector and observed information matrix in intractable...
 
Estimation of the score vector and observed information matrix in intractable...
Estimation of the score vector and observed information matrix in intractable...Estimation of the score vector and observed information matrix in intractable...
Estimation of the score vector and observed information matrix in intractable...
 
Estimation of the score vector and observed information matrix in intractable...
Estimation of the score vector and observed information matrix in intractable...Estimation of the score vector and observed information matrix in intractable...
Estimation of the score vector and observed information matrix in intractable...
 
On non-negative unbiased estimators
On non-negative unbiased estimatorsOn non-negative unbiased estimators
On non-negative unbiased estimators
 
Path storage in the particle filter
Path storage in the particle filterPath storage in the particle filter
Path storage in the particle filter
 
Density exploration methods
Density exploration methodsDensity exploration methods
Density exploration methods
 
PAWL - GPU meeting @ Warwick
PAWL - GPU meeting @ WarwickPAWL - GPU meeting @ Warwick
PAWL - GPU meeting @ Warwick
 
Presentation of SMC^2 at BISP7
Presentation of SMC^2 at BISP7Presentation of SMC^2 at BISP7
Presentation of SMC^2 at BISP7
 
Presentation MCB seminar 09032011
Presentation MCB seminar 09032011Presentation MCB seminar 09032011
Presentation MCB seminar 09032011
 

Recently uploaded

Python Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docxPython Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docxRamakrishna Reddy Bijjam
 
Micro-Scholarship, What it is, How can it help me.pdf
Micro-Scholarship, What it is, How can it help me.pdfMicro-Scholarship, What it is, How can it help me.pdf
Micro-Scholarship, What it is, How can it help me.pdfPoh-Sun Goh
 
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptx
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptxHMCS Max Bernays Pre-Deployment Brief (May 2024).pptx
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptxEsquimalt MFRC
 
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxDenish Jangid
 
Interdisciplinary_Insights_Data_Collection_Methods.pptx
Interdisciplinary_Insights_Data_Collection_Methods.pptxInterdisciplinary_Insights_Data_Collection_Methods.pptx
Interdisciplinary_Insights_Data_Collection_Methods.pptxPooja Bhuva
 
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...Nguyen Thanh Tu Collection
 
SKILL OF INTRODUCING THE LESSON MICRO SKILLS.pptx
SKILL OF INTRODUCING THE LESSON MICRO SKILLS.pptxSKILL OF INTRODUCING THE LESSON MICRO SKILLS.pptx
SKILL OF INTRODUCING THE LESSON MICRO SKILLS.pptxAmanpreet Kaur
 
The basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxThe basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxheathfieldcps1
 
Understanding Accommodations and Modifications
Understanding  Accommodations and ModificationsUnderstanding  Accommodations and Modifications
Understanding Accommodations and ModificationsMJDuyan
 
REMIFENTANIL: An Ultra short acting opioid.pptx
REMIFENTANIL: An Ultra short acting opioid.pptxREMIFENTANIL: An Ultra short acting opioid.pptx
REMIFENTANIL: An Ultra short acting opioid.pptxDr. Ravikiran H M Gowda
 
How to Create and Manage Wizard in Odoo 17
How to Create and Manage Wizard in Odoo 17How to Create and Manage Wizard in Odoo 17
How to Create and Manage Wizard in Odoo 17Celine George
 
Sociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning ExhibitSociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning Exhibitjbellavia9
 
Fostering Friendships - Enhancing Social Bonds in the Classroom
Fostering Friendships - Enhancing Social Bonds  in the ClassroomFostering Friendships - Enhancing Social Bonds  in the Classroom
Fostering Friendships - Enhancing Social Bonds in the ClassroomPooky Knightsmith
 
Wellbeing inclusion and digital dystopias.pptx
Wellbeing inclusion and digital dystopias.pptxWellbeing inclusion and digital dystopias.pptx
Wellbeing inclusion and digital dystopias.pptxJisc
 
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptx
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptxHMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptx
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptxmarlenawright1
 
How to Manage Global Discount in Odoo 17 POS
How to Manage Global Discount in Odoo 17 POSHow to Manage Global Discount in Odoo 17 POS
How to Manage Global Discount in Odoo 17 POSCeline George
 
This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.christianmathematics
 
Application orientated numerical on hev.ppt
Application orientated numerical on hev.pptApplication orientated numerical on hev.ppt
Application orientated numerical on hev.pptRamjanShidvankar
 
How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17Celine George
 
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptxMaritesTamaniVerdade
 

Recently uploaded (20)

Python Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docxPython Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docx
 
Micro-Scholarship, What it is, How can it help me.pdf
Micro-Scholarship, What it is, How can it help me.pdfMicro-Scholarship, What it is, How can it help me.pdf
Micro-Scholarship, What it is, How can it help me.pdf
 
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptx
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptxHMCS Max Bernays Pre-Deployment Brief (May 2024).pptx
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptx
 
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
 
Interdisciplinary_Insights_Data_Collection_Methods.pptx
Interdisciplinary_Insights_Data_Collection_Methods.pptxInterdisciplinary_Insights_Data_Collection_Methods.pptx
Interdisciplinary_Insights_Data_Collection_Methods.pptx
 
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
 
SKILL OF INTRODUCING THE LESSON MICRO SKILLS.pptx
SKILL OF INTRODUCING THE LESSON MICRO SKILLS.pptxSKILL OF INTRODUCING THE LESSON MICRO SKILLS.pptx
SKILL OF INTRODUCING THE LESSON MICRO SKILLS.pptx
 
The basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxThe basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptx
 
Understanding Accommodations and Modifications
Understanding  Accommodations and ModificationsUnderstanding  Accommodations and Modifications
Understanding Accommodations and Modifications
 
REMIFENTANIL: An Ultra short acting opioid.pptx
REMIFENTANIL: An Ultra short acting opioid.pptxREMIFENTANIL: An Ultra short acting opioid.pptx
REMIFENTANIL: An Ultra short acting opioid.pptx
 
How to Create and Manage Wizard in Odoo 17
How to Create and Manage Wizard in Odoo 17How to Create and Manage Wizard in Odoo 17
How to Create and Manage Wizard in Odoo 17
 
Sociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning ExhibitSociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning Exhibit
 
Fostering Friendships - Enhancing Social Bonds in the Classroom
Fostering Friendships - Enhancing Social Bonds  in the ClassroomFostering Friendships - Enhancing Social Bonds  in the Classroom
Fostering Friendships - Enhancing Social Bonds in the Classroom
 
Wellbeing inclusion and digital dystopias.pptx
Wellbeing inclusion and digital dystopias.pptxWellbeing inclusion and digital dystopias.pptx
Wellbeing inclusion and digital dystopias.pptx
 
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptx
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptxHMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptx
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptx
 
How to Manage Global Discount in Odoo 17 POS
How to Manage Global Discount in Odoo 17 POSHow to Manage Global Discount in Odoo 17 POS
How to Manage Global Discount in Odoo 17 POS
 
This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.
 
Application orientated numerical on hev.ppt
Application orientated numerical on hev.pptApplication orientated numerical on hev.ppt
Application orientated numerical on hev.ppt
 
How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17
 
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
 

Current limitations of sequential inference in general hidden Markov models

  • 1. Current limitations of sequential inference in general hidden Markov models Pierre Jacob Department of Statistics, University of Oxford March 5th Pierre Jacob Sequential inference in HMM 1/ 60
  • 2. Outline 1 Setting: online inference in time series Hidden Markov Models Implicit models Exact / sequential / online methods 2 Plug and play methods Approximate Bayesian Computation Particle Filters 3 SMC2 for sequential inference A sequential method for HMM Not online 4 Numerical experiments 5 Discussion Pierre Jacob Sequential inference in HMM 2/ 60
  • 3. Outline 1 Setting: online inference in time series Hidden Markov Models Implicit models Exact / sequential / online methods 2 Plug and play methods Approximate Bayesian Computation Particle Filters 3 SMC2 for sequential inference A sequential method for HMM Not online 4 Numerical experiments 5 Discussion Pierre Jacob Sequential inference in HMM 2/ 60
  • 4. Outline 1 Setting: online inference in time series Hidden Markov Models Implicit models Exact / sequential / online methods 2 Plug and play methods Approximate Bayesian Computation Particle Filters 3 SMC2 for sequential inference A sequential method for HMM Not online 4 Numerical experiments 5 Discussion Pierre Jacob Sequential inference in HMM 2/ 60
  • 5. Hidden Markov Models y2y1 yT X2X0 X1 XT y0 Figure : Graph representation of a general HMM. (Xt): initial µθ, transition fθ. (Yt) given (Xt): measurement gθ. Prior on the parameter θ ∈ Θ. Pierre Jacob Sequential inference in HMM 3/ 60
  • 6. Phytoplankton–Zooplankton qqq q qq q qq qqq qq q qq q q q q q q q qq q q q q q q q q q qq q q q q q q q q qq qq q q q q q q q qq qq q q q q q q qqq q q q q q q qq q qq q q q q q q q qq qq q q qq q q q q q q q q q q qq q qq q q q qq q qq q q q q q qqq q q q q q q q q q q q q q q q q q q q q q q qq q qq q q q q q q q q q q q q q q qq qq q q qq q q q q q q q q q q q q q q q qqqqqqq q q q q q q q qq q q q q q q qq q q q q qqqq q q qqq q q q q qqq q qq q q q q q q q qq q qq q qqq q q q qqqqqq q q qq qq q qq q q q q q q q q q q q q qqqq q q q q q q q q q q q q q q q q qq q q q q q q q q q q q qq q q q q q qqq q q q q q q q q q q q q qq q q q q q q q qqq qqq qq q q q q q qq q q q q 0 10 20 30 40 0 100 200 300 time observations Figure : A time series of 365 observations generated according to a phytoplankton–zooplankton model. Pierre Jacob Sequential inference in HMM 4/ 60
  • 7. General questions For each model, how much do the data inform the parameters? For each model, how much do the data inform the latent Markov process? How much do the data inform the choice of a model? How to predict future observations? Pierre Jacob Sequential inference in HMM 5/ 60
  • 8. Questions translated into integrals Filtering question: ∫ X φ(xt) p(dxt | y0:t, θ) = 1 Zt(θ) ∫ Xt+1 φ(xt)p(dx0:t | θ) t∏ s=0 p(ys | xs, θ). Prediction question: ∫ Y φ(yt+k) p(dyt+k | y0:t, θ) = ∫ Y ∫ X φ(yt+k) p(dxt+k | y0:t, θ) p(dyt+k | xt+k, θ). Pierre Jacob Sequential inference in HMM 6/ 60
  • 9. Questions translated into integrals Parameter estimation: p(y0:t | θ) = ∫ Xt+1 p(dx0 | θ) t∏ s=1 p(dxs | xs−1, θ) t∏ s=0 p(ys | xs, θ), and eventually ∫ Θ φ(θ)πθ,t(dθ) = 1 Zt ∫ Θ φ(θ)p(y0:t | θ)πθ(dθ). If we acknowledge parameter uncertainty, then more questions: ∫ X φ(xt) p(dxt | y0:t) = ∫ Θ ∫ X φ(xt)p(dxt | y0:t, θ)πθ,t(dθ). Pierre Jacob Sequential inference in HMM 7/ 60
  • 10. Questions translated into integrals Model choice: P ( M = M(m) | y0:t ) = P ( M = M(m) ) Z (m) t ∑M m′=1 P ( M = M(m′) ) Z (m′) t . If we acknowledge model uncertainty, then more questions: ∫ Y φ(yt+k) P(dyt+k | y0:t) = M∑ m=1 ∫ Θ(m) ∫ Y φ(yt+k) p(dyt+k | y0:t, θ, M(m) ) × πθ(m),t(dθ)P ( M = M(m) | y0:t ) . Pierre Jacob Sequential inference in HMM 8/ 60
  • 11. Outline 1 Setting: online inference in time series Hidden Markov Models Implicit models Exact / sequential / online methods 2 Plug and play methods Approximate Bayesian Computation Particle Filters 3 SMC2 for sequential inference A sequential method for HMM Not online 4 Numerical experiments 5 Discussion Pierre Jacob Sequential inference in HMM 8/ 60
  • 12. Phytoplankton–Zooplankton model Hidden process (xt) = (αt, pt, zt). At each (integer) time, αt ∼ N(µα, σ2 α). Given αt, dpt dt = αpt − cptzt, dzt dt = ecptzt − mlzt − mqz2 t . Observations: log yt ∼ N(log pt, σ2 y). Set c = 0.25 and e = 0.3, and (log p0, log z0) ∼ N(log 2, 0.2). Unknown parameters: θ = (µα, σα, σy, ml, mq). Pierre Jacob Sequential inference in HMM 9/ 60
  • 13. Implicit models Even simple, standard scientific models are such that the implied probability distribution p(dx0:t | θ) admits a density function that cannot be computed pointwise. To cover as many models as possible, we can only assume that the hidden process can be simulated. This covers cases where xt = ψ(xt−1, k, v1:k), for some integer k, vector v1:k ∈ Rk, and deterministic function ψ. Calls for “plug and play” methods. Time series analysis via mechanistic models, Bret´o, He, Ionides and King, 2009. Pierre Jacob Sequential inference in HMM 10/ 60
  • 14. Outline 1 Setting: online inference in time series Hidden Markov Models Implicit models Exact / sequential / online methods 2 Plug and play methods Approximate Bayesian Computation Particle Filters 3 SMC2 for sequential inference A sequential method for HMM Not online 4 Numerical experiments 5 Discussion Pierre Jacob Sequential inference in HMM 10/ 60
  • 15. Exact methods Consider the problem of estimating some quantity It. Consider an estimator IN t where N is a tuning parameter. Hopefully N is such that IN t some sense −−−−−−→ N→∞ It. For instance E[(IN t − It)2] goes to zero when N → ∞. Variational methods / Ensemble Kalman Filters are not exact. Consider the estimator that always returns 29.5. . . Pierre Jacob Sequential inference in HMM 11/ 60
  • 16. Sequential methods Consider the problem of estimating some quantity It, for all t ≥ 0, e.g. upon the arrival of new data. Assume the quantities It for all t ≥ 0 are related one to the other. A sequential method “updates” the estimate IN t into IN t+1. MCMC methods are not sequential: they have to be re-run from scratch whenever a new observation arrives. Therefore, sequential methods are not to be confused with iterative methods. Pierre Jacob Sequential inference in HMM 12/ 60
  • 17. Online methods Consider the problem of estimating some quantity It, for all t ≥ 0, e.g. upon the arrival of new data. A method is online if it provides estimates IN t of It for all t ≥ 0, such that. . . . . . the computational cost of obtaining each IN t given IN t−1 is independent of t, . . . the precision of the estimate does not explode over time: r(IN t ) = ( E [ (IN t − It)2 ])1/2 |It| can be uniformly bounded over t. Consider the estimator that always returns 29.5. . . Pierre Jacob Sequential inference in HMM 13/ 60
  • 18. Outline 1 Setting: online inference in time series Hidden Markov Models Implicit models Exact / sequential / online methods 2 Plug and play methods Approximate Bayesian Computation Particle Filters 3 SMC2 for sequential inference A sequential method for HMM Not online 4 Numerical experiments 5 Discussion Pierre Jacob Sequential inference in HMM 13/ 60
  • 19. Outline 1 Setting: online inference in time series Hidden Markov Models Implicit models Exact / sequential / online methods 2 Plug and play methods Approximate Bayesian Computation Particle Filters 3 SMC2 for sequential inference A sequential method for HMM Not online 4 Numerical experiments 5 Discussion Pierre Jacob Sequential inference in HMM 13/ 60
  • 20. Approximate Bayesian Computation 1 Draw θ from the prior distribution πθ. 2 Draw x0:t, a realisation of the hidden Markov chain given θ. 3 Draw ˆy0:t, a realisation of the observations given x0:t and θ. 4 If D(ˆy0:t, y0:t) ≤ ε, keep (θ, x0:t). Pierre Jacob Sequential inference in HMM 14/ 60
  • 21. Approximate Bayesian Computation Plug and play: only requires simulations from the model. Exact if D is a distance and ε is zero. In practice, D is typically not a distance. The tolerance ε is often chosen implicitely. E.g., ε is chosen so that 1% of the generated samples is kept. Better than the 29.5 estimator? Pierre Jacob Sequential inference in HMM 15/ 60
  • 22. Outline 1 Setting: online inference in time series Hidden Markov Models Implicit models Exact / sequential / online methods 2 Plug and play methods Approximate Bayesian Computation Particle Filters 3 SMC2 for sequential inference A sequential method for HMM Not online 4 Numerical experiments 5 Discussion Pierre Jacob Sequential inference in HMM 15/ 60
  • 23. Sequential Monte Carlo for filtering Objects of interest: filtering distributions: p(xt|y0:t, θ), for all t, for a given θ, likelihood: p(y0:t | θ) = ∫ p(y0:t | x0:t, θ)p(x0:t | θ)dx0:t. Particle filters: propagate recursively Nx particles approximating p(xt | y0:t, θ) for all t, give likelihood estimates ˆpNx (y0:t | θ) of p(y0:t | θ) for all t. Pierre Jacob Sequential inference in HMM 16/ 60
  • 24. Plug and play requirement Particle filters can be implemented if the hidden process can be simulated forward, given any θ: x0 ∼ µθ and xt ∼ fθ(· | xt−1), the measurement density gθ(y | x) can be evaluated point-wise, for any x, y, θ. A bit less “plug and play” than ABC. Pierre Jacob Sequential inference in HMM 17/ 60
  • 25. Sequential Monte Carlo for filtering y2 X2X0 y1 X1 ... ... yT XT θ Pierre Jacob Sequential inference in HMM 18/ 60
  • 26. Sequential Monte Carlo for filtering y2 X2X0 y1 X1 ... ... yT XT θ Pierre Jacob Sequential inference in HMM 18/ 60
  • 27. Sequential Monte Carlo for filtering y2 X2X0 y1 X1 ... ... yT XT θ Pierre Jacob Sequential inference in HMM 18/ 60
  • 28. Sequential Monte Carlo for filtering y2 X2X0 y1 X1 ... ... yT XT θ Pierre Jacob Sequential inference in HMM 18/ 60
  • 29. Sequential Monte Carlo for filtering y2 X2X0 y1 X1 ... ... yT XT θ Pierre Jacob Sequential inference in HMM 18/ 60
  • 30. Sequential Monte Carlo for filtering y2 X2X0 y1 X1 ... ... yT XT θ Pierre Jacob Sequential inference in HMM 18/ 60
  • 31. Sequential Monte Carlo for filtering y2 X2X0 y1 X1 ... ... yT XT θ Pierre Jacob Sequential inference in HMM 18/ 60
  • 32. Sequential Monte Carlo for filtering Consider I(φt) = ∫ φt(xt)p(xt | y0:t)dxt. Lp-bound: E [ IN (φt) − I (φt) p]1/p ≤ c(p) ||φt||∞√ N . Central limit theorem: √ N ( IN (φt) − I (φt) ) D −−−−→ N→∞ N ( 0, σ2 t ) . where σ2 t < σ2 max for all t. Particle filters are fully online, plug and play, and exact. . . for filtering. Pierre Jacob Sequential inference in HMM 19/ 60
  • 33. Sequential Monte Carlo for filtering Properties of the likelihood estimator The likelihood estimator is unbiased, E [ ˆpNx (y0:t | θ) ] = E [ t∏ s=0 1 Nx Nx∑ k=1 wk s ] = p(y0:t | θ) and the relative variance is bounded linearly in time, V [ ˆpNx (y0:t | θ) p(y0:t | θ) ] ≤ C t Nx for some constant C (under some conditions!). Particle filters are not online for likelihood estimation. Pierre Jacob Sequential inference in HMM 20/ 60
  • 34. Outline 1 Setting: online inference in time series Hidden Markov Models Implicit models Exact / sequential / online methods 2 Plug and play methods Approximate Bayesian Computation Particle Filters 3 SMC2 for sequential inference A sequential method for HMM Not online 4 Numerical experiments 5 Discussion Pierre Jacob Sequential inference in HMM 20/ 60
  • 35. Outline 1 Setting: online inference in time series Hidden Markov Models Implicit models Exact / sequential / online methods 2 Plug and play methods Approximate Bayesian Computation Particle Filters 3 SMC2 for sequential inference A sequential method for HMM Not online 4 Numerical experiments 5 Discussion Pierre Jacob Sequential inference in HMM 20/ 60
  • 36. SMC samplers The goal is now to approximate sequentially p(θ), p(θ|y0), . . . , p(θ|y0:T ). Sequential Monte Carlo samplers. Jarzynski 1997, Neal 2001, Chopin 2002, Del Moral, Doucet & Jasra 2006. . . Propagates a number Nθ of θ-particles approximating p(θ | y0:t) for all t. Evidence estimates ˆpNθ (y0:t) ≈ p(y0:t) for all t. Pierre Jacob Sequential inference in HMM 21/ 60
  • 37. Targets p(θ|y1) p(θ|y1, y2) p(θ|y1, y2, y3) p(θ) Θ density Figure : Sequence of target distributions. Pierre Jacob Sequential inference in HMM 22/ 60
  • 38. First step p(θ|y1) p(θ) q qqq qq q qqq qq qq qq qq qqqqq qq qqqqqq qqq q qqq qq qqq q qq qq qq Θ density Figure : First distribution in black, next distribution in red. Pierre Jacob Sequential inference in HMM 23/ 60
  • 39. Importance Sampling p(θ|y1) p(θ) q qqq qq q q qq qq qq qq qq qqqqq qq qq qqqq q qq q qqq qq qq q q q q qq qq Θ density Figure : Samples θ weighted by p(θ | y1)/p(θ) ∝ p(y1 | θ). Pierre Jacob Sequential inference in HMM 24/ 60
  • 40. Resampling and move p(θ|y1) p(θ) q qqqq qq q q qqq qq qqq qqq qqqq qqq qq qq q qqq q qqq q qq qqq qq q qq Θ density Figure : Samples θ after resampling and MCMC move. Pierre Jacob Sequential inference in HMM 25/ 60
  • 41. SMC samplers 1: Sample from the prior θ(m) ∼ p(·) for m ∈ [1, Nθ]. 2: Set ω(m) ← 1/Nθ. 3: for t = 0 to T do 4: Reweight ω(m) ← ω(m) × p(yt|y0:t−1, θ(m)) for m ∈ [1, Nθ]. 5: if some degeneracy criterion is met then 6: Resample the particles, reset the weights ω(m) ← 1/Nθ. 7: MCMC move for each particle, targeting p(θ | y0:t). 8: end if 9: end for Pierre Jacob Sequential inference in HMM 26/ 60
  • 42. Proposed method SMC samplers require pointwise evaluations of p(yt | y0:t−1, θ), MCMC moves targeting each intermediate distribution. For Hidden Markov models, the likelihood is intractable. Particle filters provide likelihood approximations for a given θ. Hence, we equip each θ-particle with its own particle filter. Pierre Jacob Sequential inference in HMM 27/ 60
  • 43. One step of SMC2 For each θ-particle θ (m) t , perform one step of its particle filter: to obtain pNx (yt+1 | y0:t, θ (m) t ) and reweight: ω (m) t+1 = ω (m) t × pNx (yt+1|y0:t, θ (m) t ). Pierre Jacob Sequential inference in HMM 28/ 60
  • 44. One step of SMC2 Whenever Effective sample size = (∑Nθ m=1 ω (m) t+1 )2 ∑Nθ m=1 ( ω (m) t+1 )2 < threshold × Nθ (Kong, Liu & Wong, 1994) resample the θ-particles and move them by PMCMC, i.e. Propose θ⋆ ∼ q(·|θ (m) t ) and run PF(Nx, θ⋆) for t + 1 steps. Accept or not based on ˆpNx (y0:t+1 | θ⋆). Pierre Jacob Sequential inference in HMM 29/ 60
  • 45. SMC2 y2 X2X0 y1 X1 ... ... yT XT Θ Pierre Jacob Sequential inference in HMM 30/ 60
  • 46. SMC2 y2 X2X0 y1 X1 ... ... yT XT Θ Pierre Jacob Sequential inference in HMM 30/ 60
  • 47. SMC2 y2 X2X0 y1 X1 ... ... yT XT Θ Pierre Jacob Sequential inference in HMM 30/ 60
  • 48. SMC2 y2 X2X0 y1 X1 ... ... yT XT Θ Pierre Jacob Sequential inference in HMM 30/ 60
  • 49. SMC2 y2 X2X0 y1 X1 ... ... yT XT Θ Pierre Jacob Sequential inference in HMM 30/ 60
  • 50. SMC2 y2 X2X0 y1 X1 ... ... yT XT Θ Pierre Jacob Sequential inference in HMM 30/ 60
  • 51. SMC2 y2 X2X0 y1 X1 ... ... yT XT Θ Pierre Jacob Sequential inference in HMM 30/ 60
  • 52. Exact approximation SMC2 is a standard SMC sampler on an extended space, with target distribution: πt(θ, x1:Nx 0:t , a1:Nx 0:t−1) = p(θ|y0:t) × 1 Nt+1 x Nx∑ n=1 p(xn 0:t|θ, y0:t)    Nx∏ i=1 i̸=hn t (1) q0,θ(xi 0)    ×    t∏ s=1 Nx∏ i=1 i̸=hn t (s) W ai s−1 s−1,θqs,θ(xi s|x ai s−1 s−1 )    . Related to pseudo-marginal and PMCMC methods. Pierre Jacob Sequential inference in HMM 31/ 60
  • 53. Exact approximation From the extended target representation, we obtain θ from p(θ | y1:t), xn 0:t from p(x0:t | θ, y1:t), thus allowing joint state and parameter inference. Evidence estimates are obtained by computing the average of the θ-weights ω (m) t . The “extended target” argument yields consistency for any fixed Nx, when Nθ goes to infinity. Exact method, sequential by design, but not online. Pierre Jacob Sequential inference in HMM 32/ 60
  • 54. Outline 1 Setting: online inference in time series Hidden Markov Models Implicit models Exact / sequential / online methods 2 Plug and play methods Approximate Bayesian Computation Particle Filters 3 SMC2 for sequential inference A sequential method for HMM Not online 4 Numerical experiments 5 Discussion Pierre Jacob Sequential inference in HMM 32/ 60
  • 55. Scalability in T Cost if MCMC move at each time step A single move step at time t costs O (tNxNθ). If move at every step, the total cost becomes O ( t2NxNθ ) . If Nx = Ct, the total cost becomes O ( t3Nθ ) . With adaptive resampling, the cost is only O ( t2Nθ ) . Why? Pierre Jacob Sequential inference in HMM 33/ 60
  • 56. Scalability in T 512 1024 0 100 200 300 time ESS Figure : Effective Sample Size against time, for the PZ model. Pierre Jacob Sequential inference in HMM 34/ 60
  • 57. Scalability in T 0e+00 2e+06 4e+06 6e+06 0 100 200 300 time Cumulativecostperparticle Figure : Cumulative cost per θ-particle during one run of SMC2 . The cost is measured by the number of calls to the transition sampling function. Nx is fixed. Pierre Jacob Sequential inference in HMM 35/ 60
  • 58. Scalability in T iterations ESS 0 200 400 600 800 1000 1000 2000 3000 4000 5000 Figure : Effective Sample Size against time, for a linear Gaussian model. Pierre Jacob Sequential inference in HMM 36/ 60
  • 59. Scalability in T iteration computingtime(squarerootscale) 2500 10000 22500 40000 1000 2000 3000 4000 5000 Figure : √ computing time against time. Nx is increased to achieve a fixed acceptance rate in the PMCMC steps. Pierre Jacob Sequential inference in HMM 37/ 60
  • 60. Scalability in T Under Bernstein-Von Mises, the posterior becomes Gaussian. p(θ|y1:ct) p(θ|y1:t) Θ density E[ESS] from p(θ | y1:t) to p(θ | y1:ct) becomes independent of t. Hence resampling times occur geometrically: τk ≈ ck with c > 1. Pierre Jacob Sequential inference in HMM 38/ 60
  • 61. Scalability in T More formally. . . The expected ESS at time t + k, if the last resampling time was t, is related to Vp(θ|y1:t) [ p(θ | y1:t+k) p(θ | y1:t) ] = Vp(θ|y1:t) [ L(θ; y1:t+k) L(θ; y1:t) ∫ Θ L(θ; y1:t)p(dθ) ∫ Θ L(θ; y1:t+k)p(dθ) ] . Then Laplace expansions of L yield similar results as before, under regularity conditions. Pierre Jacob Sequential inference in HMM 39/ 60
  • 62. Scalability in T Open problem Online exact Bayesian inference in linear time? On one hand dim(X0:t) = dim(X) × (t + 1) which grows . . . . . . but θ itself is of fixed dimension and p(θ | y1:t) ≈ N(θ⋆, v⋆/t)! Our specific problem Move steps at time t imply running a particle filter from time zero. Attempts have been made at re-starting from t − ∆ but then, bias. Pierre Jacob Sequential inference in HMM 40/ 60
  • 63. Outline 1 Setting: online inference in time series Hidden Markov Models Implicit models Exact / sequential / online methods 2 Plug and play methods Approximate Bayesian Computation Particle Filters 3 SMC2 for sequential inference A sequential method for HMM Not online 4 Numerical experiments 5 Discussion Pierre Jacob Sequential inference in HMM 40/ 60
  • 64. Phytoplankton–Zooplankton: model Hidden process (xt) = (αt, pt, zt). At each (integer) time, αt ∼ N(µα, σ2 α). Given αt, dpt dt = αpt − cptzt, dzt dt = ecptzt − mlzt − mqz2 t . Observations: log yt ∼ N(log pt, σ2 y). Set c = 0.25 and e = 0.3, and (log p0, log z0) ∼ N(log 2, 0.2). Unknown parameters: θ = (µα, σα, σy, ml, mq). Pierre Jacob Sequential inference in HMM 41/ 60
  • 65. Phytoplankton–Zooplankton: observations qqq q qq q qq qqq qq q qq q q q q q q q qq q q q q q q q q q qq q q q q q q q q qq qq q q q q q q q qq qq q q q q q q qqq q q q q q q qq q qq q q q q q q q qq qq q q qq q q q q q q q q q q qq q qq q q q qq q qq q q q q q qqq q q q q q q q q q q q q q q q q q q q q q q qq q qq q q q q q q q q q q q q q q qq qq q q qq q q q q q q q q q q q q q q q qqqqqqq q q q q q q q qq q q q q q q qq q q q q qqqq q q qqq q q q q qqq q qq q q q q q q q qq q qq q qqq q q q qqqqqq q q qq qq q qq q q q q q q q q q q q q qqqq q q q q q q q q q q q q q q q q qq q q q q q q q q q q q qq q q q q q qqq q q q q q q q q q q q q qq q q q q q q q qqq qqq qq q q q q q qq q q q q 0 10 20 30 40 0 100 200 300 time observations Figure : A time series of 365 observations generated according to a phytoplankton–zooplankton model. Pierre Jacob Sequential inference in HMM 42/ 60
  • 66. Phytoplankton–Zooplankton: parameters qqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqq 0.60 0.65 0.70 0.75 0.80 0.45 0.50 0.55 0.60 σα µα qqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqq 0.10 0.15 0.20 0.25 0.30 0.45 0.50 0.55 0.60 σα σy qqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqq 0.08 0.09 0.10 0.11 0.05 0.10 0.15 0.20 ml mq Figure : Posterior distribution of the parameters. Pierre Jacob Sequential inference in HMM 43/ 60
  • 67. Phytoplankton–Zooplankton: parameters 0.00 0.25 0.50 0.75 1.00 0 10 20 30 40 50 time µα Figure : Evolution, over the first 50 time steps, of the posterior distribution of µα. Pierre Jacob Sequential inference in HMM 44/ 60
  • 68. Phytoplankton–Zooplankton: parameters 0.00 0.25 0.50 0.75 1.00 0 10 20 30 40 50 time σα Figure : Evolution, over the first 50 time steps, of the posterior distribution of σα. Pierre Jacob Sequential inference in HMM 45/ 60
  • 69. Phytoplankton–Zooplankton: parameters 0.00 0.25 0.50 0.75 1.00 0 10 20 30 40 50 time σy Figure : Evolution, over the first 50 time steps, of the posterior distribution of σy. Pierre Jacob Sequential inference in HMM 46/ 60
  • 70. Phytoplankton–Zooplankton: parameters 0.00 0.25 0.50 0.75 1.00 0 10 20 30 40 50 time ml Figure : Evolution, over the first 50 time steps, of the posterior distribution of ml. Pierre Jacob Sequential inference in HMM 47/ 60
  • 71. Phytoplankton–Zooplankton: parameters 0.00 0.25 0.50 0.75 1.00 0 10 20 30 40 50 time mq Figure : Evolution, over the first 50 time steps, of the posterior distribution of mq. Pierre Jacob Sequential inference in HMM 48/ 60
  • 72. Phytoplankton–Zooplankton: prediction q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q 0 10 20 30 0 10 20 30 40 50 time observations Figure : One step predictions under parameter uncertainty. Pierre Jacob Sequential inference in HMM 49/ 60
  • 74. Phytoplankton–Zooplankton: another model dpt dt = αpt − cptzt dzt dt = ecptzt − mlzt or dpt dt = αpt − cptzt dzt dt = ecptzt − mlzt−mqz2 t ? Pierre Jacob Sequential inference in HMM 51/ 60
  • 75. Phytoplankton–Zooplankton: model choice 1 100 0 25 50 75 100 time Bayesfactor Figure : Bayes Factor against time. Pierre Jacob Sequential inference in HMM 52/ 60
  • 76. Phytoplankton–Zooplankton: model choice 1e+00 1e+02 1e+06 1e+12 1e+18 0 100 200 300 time Bayesfactor Figure : Bayes Factor against time. Pierre Jacob Sequential inference in HMM 53/ 60
  • 77. Outline 1 Setting: online inference in time series Hidden Markov Models Implicit models Exact / sequential / online methods 2 Plug and play methods Approximate Bayesian Computation Particle Filters 3 SMC2 for sequential inference A sequential method for HMM Not online 4 Numerical experiments 5 Discussion Pierre Jacob Sequential inference in HMM 53/ 60
  • 78. Forgetting mechanism for hidden states Forgetting property of a uniformly ergodic Markov chain: ||pν t − pµ t ||TV ≤ Cρt where ν, µ are two initial distributions pν t is the distribution of Xt after t steps, ρ < 1, C > 0. Similarly, the filtering distribution πt(dxt) = p(dxt | y0:t) forgets its initial condition geometrically fast. Introduce the operator Φt, taking a measure, applying a Markov kernel to it, and then a Bayes update using yt. Under conditions on the data generating process and the model, ||Φ0:t(µ) − Φ0:t(ν)||TV ≤ Cρt . Pierre Jacob Sequential inference in HMM 54/ 60
  • 79. Forgetting mechanism for parameters Forgetting mechanism for Bayesian posterior distribution: ||pν t − pµ t ||TV ≤ 1 √ t C. Huge literature on prior robustness. Posterior forgetting goes much slower than Markov chain forgetting. An error in the approximation of p(θ | y1:t) damages the subsequent approximations of p(θ | y1:t+k), for many k’s. SMC samplers are stable because of the added MCMC steps, which costs increase with t. Pierre Jacob Sequential inference in HMM 55/ 60
  • 80. Other challenges Dimensionality: the other big open problem. Particle filter’s errors grow exponentially fast with dim(X). Can local particle filters beat the curse of dimensionality? Rebeschini, van Handel, 2013. Carefully analyzed biased approximations. Assumption of a spatial forgetting effect from the model. Pierre Jacob Sequential inference in HMM 56/ 60
  • 81. Other challenges Particle filters provide useful estimates. . . . . . but no estimates of their associated variance. Can we estimate the variance without having to run the algorithm many times? Pierre Jacob Sequential inference in HMM 57/ 60
  • 82. Other challenges Particle methods are more and more commonly used outside the setting of HMMs. For instance, in the setting of long memory processes: probabilistic programming, Bayesian non-parametric applications. Are particle methods useful for models that do not satisfy forgetting properties? Stability of Feynman-Kac formulae with path-dependent potentials, Chopin, Del Moral, Rubenthaler, 2009. Pierre Jacob Sequential inference in HMM 58/ 60
  • 83. Discussion SMC2 allows sequential exact approximation in HMMs, but not online. Properties of posterior distributions could help achieving exact online inference, or prove that it is, in fact, impossible. Do we want to sample from the posterior as t → ∞? Importance of plug and play inference for time series. Implementation in LibBi, with GPU support. Pierre Jacob Sequential inference in HMM 59/ 60
  • 84. Links Particle Markov chain Monte Carlo, Andrieu, Doucet, Holenstein, 2010 (JRSS B) Sequential Monte Carlo samplers: error bounds and insensitivity to initial conditions, Whiteley, 2011 (Stoch. Analysis and Appl.). SMC2: an algorithm for sequential analysis of HMM, Chopin, Jacob, Papaspiliopoulos, 2013 (JRSS B) www.libbi.org Pierre Jacob Sequential inference in HMM 60/ 60