Presentation of SMC^2 at BISP7

This is a short presentation for a 15-minute talk at Bayesian Inference for Stochastic Processes 7 (BISP7), on the SMC^2 algorithm.
http://arxiv.org/abs/1101.1528

Transcript

  • 1. SMC²: A sequential Monte Carlo algorithm with particle Markov chain Monte Carlo updates.
    N. Chopin (ENSAE-CREST), P.E. Jacob (CREST & Université Paris-Dauphine, funded by AXA Research), O. Papaspiliopoulos (Universitat Pompeu Fabra).
    BISP7 – September 2011.
  • 2. State Space Models.
    A system of equations.
    Hidden states: p(x_1 | θ) = μ_θ(x_1), and for t = 1, ..., T: p(x_{t+1} | x_{1:t}, θ) = f_θ(x_{t+1} | x_t).
    Observations: p(y_t | y_{1:t−1}, x_{1:t}, θ) = g_θ(y_t | x_t).
    Parameter: θ ∈ Θ, with prior p(θ).
    [A minimal simulation sketch for such a model appears after the transcript.]
  • 3. Sequential Monte Carlo for filtering.
    Suppose we are interested in p_θ(x_T | y_{1:T}), for a given θ.
    General idea: sample recursively from p_θ(x_t | y_{1:t}) to p_θ(x_{t+1} | y_{1:t+1}).
    After the SMC run, we can approximate the likelihood
        Z_T(θ) = p(y_{1:T} | θ) = p(y_1 | θ) ∏_{t=2}^{T} p(y_t | y_{1:t−1}, θ)
    with an unbiased estimate Ẑ_T^{N_x}(θ).
    [A bootstrap particle filter sketch appears after the transcript.]
  • 4. Sequential Monte Carlo Samplers.
    The same kind of method, but used to perform Bayesian inference on p(θ | y_{1:T}).
    General idea: sample recursively from p(θ | y_{1:t}) to p(θ | y_{1:t+1}), with MCMC moves to diversify the particles.
    Requires the ability to compute p(y_t | y_{1:t−1}, θ) point-wise.
  • 5. Idealized Metropolis–Hastings for SSM.
    Motivation: Bayesian parameter inference in state space models, i.e. targeting p(θ | y_{1:T}).
    If only we could compute p(θ | y_{1:T}) ∝ p(θ) p(y_{1:T} | θ), we could run an MH algorithm.
  • 6. Valid Metropolis–Hastings for SSM.
    Plug in estimates: we have Ẑ_T^{N_x}(θ) ≈ p(y_{1:T} | θ) by running an SMC filter, and we can try to run an MH algorithm using the estimate instead of the exact likelihood.
    Particle MCMC: this is called Particle Marginal Metropolis–Hastings, by Andrieu, Doucet and Holenstein.
    [A PMMH sketch appears after the transcript.]
  • 7. Our contribution...
    ...was to use the same method to get a valid SMC sampler for state space models.
    Foreseen benefits: to sample more efficiently from the posterior distribution p(θ | y_{1:T}), and to sample sequentially from p(θ | y_1), p(θ | y_1, y_2), ..., p(θ | y_{1:T}).
    And, it turns out, it allows even a bit more.
  • 8. Valid SMC sampler for SSM.
    Plug in estimates: similarly to PMCMC methods, we want to replace p(y_t | y_{1:t−1}, θ) with an unbiased estimate, and see what happens.
    SMC everywhere: we associate N_x x-particles with each of the N_θ θ-particles; these are used to get estimates of the incremental likelihoods for each θ-particle.
    [A sketch of this nested structure appears after the transcript.]
  • 9. Side benefits.
    Evidence: SMC² provides an estimate of the "evidence"
        p(y_{1:t}) = ∏_{s=1}^{t} p(y_s | y_{1:s−1}).
    Automatic tuning: θ-particles are moved with adaptive particle MCMC steps, and the number N_x of x-particles can be dynamically increased if need be.
  • 10. Numerical illustrations: Stochastic Volatility.
    [Figure: the observations, S&P 500 data from 03/01/2005 to 21/12/2007.]
  • 11. Numerical illustrations: Stochastic Volatility.
    Stochastic Volatility model. Observations ("log returns"):
        y_t = μ + β v_t + v_t^{1/2} ε_t,   ε_t ∼ N(0, 1).
    Hidden states: the "actual volatility" (v_t), a process that depends on another process, the "spot volatility" (z_t).
    All these processes are parameterized by θ = (μ, β, ξ, ω², λ).
  • 12. Numerical illustrations: Stochastic Volatility.
    [Figure: concentration of the posterior distribution of parameter μ, shown as posterior densities at T = 250, 500, 750 and 1000.]
  • 13. Numerical illustrations: Stochastic Volatility.
    Model comparison: for the same problem there could be various models that we want to compare. Here:
    – the "basic" previous model,
    – a similar model with more factors (= more hidden states),
    – a similar model with more factors and "leverage" (= a different likelihood function with more parameters).
  • 14. Numerical illustrations: Stochastic Volatility.
    Evidence compared to the one-factor model.
    [Figure: left, the squared observations; right, log-evidence of the multi-factor models (with and without leverage) relative to the basic model, over the iterations.]
  • 15. Conclusion.
    A powerful framework: the SMC² framework makes it possible to obtain various quantities of interest, especially for sequential analysis. It extends the PMCMC framework introduced by Andrieu, Doucet and Holenstein.
    A Python package is available: http://code.google.com/p/py-smc2/.
  • 16. Bibliography.
    SMC²: A sequential Monte Carlo algorithm with particle Markov chain Monte Carlo updates, N. Chopin, P.E. Jacob & O. Papaspiliopoulos, submitted, available on arXiv.
    Main references:
    – Particle Markov chain Monte Carlo methods, C. Andrieu, A. Doucet & R. Holenstein, JRSS B, 2010, 72(3):269–342.
    – The pseudo-marginal approach for efficient Monte Carlo computations, C. Andrieu & G.O. Roberts, Ann. Statist., 2009, 37:697–725.
    – Random weight particle filtering of continuous time processes, P. Fearnhead, O. Papaspiliopoulos, G.O. Roberts & A. Stuart, JRSS B, 2010, 72:497–513.
    – Feynman-Kac Formulae, P. Del Moral, Springer.
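
Illustrative Python sketches

The sketches below are minimal Python illustrations of the ideas on slides 2, 3, 6 and 8–9. They use a toy model and simplified algorithmic choices, not the models or the implementation from the talk; for the authors' code, see the py-smc2 package linked on slide 15.

Slide 2 defines a state space model abstractly through the densities μ_θ, f_θ and g_θ. As a concrete stand-in for the later sketches, here is a simulator for a toy linear-Gaussian model; the AR(1) dynamics, the Gaussian observation noise and the parameterisation θ = (ρ, σ_x, σ_y) are assumptions made purely for illustration.

    # A toy linear-Gaussian state space model, used as a concrete instance of
    # the generic (mu_theta, f_theta, g_theta) structure of slide 2.  The model
    # and the parameterisation theta = (rho, sigma_x, sigma_y) are assumptions
    # of this example only.
    import numpy as np

    def simulate_ssm(theta, T, seed=0):
        """Draw hidden states x_{1:T} and observations y_{1:T} from the toy model."""
        rho, sigma_x, sigma_y = theta
        rng = np.random.default_rng(seed)
        x = np.empty(T)
        y = np.empty(T)
        x[0] = rng.normal(0.0, sigma_x / np.sqrt(1.0 - rho**2))    # x_1 ~ mu_theta
        for t in range(T):
            if t > 0:
                x[t] = rho * x[t - 1] + sigma_x * rng.normal()     # x_t ~ f_theta(. | x_{t-1})
            y[t] = x[t] + sigma_y * rng.normal()                   # y_t ~ g_theta(. | x_t)
        return x, y

For example, x, y = simulate_ssm((0.9, 0.5, 1.0), T=500) generates 500 observations from this toy model.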
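
Slide 3 relies on the fact that an SMC filter delivers an unbiased estimate Ẑ_T^{N_x}(θ) of the likelihood Z_T(θ). Below is a basic bootstrap particle filter for the toy model above, with multinomial resampling at every step; this is one simple way to obtain such an estimate, not the specific filter or resampling scheme used in the paper. It reuses numpy and the parameterisation from the previous sketch.

    def pf_init(theta, Nx, rng):
        """Initial x-particles drawn from mu_theta (stationary law of the toy AR(1))."""
        rho, sigma_x, _ = theta
        return rng.normal(0.0, sigma_x / np.sqrt(1.0 - rho**2), size=Nx)

    def pf_step(x, y_t, theta, rng, propagate=True):
        """One bootstrap-filter step: propagate with f_theta (except at t = 1),
        weight by g_theta(y_t | x), resample, and return
        (resampled particles, log of the average weight)."""
        rho, sigma_x, sigma_y = theta
        if propagate:
            x = rho * x + sigma_x * rng.normal(size=len(x))
        logw = -0.5 * ((y_t - x) / sigma_y) ** 2 - np.log(sigma_y * np.sqrt(2.0 * np.pi))
        m = logw.max()
        w = np.exp(logw - m)
        log_inc = m + np.log(w.mean())                 # log( (1/Nx) sum_i w_t^i )
        idx = rng.choice(len(x), size=len(x), p=w / w.sum())
        return x[idx], log_inc

    def bootstrap_filter(y, theta, Nx, rng):
        """Run the filter on y_{1:T}; return (final x-particles, log-likelihood estimate)."""
        x, log_Z = pf_init(theta, Nx, rng), 0.0
        for t, y_t in enumerate(y):
            x, inc = pf_step(x, y_t, theta, rng, propagate=(t > 0))
            log_Z += inc
        return x, log_Z

The product over t of the average incremental weights is the unbiased estimate of Z_T(θ) mentioned on slide 3; the code accumulates it on the log scale for numerical stability.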
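
Slide 6 describes Particle Marginal Metropolis–Hastings: an MH chain on θ in which the exact likelihood is replaced by the particle-filter estimate. A minimal sketch follows, assuming a Gaussian random-walk proposal and a user-supplied log_prior function; both of these, as well as the toy model and the bootstrap_filter above, are assumptions of this example rather than the authors' tuning.

    def pmmh(y, theta0, log_prior, Nx=100, n_iter=1000, rw_step=0.1, seed=1):
        """Particle marginal Metropolis-Hastings sketch: random-walk MH on theta,
        with the exact likelihood replaced by the bootstrap-filter estimate.
        log_prior must return -inf outside the support (e.g. |rho| >= 1)."""
        rng = np.random.default_rng(seed)
        theta = np.asarray(theta0, dtype=float)
        _, log_Z = bootstrap_filter(y, theta, Nx, rng)
        chain = []
        for _ in range(n_iter):
            prop = theta + rw_step * rng.normal(size=theta.size)    # random-walk proposal
            lp_prop = log_prior(prop)
            if np.isfinite(lp_prop):
                _, log_Z_prop = bootstrap_filter(y, prop, Nx, rng)  # fresh estimate for the proposal
                log_alpha = (lp_prop + log_Z_prop) - (log_prior(theta) + log_Z)
                if np.log(rng.uniform()) < log_alpha:
                    theta, log_Z = prop, log_Z_prop                 # accept and keep its estimate
            chain.append(theta.copy())
        return np.array(chain)

A hypothetical call could be pmmh(y, theta0=(0.9, 0.5, 1.0), log_prior=my_log_prior), where my_log_prior is a placeholder the user supplies. The key point, as in the pseudo-marginal argument of Andrieu and Roberts cited on slide 16, is that the likelihood estimate attached to the current θ is stored and reused, never recomputed.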
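
Slides 8 and 9 give the structure of SMC²: N_θ θ-particles, each carrying its own system of N_x x-particles, reweighted at every time step by the incremental likelihood estimates, with particle MCMC rejuvenation moves when the θ-particles degenerate, and the evidence p(y_{1:t}) estimated as a byproduct. The following heavily stripped-down sketch shows that structure for the toy model; the ESS/2 trigger, the fixed N_x (the paper allows it to be increased dynamically), the random-walk PMMH proposal and the helpers sample_prior and log_prior are all assumptions of this example.

    def logsumexp(a):
        """Stable log(sum(exp(a)))."""
        m = np.max(a)
        return m + np.log(np.sum(np.exp(a - m)))

    def smc2(y, N_theta, Nx, sample_prior, log_prior, rw_step=0.2, seed=2):
        """Stripped-down SMC^2 sketch: each theta-particle m carries a bootstrap
        filter of Nx x-particles; its weight is multiplied at time t by that
        filter's estimate of p(y_t | y_{1:t-1}, theta_m).  The running estimate
        of log p(y_{1:t}) is accumulated as a byproduct (slide 9)."""
        rng = np.random.default_rng(seed)
        thetas = np.array([sample_prior(rng) for _ in range(N_theta)])
        logw = np.zeros(N_theta)                     # theta-particle log-weights
        logZ = np.zeros(N_theta)                     # per-theta log-likelihood estimates
        xs = [pf_init(th, Nx, rng) for th in thetas]
        log_evidence = 0.0
        for t, y_t in enumerate(y):
            before = logsumexp(logw)
            for m in range(N_theta):                 # advance every inner filter one step
                xs[m], inc = pf_step(xs[m], y_t, thetas[m], rng, propagate=(t > 0))
                logw[m] += inc
                logZ[m] += inc
            log_evidence += logsumexp(logw) - before # estimate of log p(y_t | y_{1:t-1})
            ess = np.exp(2.0 * logsumexp(logw) - logsumexp(2.0 * logw))
            if ess < 0.5 * N_theta:                  # degeneracy: resample, then PMMH moves
                W = np.exp(logw - logsumexp(logw))
                idx = rng.choice(N_theta, size=N_theta, p=W / W.sum())
                thetas, logZ = thetas[idx], logZ[idx]
                xs = [xs[i].copy() for i in idx]
                logw[:] = 0.0
                for m in range(N_theta):             # one PMMH move targeting p(theta | y_{1:t})
                    prop = thetas[m] + rw_step * rng.normal(size=thetas[m].size)
                    lp = log_prior(prop)
                    if not np.isfinite(lp):
                        continue
                    x_prop, logZ_prop = bootstrap_filter(y[:t + 1], prop, Nx, rng)
                    log_alpha = (lp + logZ_prop) - (log_prior(thetas[m]) + logZ[m])
                    if np.log(rng.uniform()) < log_alpha:
                        thetas[m], logZ[m], xs[m] = prop, logZ_prop, x_prop
        return thetas, logw, log_evidence

A hypothetical call could be smc2(y, N_theta=200, Nx=100, sample_prior=my_sample_prior, log_prior=my_log_prior), with my_sample_prior and my_log_prior as user-supplied placeholders; the weighted θ-particles approximate p(θ | y_{1:T}) and log_evidence approximates log p(y_{1:T}), which is the quantity used for the model comparison on slide 14.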
