Particle Filtering (Sequential Monte Carlo) and State-Space Processes

Wei Wang

Dec 13, 2010
Outline

Motivation of Particle Filters

Two commonly used Particle Filters

A Stochastic Volatility Model example
State-Space Processes
A general state-space model is defined as

        Observation equation:   yt+1 ∼ p(yt+1 | xt+1, θ)
        Evolution equation:     xt+1 ∼ p(xt+1 | xt, θ)

with initial state distribution p(x0 | θ) and prior p(θ).

Some notation notes:
 1. xt, yt denote a single state or observation at time t.
 2. x1:t = (x1, . . . , xt) and y1:t = (y1, . . . , yt) denote the vectors of states or observations up to time t.
 3. T is the total number of observations and N is the Monte Carlo sample size in our SMC algorithm.
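
For intuition, here is a minimal sketch (not from the slides; the function name and the linear-Gaussian example at the end are our own illustrative choices) of how one can simulate forward from a generic state-space model, given samplers for the evolution and observation densities.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_state_space(T, sample_x0, sample_evolution, sample_observation):
    """Draw (x_1:T, y_1:T) from a generic state-space model.

    sample_x0()           -> one draw from p(x_0)
    sample_evolution(x)   -> one draw from p(x_{t+1} | x_t)
    sample_observation(x) -> one draw from p(y_t | x_t)
    """
    x = sample_x0()
    xs, ys = [], []
    for _ in range(T):
        x = sample_evolution(x)            # evolution equation
        xs.append(x)
        ys.append(sample_observation(x))   # observation equation
    return np.array(xs), np.array(ys)

# Purely illustrative linear-Gaussian instance
xs, ys = simulate_state_space(
    T=100,
    sample_x0=lambda: rng.normal(0.0, 1.0),
    sample_evolution=lambda x: 0.9 * x + rng.normal(0.0, 0.5),
    sample_observation=lambda x: x + rng.normal(0.0, 1.0),
)
```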
Inferential Goals
There are primarily four inferential goals, namely:
 1. Filtering: the posterior distribution of the current latent state at each time point, given the observations so far and known parameters,

        p(xt | y1:t, θ) for t = 1, . . . , T

 2. Learning: the posterior distribution of the parameters given all observations,

        p(θ | y1:T)

 3. Smoothing: the posterior distribution of the latent states given all the observations,

        p(x1:T | y1:T)

 4. Model Assessment: the marginal distribution of the observations, used for computing Bayes Factors,

        p(y1:T)
Inferential Goals
Due to time constraints, we will only talk about filtering today. The dependence on the parameters θ is suppressed.
 1. Filtering: the posterior distribution of the current latent state at each time point,

        p(xt | y1:t) for t = 1, . . . , T

 2. Learning: the posterior distribution of the parameters given all observations,

        p(θ | y1:T)

 3. Smoothing: the posterior distribution of the latent states given all the observations,

        p(x1:T | y1:T)

 4. Model Assessment: the marginal distribution of the observations, used for computing Bayes Factors,

        p(y1:T)
Kalman Filters
When the state-space model is linear and Gaussian, it is known as a Normal Dynamic Linear Model (NDLM):

        yt = Ft xt + vt ,    vt ∼ N(0, σt²)
        xt = Gt xt−1 + wt ,  wt ∼ N(0, τt²)

in which the noises vt and wt are temporally and mutually independent.

The celebrated Kalman filter solves this model exactly via the Kalman recursions.
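
To make the recursions concrete, here is a minimal univariate sketch (our own, not from the slides), assuming time-invariant Ft = F, Gt = G, σt² = sigma2, τt² = tau2, and an N(m0, C0) prior on the initial state.

```python
import numpy as np

def kalman_filter(y, F, G, sigma2, tau2, m0=0.0, C0=1.0):
    """Univariate Kalman recursions for y_t = F x_t + v_t, x_t = G x_{t-1} + w_t."""
    m, C = m0, C0                      # moments of x_{t-1} | y_{1:t-1}
    means, variances = [], []
    for yt in y:
        a = G * m                      # predictive mean of x_t | y_{1:t-1}
        R = G * C * G + tau2           # predictive variance
        f = F * a                      # one-step-ahead mean of y_t
        Q = F * R * F + sigma2         # one-step-ahead variance of y_t
        K = R * F / Q                  # Kalman gain
        m = a + K * (yt - f)           # filtering mean of x_t | y_{1:t}
        C = R - K * F * R              # filtering variance
        means.append(m)
        variances.append(C)
    return np.array(means), np.array(variances)
```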
But the real world is not NDLM
Most real-world applications fall outside the realm of the NDLM:

        yt | xt ∼ p(yt | xt),
        xt | xt−1 ∼ p(xt | xt−1)

We cannot obtain the recursions in closed form as we can for the NDLM.

        p(xt | y1:t−1) = ∫ p(xt | xt−1) p(xt−1 | y1:t−1) dxt−1

        p(xt | y1:t) ∝ p(yt | xt) p(xt | y1:t−1)

      But the form of the above display suggests that some kind of
      Monte Carlo methods might work.
First Particle Filter: Bootstrap Filter
Gordon, Salmond and Smith (1993) proposed one of the earliest (and still popular) particle filters: the bootstrap filter.

        p(xt | y1:t−1) = ∫ p(xt | xt−1) p(xt−1 | y1:t−1) dxt−1      (1)

        p(xt | y1:t) ∝ p(yt | xt) p(xt | y1:t−1)                    (2)

The rough idea: given Monte Carlo samples (particles) from p(xt−1 | y1:t−1), we can generate new particles from p(xt | y1:t−1) using (1), and then resample the new particles with weights proportional to the likelihood, as in (2), to obtain particles from p(xt | y1:t).
Bootstrap Filter
 1. Propagate {xt−1(i)}, i = 1, . . . , N, to {x̃t(i)} via p(xt | xt−1), i.e. draw x̃t(i) ∼ p(xt | xt−1(i)).
 2. Resample {xt(i)} from {x̃t(i)} with weights wt(i) ∝ p(yt | x̃t(i)).

                       Algorithm 1: Bootstrap Filter

The Bootstrap Filter belongs to the class of Propagate-Resample filters.

The Bootstrap Filter is incredibly easy to implement, since we only need to be able to sample from the evolution density p(xt | xt−1) and to evaluate the likelihood p(yt | xt).

Notice that the Resampling step is not essential.
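
A minimal NumPy sketch of Algorithm 1 (our own illustration; the interface — sample_x0, propagate, likelihood — is an assumption, not something given on the slides):

```python
import numpy as np

def bootstrap_filter(y, N, sample_x0, propagate, likelihood, rng=None):
    """Bootstrap filter (Algorithm 1).

    sample_x0(N)       -> N draws from the initial state distribution
    propagate(x)       -> one draw from p(x_t | x_{t-1}) for each particle in x
    likelihood(y_t, x) -> p(y_t | x_t) evaluated at each particle in x
    Returns the resampled particle set at each time point.
    """
    if rng is None:
        rng = np.random.default_rng()
    particles = sample_x0(N)
    history = []
    for yt in y:
        # 1. Propagate particles through the evolution density
        proposed = propagate(particles)
        # 2. Resample with weights proportional to the likelihood
        w = likelihood(yt, proposed)
        w = w / w.sum()
        idx = rng.choice(N, size=N, replace=True, p=w)
        particles = proposed[idx]
        history.append(particles.copy())
    return np.array(history)
```

The only model-specific inputs are the evolution sampler and the likelihood evaluation, which is exactly why the Bootstrap Filter is so easy to implement.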
Drawback of Bootstrap Filter
However, the Bootstrap Filter suffers from severe Model Degeneracy (Model Impoverishment).

In the propagation step, the information in the new observation yt is not incorporated, so the propagated states may not be important, high-likelihood states; at the resampling step most of the weight is then given to only a few of the new states, and we no longer have a good Monte Carlo sample of the target distribution.

Of course, if we set the MC sample size N large enough, we might reduce the Model Degeneracy to a reasonable level, but this is computationally expensive.
Another Approach: Auxiliary Particle Filter
The problem with the Bootstrap Filter is the blind propagation.

Playing with the math, we can find the following decomposition:

        p(xt, xt−1 | y1:t) ∝ p(xt | xt−1, yt) p(yt | xt−1) p(xt−1 | y1:t−1)

Based on this, Pitt and Shephard (1999) proposed a Resample-Propagate scheme.

The idea is: with MC samples from p(xt−1 | y1:t−1), we resample to target p(xt−1 | y1:t), then propagate to get MC samples from p(xt | y1:t).
Auxiliary Particle Filter
 1. Resample {x̃t−1(i)} from {xt−1(i)}, i = 1, . . . , N, with weights wt(i) ∝ p(yt | xt−1(i)).
 2. Propagate {x̃t−1(i)} to {xt(i)} via p(xt | xt−1, yt), i.e. draw xt(i) ∼ p(xt | x̃t−1(i), yt).

                    Algorithm 2: Auxiliary Particle Filter

In the APF, we use the current observation yt in the first resampling step, so only “good” particles are propagated forward. It should be more efficient than the Bootstrap Filter.
Drawback of APF
But nothing comes for free. In order to use the algorithm given above, we need to (1) be able to compute the predictive likelihood p(yt | xt−1) and (2) be able to sample from the “evolution-posterior” density p(xt | xt−1, yt).

These are unavailable in most cases.

A general approach is importance sampling. Pitt and Shephard (1999) suggest:
 1. Use p(yt | g(xt−1)), i.e. the likelihood p(yt | xt) evaluated at xt = g(xt−1) (e.g. the mean or mode of the evolution density), as the proposal weights for resampling;
 2. Use the evolution kernel p(xt | xt−1) as the proposal density to propagate particles.
An Applicable APF
 1. Resample {x̃t−1(i)} from {xt−1(i)}, i = 1, . . . , N, with weights wt(i) ∝ p(yt | g(xt−1(i))).
 2. Propagate {x̃t−1(i)} to {x̃t(i)} via p(xt | xt−1), i.e. draw x̃t(i) ∼ p(xt | x̃t−1(i)).
 3. Resample {xt(i)} from {x̃t(i)} with weights

        wt(i) ∝ p(yt | x̃t(i)) / p(yt | g(x̃t−1(i)))

            Algorithm 3: An Applicable Auxiliary Particle Filter

The performance of the APF depends hugely on how we choose the proposal density. In some cases it might perform worse than the Bootstrap Filter.
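
A matching sketch of Algorithm 3 (again our own illustration, with the same hypothetical interface as before; g is the user-supplied point summary of the evolution density, e.g. its mean, in the spirit of Pitt and Shephard):

```python
import numpy as np

def auxiliary_particle_filter(y, N, sample_x0, propagate, likelihood, g, rng=None):
    """Applicable auxiliary particle filter (Algorithm 3).

    propagate(x)       -> draws from the evolution density p(x_t | x_{t-1})
    likelihood(y_t, x) -> p(y_t | x_t) evaluated at each particle
    g(x)               -> point prediction of x_t given x_{t-1}, e.g. the evolution mean
    """
    if rng is None:
        rng = np.random.default_rng()
    particles = sample_x0(N)
    history = []
    for yt in y:
        # 1. Resample old particles with weights p(y_t | g(x_{t-1}))
        w1 = likelihood(yt, g(particles))
        idx = rng.choice(N, size=N, replace=True, p=w1 / w1.sum())
        resampled = particles[idx]
        # 2. Propagate the resampled particles through the evolution density
        proposed = propagate(resampled)
        # 3. Resample with weights p(y_t | x_t) / p(y_t | g(x_{t-1}))
        w2 = likelihood(yt, proposed) / likelihood(yt, g(resampled))
        idx2 = rng.choice(N, size=N, replace=True, p=w2 / w2.sum())
        particles = proposed[idx2]
        history.append(particles.copy())
    return np.array(history)
```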
Example: Stochastic Volatility Model
Consider a very simple log-stochastic volatility model

        yt = √Vt εt
        log(Vt) = α + β log(Vt−1) + σ ηt
        log(V1) ∼ N(0, σ0²)

with εt and ηt independent standard normal shocks, and

        α = 0, β = 0.95, σ = σ0 = 0.1, T = 500


[Figure: simulated observations yt plotted against time, t = 1, . . . , 500.]
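
A self-contained sketch of this experiment (our own reimplementation, not the code behind the slides; εt and ηt are taken to be i.i.d. standard normal as stated above, and N = 1000 particles is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(2010)
T, alpha, beta, sigma, sigma0, N = 500, 0.0, 0.95, 0.1, 0.1, 1000

# Simulate the log-stochastic volatility model
logV = np.empty(T)
logV[0] = rng.normal(0.0, sigma0)
for t in range(1, T):
    logV[t] = alpha + beta * logV[t - 1] + sigma * rng.normal()
y = np.sqrt(np.exp(logV)) * rng.normal(size=T)

# Bootstrap filter; particles carry log V_t
particles = rng.normal(0.0, sigma0, size=N)   # draws from the prior on log V_1
filtered_vol = np.empty(T)
for t in range(T):
    if t > 0:
        # Propagate through the evolution density
        particles = alpha + beta * particles + sigma * rng.normal(size=N)
    # Weights: likelihood p(y_t | V_t) with y_t ~ N(0, V_t), V_t = exp(log V_t)
    V = np.exp(particles)
    w = np.exp(-0.5 * y[t] ** 2 / V) / np.sqrt(V)
    w /= w.sum()
    # Resample and record the filtered estimate of the volatility sqrt(V_t)
    particles = particles[rng.choice(N, size=N, replace=True, p=w)]
    filtered_vol[t] = np.sqrt(np.exp(particles)).mean()
```

Plotting filtered_vol against the simulated √Vt gives plots of the kind shown on the following slide.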
Example: Stochastic Volatility Model

[Figure: filtered volatility paths over time, t = 1, . . . , 500 — top panel: Volatility BF (Bootstrap Filter); bottom panel: Volatility APF (Auxiliary Particle Filter).]
