Semi-automatic ABC
            A Discussion

              Christian P. Robert

Université Paris-Dauphine (CEREMADE), IUF, & CREST


      Royal Statistical Society, Dec. 14, 2011
F&P’s ABC



  Use of a summary statistic S(·), an importance proposal g(·), a
  kernel K(·) ≤ 1 and a bandwidth h > 0 such that

                       (θ, ysim ) ∼ g(θ)f(ysim |θ)

  is accepted with probability (hence the bound)

                        K[{S(ysim ) − sobs }/h]

  and the corresponding importance weight defined by

                              π(θ)/g(θ)
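A minimal sketch of this accept/reject scheme on a toy model; the Gaussian model, the boxcar kernel, and the choice g = prior are illustrative assumptions, not F&P's setup.

```python
import math
import random

random.seed(0)

# Toy model: theta ~ N(0,1) prior, y | theta = 5 draws from N(theta,1),
# summary S(y) = sample mean, importance proposal g = prior.
def prior_rvs():     return random.gauss(0.0, 1.0)
def prior_pdf(t):    return math.exp(-0.5 * t * t) / math.sqrt(2 * math.pi)
def simulate(theta): return [random.gauss(theta, 1.0) for _ in range(5)]
def summary(y):      return sum(y) / len(y)
def kernel(u):       return 1.0 if abs(u) <= 1.0 else 0.0   # boxcar, K <= 1

g_rvs, g_pdf = prior_rvs, prior_pdf   # g = prior here

def abc_importance(s_obs, h, n_iter=20000):
    """Draw (theta, y_sim) ~ g(theta) f(y_sim|theta); accept with probability
    K[{S(y_sim) - s_obs}/h]; weight accepted theta by pi(theta)/g(theta)."""
    samples, weights = [], []
    for _ in range(n_iter):
        theta = g_rvs()
        y_sim = simulate(theta)
        if random.random() < kernel((summary(y_sim) - s_obs) / h):
            samples.append(theta)
            weights.append(prior_pdf(theta) / g_pdf(theta))  # = 1 since g = prior
    return samples, weights

samples, weights = abc_importance(s_obs=1.0, h=0.2)
post_mean = sum(w * t for w, t in zip(weights, samples)) / sum(weights)
```

With g = prior all weights equal one and the scheme reduces to rejection ABC; a non-prior g would make the π(θ)/g(θ) weights matter.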
Errors, errors, and errors


   Three levels of approximation
       π(θ|yobs ) by π(θ|sobs ) loss of information
                                                                       [ignored]
       π(θ|sobs ) by

                πABC (θ|sobs ) = ∫ π(s)K[{s − sobs }/h]π(θ|s) ds / ∫ π(s)K[{s − sobs }/h] ds

       noisy observations
       πABC (θ|sobs ) by importance Monte Carlo based on N
       simulations, with Monte Carlo error represented by
       var(a(θ)|sobs )/Nacc [Nacc = expected number of acceptances]
                                                      [M. Twain/B. Disraeli]
Average acceptance asymptotics



   For the average acceptance probability/approximate likelihood

           p(θ|sobs ) = ∫ f(ysim |θ) K[{S(ysim ) − sobs }/h] dysim ,

   overall acceptance probability

           p(sobs ) = ∫ p(θ|sobs ) π(θ) dθ = π(sobs )h^d + o(h^d )

                                                               [Lemma 1]
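Lemma 1's h^d scaling of the acceptance probability can be checked empirically on a toy model with d = dim(S) = 1; the model, boxcar kernel, and sobs value below are illustrative assumptions.

```python
import random

random.seed(1)

# Toy model: theta ~ N(0,1), y | theta = 5 draws from N(theta,1),
# S(y) = sample mean, so d = dim(S) = 1.
def draw_summary():
    theta = random.gauss(0.0, 1.0)
    return sum(random.gauss(theta, 1.0) for _ in range(5)) / 5.0

def acc_prob(h, s_obs=1.0, n=200000):
    """Monte Carlo estimate of the overall acceptance probability p(s_obs)
    under a boxcar kernel: P(|S(y_sim) - s_obs| <= h)."""
    return sum(abs(draw_summary() - s_obs) <= h for _ in range(n)) / n

p_h, p_half = acc_prob(0.2), acc_prob(0.1)
ratio = p_h / p_half   # Lemma 1 predicts ~ 2**d = 2 for small h
```

Halving h should roughly halve the acceptance probability when d = 1, which is what the ratio checks.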
Calibration of h


       “This result gives insight into how S(·) and h affect the Monte
       Carlo error. To minimize Monte Carlo error, we need hd to be not
       too small. Thus ideally we want S(·) to be a low dimensional
       summary of the data that is sufficiently informative about θ that
       π(θ|sobs ) is close, in some sense, to π(θ|yobs )” (p.5)


       turns h into an absolute value while it should be
       context-dependent and user-calibrated
       only addresses one term in the approximation error and
       acceptance probability
       h large prevents πABC (θ|sobs ) from being close to π(θ|sobs )
       d small prevents π(θ|sobs ) from being close to π(θ|yobs )
Calibrating ABC



       “If πABC is calibrated, then this means that probability statements
       that are derived from it are appropriate, and in particular that we
       can use πABC to quantify uncertainty in estimates” (p.5)


   Definition
   For 0 < q < 1 and subset A, fix [one specific?/all?] event Eq (A)
   with PrABC (θ ∈ A|sobs ) = q. Then ABC is calibrated if

                            Pr(θ ∈ A|Eq (A)) = q



   unclear meaning of conditioning on Eq (A)
Calibrated ABC




   Theorem
   Noisy ABC, where

                   sobs = S(yobs ) + hε,    ε ∼ K(·)

   is calibrated
                                                    [Wilkinson, 2008]
                      no condition on h
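A minimal sketch of the noisy-ABC construction in the theorem, on a toy Gaussian model; the model, the boxcar kernel, and all tuning values are my assumptions, not Wilkinson's or F&P's.

```python
import random

random.seed(2)

# Toy model: theta ~ N(0,1), y | theta = 5 draws from N(theta,1),
# S(y) = sample mean.
def simulate_summary(theta):
    return sum(random.gauss(theta, 1.0) for _ in range(5)) / 5.0

def noisy_abc(s_obs, h, n_iter=50000):
    """Noisy ABC: replace s_obs by s_obs + h*eps with eps ~ K, then run
    standard rejection ABC with the same (boxcar) kernel K."""
    s_noisy = s_obs + h * random.uniform(-1.0, 1.0)   # eps ~ K = U(-1,1)
    return [theta
            for theta in (random.gauss(0.0, 1.0) for _ in range(n_iter))
            if abs(simulate_summary(theta) - s_noisy) <= h]

post = noisy_abc(s_obs=1.0, h=0.2)
```

The only change from standard ABC is the single jitter hε added to the observed summary before running the sampler; calibration then holds whatever h is.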
Calibrated ABC




   Consequence: when h = ∞
   Theorem
   The prior distribution is always calibrated

   is this a relevant property then?
Converging ABC




  Theorem
  For noisy ABC, the expected noisy-ABC log-likelihood,

     E {log[p(θ|sobs )]} = ∫ log[p(θ|S(yobs ) + hε)] π(yobs |θ0 )K(ε) dyobs dε ,

  has its maximum at θ = θ0 .

  True for any choice of summary statistic? Even ancillary statistics?!
                                   [Imposes at least identifiability...]
  Relevant in asymptotia and not for the data
Converging ABC




  Corollary
  For noisy ABC, the ABC posterior converges onto a point mass on
  the true parameter value as m → ∞.

  For standard ABC, not always the case (unless h goes to zero).
  Strength of regularity conditions (c1) and (c2) in Bernardo
  & Smith, 1994?
                [out-of-reach constraints on likelihood and posterior]
  Again, there must be conditions imposed upon summary statistic...
Loss motivated statistic


   Under quadratic loss function,
   Theorem
    (i) The minimal posterior error E[L(θ, θ̂)|yobs ] occurs when
        θ̂ = E(θ|yobs ) (!)
    (ii) When h → 0, EABC (θ|sobs ) converges to E(θ|yobs )
   (iii) If S(yobs ) = E[θ|yobs ] then for θ̂ = EABC [θ|sobs ]

          E[L(θ, θ̂)|yobs ] = trace(AΣ) + h^2 ∫ x^T A x K(x) dx + o(h^2 ).
   measure-theoretic difficulties?
   dependence of sobs on h makes me uncomfortable
   Relevant for choice of K?
Optimal summary statistic

       “We take a different approach, and weaken the requirement for
       πABC to be a good approximation to π(θ|yobs ). We argue for πABC
       to be a good approximation solely in terms of the accuracy of
       certain estimates of the parameters.” (p.5)

   From this result, F&P
       derive their choice of summary statistic,

                                  S(y) = E(θ|y)

                                                      [almost sufficient]
                                  [wow! EABC [θ|S(yobs )] = E[θ|yobs ]]
       suggest

                 h = O(N^{−1/(2+d)} )     and h = O(N^{−1/(4+d)} )

       as optimal bandwidths for noisy and standard ABC.
Caveat




  Since E(θ|yobs ) is most usually unavailable, F&P suggest
    (i) use a pilot run of ABC to determine a region of non-negligible
        posterior mass;
   (ii) simulate sets of parameter values and data;
  (iii) use the simulated sets of parameter values and data to
        estimate the summary statistic; and
   (iv) run ABC with this choice of summary statistic.
  where is the assessment of the first stage error?
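The four-step pipeline above can be sketched end to end on a toy model; the Gaussian model, the single candidate feature f(y) = mean(y), and all tuning constants are illustrative assumptions, not values from F&P.

```python
import random

random.seed(3)

# Toy model: theta ~ N(0,1), y | theta = 10 draws from N(theta, 1).
def simulate(theta, n=10):
    return [random.gauss(theta, 1.0) for _ in range(n)]

def mean(v):
    return sum(v) / len(v)

y_obs = simulate(1.0)

# (i) pilot ABC run with a crude summary (the sample mean)
pilot = [t for t in (random.gauss(0.0, 1.0) for _ in range(20000))
         if abs(mean(simulate(t)) - mean(y_obs)) < 0.3]
lo, hi = min(pilot), max(pilot)   # region of non-negligible posterior mass

# (ii) simulate parameter/data pairs inside the pilot region
thetas = [random.uniform(lo, hi) for _ in range(2000)]
data = [simulate(t) for t in thetas]

# (iii) least-squares fit of theta on f(y): S(y) = b0 + b1 * mean(y)
xs = [mean(y) for y in data]
xbar, tbar = mean(xs), mean(thetas)
b1 = (sum((x - xbar) * (t - tbar) for x, t in zip(xs, thetas))
      / sum((x - xbar) ** 2 for x in xs))
b0 = tbar - b1 * xbar
S = lambda y: b0 + b1 * mean(y)

# (iv) rerun ABC with the estimated summary statistic
post = [t for t in (random.gauss(0.0, 1.0) for _ in range(20000))
        if abs(S(simulate(t)) - S(y_obs)) < 0.1]
```

Note that any error in the pilot region (step i) propagates silently into the regression and the final run, which is exactly the concern raised above.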
Approximating the summary statistic




   As in Beaumont et al. (2002) and Blum and François (2010), F&P
   use a linear regression to approximate E(θ|yobs ):

                     θi = β0 + β f(yi ) + εi

                                           [Further justifications?]
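The regression step alone can be sketched as follows: fit β by least squares on simulated (θ, y) pairs and use S(y) = β0 + β·f(y) as the summary. The toy model and the two candidate features are my assumptions.

```python
import random

random.seed(4)

# Toy model: theta ~ N(0,1), y | theta = 10 draws from N(theta, 1).
def simulate(theta, n=10):
    return [random.gauss(theta, 1.0) for _ in range(n)]

def f(y):  # candidate features: sample mean and mean of squares
    return [sum(y) / len(y), sum(v * v for v in y) / len(y)]

thetas = [random.gauss(0.0, 1.0) for _ in range(5000)]
X = [[1.0] + f(simulate(t)) for t in thetas]   # design matrix [1, f(y)]

def least_squares(X, t):
    """Solve the normal equations (X'X) b = X't by Gauss-Jordan elimination."""
    p = len(X[0])
    A = [[sum(row[j] * row[k] for row in X) for k in range(p)]
         + [sum(row[j] * ti for row, ti in zip(X, t))] for j in range(p)]
    for j in range(p):
        A[j] = [v / A[j][j] for v in A[j]]     # normalise the pivot row
        for k in range(p):
            if k != j:
                A[k] = [vk - A[k][j] * vj for vk, vj in zip(A[k], A[j])]
    return [A[j][p] for j in range(p)]

beta = least_squares(X, thetas)
S = lambda y: sum(b * x for b, x in zip(beta, [1.0] + f(y)))
```

On this conjugate toy model E(θ|y) depends on y only through its mean, so the fitted S(y) should come out close to 10·mean(y)/11, with a near-zero coefficient on the second feature.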
Questions



      dependence on h and S(·) in the early stage
      reduction of Bayesian inference to point estimation
      approximation error in step (iii) not accounted for
      not parameterisation invariant
      practice shows that proper approximation to genuine posterior
      distributions stems from using a (much) larger number of
      summary statistics than the dimension of the parameter
      the validity of the approximation to the optimal summary
      statistic depends on the quality of the pilot run;
      important inferential issues like model choice are not covered
      by this approach.
envoi

More Related Content

What's hot

Non determinism through type isomophism
Non determinism through type isomophismNon determinism through type isomophism
Non determinism through type isomophismAlejandro Díaz-Caro
 
Evolutionary Strategies as an alternative to Reinforcement Learning
Evolutionary Strategies as an alternative to Reinforcement LearningEvolutionary Strategies as an alternative to Reinforcement Learning
Evolutionary Strategies as an alternative to Reinforcement LearningAshwin Rao
 
Introduction to modern Variational Inference.
Introduction to modern Variational Inference.Introduction to modern Variational Inference.
Introduction to modern Variational Inference.Tomasz Kusmierczyk
 
Linear-Size Approximations to the Vietoris-Rips Filtration - Presented at Uni...
Linear-Size Approximations to the Vietoris-Rips Filtration - Presented at Uni...Linear-Size Approximations to the Vietoris-Rips Filtration - Presented at Uni...
Linear-Size Approximations to the Vietoris-Rips Filtration - Presented at Uni...Don Sheehy
 
Loss Calibrated Variational Inference
Loss Calibrated Variational InferenceLoss Calibrated Variational Inference
Loss Calibrated Variational InferenceTomasz Kusmierczyk
 
Efficient Random-Walk Methods forApproximating Polytope Volume
Efficient Random-Walk Methods forApproximating Polytope VolumeEfficient Random-Walk Methods forApproximating Polytope Volume
Efficient Random-Walk Methods forApproximating Polytope VolumeVissarion Fisikopoulos
 
NIPS2007: learning using many examples
NIPS2007: learning using many examplesNIPS2007: learning using many examples
NIPS2007: learning using many exampleszukun
 
Stinespring’s theorem for maps on hilbert c star modules
Stinespring’s theorem for maps on hilbert c star modulesStinespring’s theorem for maps on hilbert c star modules
Stinespring’s theorem for maps on hilbert c star moduleswtyru1989
 
High-dimensional polytopes defined by oracles: algorithms, computations and a...
High-dimensional polytopes defined by oracles: algorithms, computations and a...High-dimensional polytopes defined by oracles: algorithms, computations and a...
High-dimensional polytopes defined by oracles: algorithms, computations and a...Vissarion Fisikopoulos
 
My PhD defence
My PhD defenceMy PhD defence
My PhD defenceJialin LIU
 
Model Checking Base on Interoplation
Model Checking Base onInteroplationModel Checking Base onInteroplation
Model Checking Base on Interoplationleticia2307
 
Incremental Topological Ordering (and Cycle Detection)
Incremental Topological Ordering (and Cycle Detection)Incremental Topological Ordering (and Cycle Detection)
Incremental Topological Ordering (and Cycle Detection)⌨️ Andrey Goder
 
Automatic variational inference with latent categorical variables
Automatic variational inference with latent categorical variablesAutomatic variational inference with latent categorical variables
Automatic variational inference with latent categorical variablesTomasz Kusmierczyk
 
3. TRIGONOMETRIC LEVELLING (SUR) 3140601 GTU
3. TRIGONOMETRIC LEVELLING (SUR) 3140601 GTU3. TRIGONOMETRIC LEVELLING (SUR) 3140601 GTU
3. TRIGONOMETRIC LEVELLING (SUR) 3140601 GTUVATSAL PATEL
 
Control Synthesis by Sum of Squares Optimization
Control Synthesis by Sum of Squares OptimizationControl Synthesis by Sum of Squares Optimization
Control Synthesis by Sum of Squares OptimizationBehzad Samadi
 
Representation formula for traffic flow estimation on a network
Representation formula for traffic flow estimation on a networkRepresentation formula for traffic flow estimation on a network
Representation formula for traffic flow estimation on a networkGuillaume Costeseque
 

What's hot (19)

Non determinism through type isomophism
Non determinism through type isomophismNon determinism through type isomophism
Non determinism through type isomophism
 
Evolutionary Strategies as an alternative to Reinforcement Learning
Evolutionary Strategies as an alternative to Reinforcement LearningEvolutionary Strategies as an alternative to Reinforcement Learning
Evolutionary Strategies as an alternative to Reinforcement Learning
 
Introduction to modern Variational Inference.
Introduction to modern Variational Inference.Introduction to modern Variational Inference.
Introduction to modern Variational Inference.
 
Linear-Size Approximations to the Vietoris-Rips Filtration - Presented at Uni...
Linear-Size Approximations to the Vietoris-Rips Filtration - Presented at Uni...Linear-Size Approximations to the Vietoris-Rips Filtration - Presented at Uni...
Linear-Size Approximations to the Vietoris-Rips Filtration - Presented at Uni...
 
Loss Calibrated Variational Inference
Loss Calibrated Variational InferenceLoss Calibrated Variational Inference
Loss Calibrated Variational Inference
 
Efficient Random-Walk Methods forApproximating Polytope Volume
Efficient Random-Walk Methods forApproximating Polytope VolumeEfficient Random-Walk Methods forApproximating Polytope Volume
Efficient Random-Walk Methods forApproximating Polytope Volume
 
NIPS2007: learning using many examples
NIPS2007: learning using many examplesNIPS2007: learning using many examples
NIPS2007: learning using many examples
 
Stinespring’s theorem for maps on hilbert c star modules
Stinespring’s theorem for maps on hilbert c star modulesStinespring’s theorem for maps on hilbert c star modules
Stinespring’s theorem for maps on hilbert c star modules
 
High-dimensional polytopes defined by oracles: algorithms, computations and a...
High-dimensional polytopes defined by oracles: algorithms, computations and a...High-dimensional polytopes defined by oracles: algorithms, computations and a...
High-dimensional polytopes defined by oracles: algorithms, computations and a...
 
Athens workshop on MCMC
Athens workshop on MCMCAthens workshop on MCMC
Athens workshop on MCMC
 
Volume computation and applications
Volume computation and applications Volume computation and applications
Volume computation and applications
 
Mapping analysis
Mapping analysisMapping analysis
Mapping analysis
 
My PhD defence
My PhD defenceMy PhD defence
My PhD defence
 
Model Checking Base on Interoplation
Model Checking Base onInteroplationModel Checking Base onInteroplation
Model Checking Base on Interoplation
 
Incremental Topological Ordering (and Cycle Detection)
Incremental Topological Ordering (and Cycle Detection)Incremental Topological Ordering (and Cycle Detection)
Incremental Topological Ordering (and Cycle Detection)
 
Automatic variational inference with latent categorical variables
Automatic variational inference with latent categorical variablesAutomatic variational inference with latent categorical variables
Automatic variational inference with latent categorical variables
 
3. TRIGONOMETRIC LEVELLING (SUR) 3140601 GTU
3. TRIGONOMETRIC LEVELLING (SUR) 3140601 GTU3. TRIGONOMETRIC LEVELLING (SUR) 3140601 GTU
3. TRIGONOMETRIC LEVELLING (SUR) 3140601 GTU
 
Control Synthesis by Sum of Squares Optimization
Control Synthesis by Sum of Squares OptimizationControl Synthesis by Sum of Squares Optimization
Control Synthesis by Sum of Squares Optimization
 
Representation formula for traffic flow estimation on a network
Representation formula for traffic flow estimation on a networkRepresentation formula for traffic flow estimation on a network
Representation formula for traffic flow estimation on a network
 

Viewers also liked

Approximating Bayes Factors
Approximating Bayes FactorsApproximating Bayes Factors
Approximating Bayes FactorsChristian Robert
 
Simulation (AMSI Public Lecture)
Simulation (AMSI Public Lecture)Simulation (AMSI Public Lecture)
Simulation (AMSI Public Lecture)Christian Robert
 
Bayesian Data Analysis for Ecology
Bayesian Data Analysis for EcologyBayesian Data Analysis for Ecology
Bayesian Data Analysis for EcologyChristian Robert
 
Omiros' talk on the Bernoulli factory problem
Omiros' talk on the  Bernoulli factory problemOmiros' talk on the  Bernoulli factory problem
Omiros' talk on the Bernoulli factory problemBigMC
 
Bayesian model choice in cosmology
Bayesian model choice in cosmologyBayesian model choice in cosmology
Bayesian model choice in cosmologyChristian Robert
 
Introducing Monte Carlo Methods with R
Introducing Monte Carlo Methods with RIntroducing Monte Carlo Methods with R
Introducing Monte Carlo Methods with RChristian Robert
 
A Vanilla Rao-Blackwellisation
A Vanilla Rao-BlackwellisationA Vanilla Rao-Blackwellisation
A Vanilla Rao-BlackwellisationChristian Robert
 
Monte Carlo Statistical Methods
Monte Carlo Statistical MethodsMonte Carlo Statistical Methods
Monte Carlo Statistical MethodsChristian Robert
 
MCMC and likelihood-free methods
MCMC and likelihood-free methodsMCMC and likelihood-free methods
MCMC and likelihood-free methodsChristian Robert
 
Presentation of Bassoum Abou on Stein's 1981 AoS paper
Presentation of Bassoum Abou on Stein's 1981 AoS paperPresentation of Bassoum Abou on Stein's 1981 AoS paper
Presentation of Bassoum Abou on Stein's 1981 AoS paperChristian Robert
 
Reading Birnbaum's (1962) paper, by Li Chenlu
Reading Birnbaum's (1962) paper, by Li ChenluReading Birnbaum's (1962) paper, by Li Chenlu
Reading Birnbaum's (1962) paper, by Li ChenluChristian Robert
 
Reading Testing a point-null hypothesis, by Jiahuan Li, Feb. 25, 2013
Reading Testing a point-null hypothesis, by Jiahuan Li, Feb. 25, 2013Reading Testing a point-null hypothesis, by Jiahuan Li, Feb. 25, 2013
Reading Testing a point-null hypothesis, by Jiahuan Li, Feb. 25, 2013Christian Robert
 
Course on Bayesian computational methods
Course on Bayesian computational methodsCourse on Bayesian computational methods
Course on Bayesian computational methodsChristian Robert
 
Reading the Lindley-Smith 1973 paper on linear Bayes estimators
Reading the Lindley-Smith 1973 paper on linear Bayes estimatorsReading the Lindley-Smith 1973 paper on linear Bayes estimators
Reading the Lindley-Smith 1973 paper on linear Bayes estimatorsChristian Robert
 
Theory of Probability revisited
Theory of Probability revisitedTheory of Probability revisited
Theory of Probability revisitedChristian Robert
 

Viewers also liked (20)

Approximating Bayes Factors
Approximating Bayes FactorsApproximating Bayes Factors
Approximating Bayes Factors
 
Shanghai tutorial
Shanghai tutorialShanghai tutorial
Shanghai tutorial
 
Simulation (AMSI Public Lecture)
Simulation (AMSI Public Lecture)Simulation (AMSI Public Lecture)
Simulation (AMSI Public Lecture)
 
Bayesian Data Analysis for Ecology
Bayesian Data Analysis for EcologyBayesian Data Analysis for Ecology
Bayesian Data Analysis for Ecology
 
Omiros' talk on the Bernoulli factory problem
Omiros' talk on the  Bernoulli factory problemOmiros' talk on the  Bernoulli factory problem
Omiros' talk on the Bernoulli factory problem
 
Bayesian model choice in cosmology
Bayesian model choice in cosmologyBayesian model choice in cosmology
Bayesian model choice in cosmology
 
Introducing Monte Carlo Methods with R
Introducing Monte Carlo Methods with RIntroducing Monte Carlo Methods with R
Introducing Monte Carlo Methods with R
 
Savage-Dickey paradox
Savage-Dickey paradoxSavage-Dickey paradox
Savage-Dickey paradox
 
A Vanilla Rao-Blackwellisation
A Vanilla Rao-BlackwellisationA Vanilla Rao-Blackwellisation
A Vanilla Rao-Blackwellisation
 
Monte Carlo Statistical Methods
Monte Carlo Statistical MethodsMonte Carlo Statistical Methods
Monte Carlo Statistical Methods
 
Jsm09 talk
Jsm09 talkJsm09 talk
Jsm09 talk
 
MCMC and likelihood-free methods
MCMC and likelihood-free methodsMCMC and likelihood-free methods
MCMC and likelihood-free methods
 
Presentation of Bassoum Abou on Stein's 1981 AoS paper
Presentation of Bassoum Abou on Stein's 1981 AoS paperPresentation of Bassoum Abou on Stein's 1981 AoS paper
Presentation of Bassoum Abou on Stein's 1981 AoS paper
 
Reading Birnbaum's (1962) paper, by Li Chenlu
Reading Birnbaum's (1962) paper, by Li ChenluReading Birnbaum's (1962) paper, by Li Chenlu
Reading Birnbaum's (1962) paper, by Li Chenlu
 
Reading Testing a point-null hypothesis, by Jiahuan Li, Feb. 25, 2013
Reading Testing a point-null hypothesis, by Jiahuan Li, Feb. 25, 2013Reading Testing a point-null hypothesis, by Jiahuan Li, Feb. 25, 2013
Reading Testing a point-null hypothesis, by Jiahuan Li, Feb. 25, 2013
 
Course on Bayesian computational methods
Course on Bayesian computational methodsCourse on Bayesian computational methods
Course on Bayesian computational methods
 
Reading the Lindley-Smith 1973 paper on linear Bayes estimators
Reading the Lindley-Smith 1973 paper on linear Bayes estimatorsReading the Lindley-Smith 1973 paper on linear Bayes estimators
Reading the Lindley-Smith 1973 paper on linear Bayes estimators
 
ABC in Roma
ABC in RomaABC in Roma
ABC in Roma
 
Theory of Probability revisited
Theory of Probability revisitedTheory of Probability revisited
Theory of Probability revisited
 
Reading Neyman's 1933
Reading Neyman's 1933 Reading Neyman's 1933
Reading Neyman's 1933
 

Similar to Semi-automatic ABC: A Concise Discussion

Discussion of Fearnhead and Prangle, RSS&lt; Dec. 14, 2011
Discussion of Fearnhead and Prangle, RSS&lt; Dec. 14, 2011Discussion of Fearnhead and Prangle, RSS&lt; Dec. 14, 2011
Discussion of Fearnhead and Prangle, RSS&lt; Dec. 14, 2011Christian Robert
 
Workshop on Bayesian Inference for Latent Gaussian Models with Applications
Workshop on Bayesian Inference for Latent Gaussian Models with ApplicationsWorkshop on Bayesian Inference for Latent Gaussian Models with Applications
Workshop on Bayesian Inference for Latent Gaussian Models with ApplicationsChristian Robert
 
Convergence of ABC methods
Convergence of ABC methodsConvergence of ABC methods
Convergence of ABC methodsChristian Robert
 
Colloquium in honor of Hans Ruedi Künsch
Colloquium in honor of Hans Ruedi KünschColloquium in honor of Hans Ruedi Künsch
Colloquium in honor of Hans Ruedi KünschChristian Robert
 
ABC short course: survey chapter
ABC short course: survey chapterABC short course: survey chapter
ABC short course: survey chapterChristian Robert
 
Asymptotics of ABC, lecture, Collège de France
Asymptotics of ABC, lecture, Collège de FranceAsymptotics of ABC, lecture, Collège de France
Asymptotics of ABC, lecture, Collège de FranceChristian Robert
 
random forests for ABC model choice and parameter estimation
random forests for ABC model choice and parameter estimationrandom forests for ABC model choice and parameter estimation
random forests for ABC model choice and parameter estimationChristian Robert
 
seminar at Princeton University
seminar at Princeton Universityseminar at Princeton University
seminar at Princeton UniversityChristian Robert
 
from model uncertainty to ABC
from model uncertainty to ABCfrom model uncertainty to ABC
from model uncertainty to ABCChristian Robert
 
Columbia workshop [ABC model choice]
Columbia workshop [ABC model choice]Columbia workshop [ABC model choice]
Columbia workshop [ABC model choice]Christian Robert
 

Similar to Semi-automatic ABC: A Concise Discussion (20)

Discussion of Fearnhead and Prangle, RSS&lt; Dec. 14, 2011
Discussion of Fearnhead and Prangle, RSS&lt; Dec. 14, 2011Discussion of Fearnhead and Prangle, RSS&lt; Dec. 14, 2011
Discussion of Fearnhead and Prangle, RSS&lt; Dec. 14, 2011
 
ABC workshop: 17w5025
ABC workshop: 17w5025ABC workshop: 17w5025
ABC workshop: 17w5025
 
Workshop on Bayesian Inference for Latent Gaussian Models with Applications
Workshop on Bayesian Inference for Latent Gaussian Models with ApplicationsWorkshop on Bayesian Inference for Latent Gaussian Models with Applications
Workshop on Bayesian Inference for Latent Gaussian Models with Applications
 
Convergence of ABC methods
Convergence of ABC methodsConvergence of ABC methods
Convergence of ABC methods
 
ABC-Gibbs
ABC-GibbsABC-Gibbs
ABC-Gibbs
 
Intro to ABC
Intro to ABCIntro to ABC
Intro to ABC
 
Colloquium in honor of Hans Ruedi Künsch
Colloquium in honor of Hans Ruedi KünschColloquium in honor of Hans Ruedi Künsch
Colloquium in honor of Hans Ruedi Künsch
 
ABC short course: survey chapter
ABC short course: survey chapterABC short course: survey chapter
ABC short course: survey chapter
 
BIRS 12w5105 meeting
BIRS 12w5105 meetingBIRS 12w5105 meeting
BIRS 12w5105 meeting
 
Asymptotics of ABC, lecture, Collège de France
Asymptotics of ABC, lecture, Collège de FranceAsymptotics of ABC, lecture, Collège de France
Asymptotics of ABC, lecture, Collège de France
 
ABC model choice
ABC model choiceABC model choice
ABC model choice
 
ABC-Gibbs
ABC-GibbsABC-Gibbs
ABC-Gibbs
 
random forests for ABC model choice and parameter estimation
random forests for ABC model choice and parameter estimationrandom forests for ABC model choice and parameter estimation
random forests for ABC model choice and parameter estimation
 
seminar at Princeton University
seminar at Princeton Universityseminar at Princeton University
seminar at Princeton University
 
from model uncertainty to ABC
from model uncertainty to ABCfrom model uncertainty to ABC
from model uncertainty to ABC
 
Edinburgh, Bayes-250
Edinburgh, Bayes-250Edinburgh, Bayes-250
Edinburgh, Bayes-250
 
Boston talk
Boston talkBoston talk
Boston talk
 
Intractable likelihoods
Intractable likelihoodsIntractable likelihoods
Intractable likelihoods
 
von Mises lecture, Berlin
von Mises lecture, Berlinvon Mises lecture, Berlin
von Mises lecture, Berlin
 
Columbia workshop [ABC model choice]
Columbia workshop [ABC model choice]Columbia workshop [ABC model choice]
Columbia workshop [ABC model choice]
 

More from Christian Robert

Workshop in honour of Don Poskitt and Gael Martin
Workshop in honour of Don Poskitt and Gael MartinWorkshop in honour of Don Poskitt and Gael Martin
Workshop in honour of Don Poskitt and Gael MartinChristian Robert
 
How many components in a mixture?
How many components in a mixture?How many components in a mixture?
How many components in a mixture?Christian Robert
 
Testing for mixtures at BNP 13
Testing for mixtures at BNP 13Testing for mixtures at BNP 13
Testing for mixtures at BNP 13Christian Robert
 
Inferring the number of components: dream or reality?
Inferring the number of components: dream or reality?Inferring the number of components: dream or reality?
Inferring the number of components: dream or reality?Christian Robert
 
Testing for mixtures by seeking components
Testing for mixtures by seeking componentsTesting for mixtures by seeking components
Testing for mixtures by seeking componentsChristian Robert
 
discussion on Bayesian restricted likelihood
discussion on Bayesian restricted likelihooddiscussion on Bayesian restricted likelihood
discussion on Bayesian restricted likelihoodChristian Robert
 
NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)Christian Robert
 
Coordinate sampler : A non-reversible Gibbs-like sampler
Coordinate sampler : A non-reversible Gibbs-like samplerCoordinate sampler : A non-reversible Gibbs-like sampler
Coordinate sampler : A non-reversible Gibbs-like samplerChristian Robert
 
Laplace's Demon: seminar #1
Laplace's Demon: seminar #1Laplace's Demon: seminar #1
Laplace's Demon: seminar #1Christian Robert
 
Likelihood-free Design: a discussion
Likelihood-free Design: a discussionLikelihood-free Design: a discussion
Likelihood-free Design: a discussionChristian Robert
 
CISEA 2019: ABC consistency and convergence
CISEA 2019: ABC consistency and convergenceCISEA 2019: ABC consistency and convergence
CISEA 2019: ABC consistency and convergenceChristian Robert
 
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment models
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment modelsa discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment models
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment modelsChristian Robert
 
ABC based on Wasserstein distances
ABC based on Wasserstein distancesABC based on Wasserstein distances
ABC based on Wasserstein distancesChristian Robert
 

More from Christian Robert (20)

Workshop in honour of Don Poskitt and Gael Martin
Workshop in honour of Don Poskitt and Gael MartinWorkshop in honour of Don Poskitt and Gael Martin
Workshop in honour of Don Poskitt and Gael Martin
 
discussion of ICML23.pdf
discussion of ICML23.pdfdiscussion of ICML23.pdf
discussion of ICML23.pdf
 
How many components in a mixture?
How many components in a mixture?How many components in a mixture?
How many components in a mixture?
 
restore.pdf
restore.pdfrestore.pdf
restore.pdf
 
Testing for mixtures at BNP 13
Testing for mixtures at BNP 13Testing for mixtures at BNP 13
Testing for mixtures at BNP 13
 
Inferring the number of components: dream or reality?
Inferring the number of components: dream or reality?Inferring the number of components: dream or reality?
Inferring the number of components: dream or reality?
 
CDT 22 slides.pdf
CDT 22 slides.pdfCDT 22 slides.pdf
CDT 22 slides.pdf
 
Testing for mixtures by seeking components
Testing for mixtures by seeking componentsTesting for mixtures by seeking components
Testing for mixtures by seeking components
 
discussion on Bayesian restricted likelihood
discussion on Bayesian restricted likelihooddiscussion on Bayesian restricted likelihood
discussion on Bayesian restricted likelihood
 
NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)
 
ABC-Gibbs
ABC-GibbsABC-Gibbs
ABC-Gibbs
 
Coordinate sampler : A non-reversible Gibbs-like sampler
Coordinate sampler : A non-reversible Gibbs-like samplerCoordinate sampler : A non-reversible Gibbs-like sampler
Coordinate sampler : A non-reversible Gibbs-like sampler
 
eugenics and statistics
eugenics and statisticseugenics and statistics
eugenics and statistics
 
Laplace's Demon: seminar #1
Laplace's Demon: seminar #1Laplace's Demon: seminar #1
Laplace's Demon: seminar #1
 
asymptotics of ABC
asymptotics of ABCasymptotics of ABC
asymptotics of ABC
 
Likelihood-free Design: a discussion
Likelihood-free Design: a discussionLikelihood-free Design: a discussion
Likelihood-free Design: a discussion
 
the ABC of ABC
the ABC of ABCthe ABC of ABC
the ABC of ABC
 
CISEA 2019: ABC consistency and convergence
CISEA 2019: ABC consistency and convergenceCISEA 2019: ABC consistency and convergence
CISEA 2019: ABC consistency and convergence
 
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment models
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment modelsa discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment models
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment models
 
ABC based on Wasserstein distances
ABC based on Wasserstein distancesABC based on Wasserstein distances
ABC based on Wasserstein distances
 

Semi-automatic ABC: A Concise Discussion

  • 1. Semi-automatic ABC: A Discussion. Christian P. Robert, Université Paris-Dauphine (CEREMADE), IuF, & CREST. Royal Statistical Society, Dec. 14.
  • 2. F&P’s ABC. Use of a summary statistic S(·), an importance proposal g(·), a kernel K(·) ≤ 1, and a bandwidth h > 0 such that (θ, y_sim) ∼ g(θ) f(y_sim|θ) is accepted with probability (hence the bound) K[{S(y_sim) − s_obs}/h], with the corresponding importance weight defined by π(θ)/g(θ).
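The accept/weight step on this slide can be sketched in a few lines of Python. The toy Gaussian model, the Gaussian kernel, and the choice g = π below are my illustrative assumptions, not F&P's:

```python
import numpy as np

rng = np.random.default_rng(0)

def abc_importance_sample(s_obs, prior_logpdf, g_sample, g_logpdf,
                          simulate, summary, kernel, h, n):
    """Draw (theta, y_sim) ~ g(theta) f(y|theta), accept with probability
    K[{S(y_sim) - s_obs}/h] (valid since K <= 1), weight survivors by pi/g."""
    thetas, weights = [], []
    for _ in range(n):
        theta = g_sample()
        y = simulate(theta)
        if rng.uniform() < kernel((summary(y) - s_obs) / h):
            thetas.append(theta)
            weights.append(np.exp(prior_logpdf(theta) - g_logpdf(theta)))
    return np.array(thetas), np.array(weights)

# Toy model (illustrative): theta ~ N(0,1), y ~ N(theta,1), S(y) = y,
# Gaussian kernel scaled so that K <= 1, proposal g = prior (weights = 1).
thetas, w = abc_importance_sample(
    s_obs=1.0,
    prior_logpdf=lambda t: -0.5 * t**2,
    g_sample=lambda: rng.normal(),
    g_logpdf=lambda t: -0.5 * t**2,
    simulate=lambda t: rng.normal(t, 1.0),
    summary=lambda y: y,
    kernel=lambda u: np.exp(-0.5 * u**2),
    h=0.3, n=20_000)
post_mean = np.average(thetas, weights=w)   # approximates E[theta | s_obs]
```

For small h this weighted sample targets π(θ|s_obs); here, with s_obs = 1, the exact posterior mean E(θ|y_obs = 1) is 0.5, and post_mean lands nearby.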
  • 3. Errors, errors, and errors. Three levels of approximation: (a) π(θ|y_obs) by π(θ|s_obs), a loss of information [ignored]; (b) π(θ|s_obs) by π_ABC(θ|s_obs) = ∫ π(s) K[{s − s_obs}/h] π(θ|s) ds / ∫ π(s) K[{s − s_obs}/h] ds, i.e. noisy observations; (c) π_ABC(θ|s_obs) by importance Monte Carlo based on N simulations, with error represented by var(a(θ)|s_obs)/N_acc [expected number of acceptances]. [M. Twain/B. Disraeli]
  • 4. Average acceptance asymptotics. For the average acceptance probability/approximate likelihood p(θ|s_obs) = ∫ f(y_sim|θ) K[{S(y_sim) − s_obs}/h] dy_sim, the overall acceptance probability is p(s_obs) = ∫ p(θ|s_obs) π(θ) dθ = π(s_obs) h^d + o(h^d). [Lemma 1]
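The h^d scaling in Lemma 1 can be checked empirically. In the toy sketch below (my choice of model and kernel: θ ∼ N(0,1), y ∼ N(θ,1), S(y) = y so d = 1, uniform kernel), halving h should roughly halve the acceptance rate:

```python
import numpy as np

rng = np.random.default_rng(1)

def acceptance_rate(h, n=400_000, s_obs=0.0):
    """Overall ABC acceptance probability p(s_obs) under a uniform kernel
    K(u) = 1{|u| <= 1}, estimated from n prior-predictive draws."""
    theta = rng.normal(size=n)
    s = rng.normal(theta, 1.0)          # marginal of s is N(0, 2)
    return np.mean(np.abs(s - s_obs) <= h)

# p(s_obs) = c * pi(s_obs) * h^d + o(h^d), so with d = 1 the ratio of
# acceptance rates at h and h/2 should be close to 2.
p1, p2 = acceptance_rate(0.4), acceptance_rate(0.2)
ratio = p1 / p2
```

The ratio converges to 2^d as h shrinks, which is why a small bandwidth (or a high-dimensional summary) makes the Monte Carlo scheme expensive.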
  • 5. Calibration of h. “This result gives insight into how S(·) and h affect the Monte Carlo error. To minimize Monte Carlo error, we need h^d to be not too small. Thus ideally we want S(·) to be a low dimensional summary of the data that is sufficiently informative about θ that π(θ|s_obs) is close, in some sense, to π(θ|y_obs)” (p.5). This turns h into an absolute value while it should be context-dependent and user-calibrated; it only addresses one term in the approximation error and acceptance probability; a large h prevents π_ABC(θ|s_obs) from being close to π(θ|s_obs); a small d prevents π(θ|s_obs) from being close to π(θ|y_obs).
  • 6. Calibrating ABC. “If π_ABC is calibrated, then this means that probability statements that are derived from it are appropriate, and in particular that we can use π_ABC to quantify uncertainty in estimates” (p.5).
  • 7. Calibrating ABC (cont’d). Definition: for 0 < q < 1 and a subset A, fix [one specific?/all?] event E_q(A) with Pr_ABC(θ ∈ E_q(A) | s_obs) = q. Then ABC is calibrated if Pr(θ ∈ A | E_q(A)) = q. The meaning of conditioning on E_q(A) is unclear.
  • 8. Calibrated ABC. Theorem: noisy ABC, where s_obs = S(y_obs) + hε with ε ∼ K(·), is calibrated. [Wilkinson, 2008] No condition on h.
  • 9. Calibrated ABC (cont’d). Consequence: when h = ∞, the prior distribution is always calibrated (Theorem). Is this then a relevant property?
  • 10. Converging ABC. Theorem: for noisy ABC, the expected noisy-ABC log-likelihood, E{log[p(θ|s_obs)]} = ∫∫ log[p(θ|S(y_obs) + ε)] π(y_obs|θ0) K(ε) dy_obs dε, has its maximum at θ = θ0. True for any choice of summary statistic? Even ancillary statistics?! [Imposes at least identifiability...] Relevant in asymptotia, not for the data at hand.
  • 11. Converging ABC (cont’d). Corollary: for noisy ABC, the ABC posterior converges to a point mass at the true parameter value as m → ∞. For standard ABC, this is not always the case (unless h goes to zero). Strength of regularity conditions (c1) and (c2) in Bernardo & Smith (1994)? [out-of-reach constraints on likelihood and posterior] Again, there must be conditions imposed upon the summary statistic...
  • 12. Loss motivated statistic. Under a quadratic loss function, Theorem: (i) the minimal posterior error E[L(θ, θ̂)|y_obs] occurs when θ̂ = E(θ|y_obs) (!); (ii) when h → 0, E_ABC(θ|s_obs) converges to E(θ|y_obs); (iii) if S(y_obs) = E[θ|y_obs] then, for θ̂ = E_ABC[θ|s_obs], E[L(θ, θ̂)|y_obs] = trace(AΣ) + h² ∫ xᵀ A x K(x) dx + o(h²). Measure-theoretic difficulties? The dependence of s_obs on h makes me uncomfortable. Relevant for the choice of K?
  • 13. Optimal summary statistic. “We take a different approach, and weaken the requirement for π_ABC to be a good approximation to π(θ|y_obs). We argue for π_ABC to be a good approximation solely in terms of the accuracy of certain estimates of the parameters.” (p.5) From this result, F&P derive their choice of summary statistic, S(y) = E(θ|y) [almost sufficient], and suggest h = O(N^{−1/(2+d)}) and h = O(N^{−1/(4+d)}) as optimal bandwidths for noisy and standard ABC.
  • 14. Optimal summary statistic (cont’d). From this result, F&P derive their choice of summary statistic, S(y) = E(θ|y) [wow! E_ABC[θ|S(y_obs)] = E[θ|y_obs]], and suggest h = O(N^{−1/(2+d)}) and h = O(N^{−1/(4+d)}) as optimal bandwidths for noisy and standard ABC.
  • 15. Caveat. Since E(θ|y_obs) is usually unavailable, F&P suggest to (i) use a pilot run of ABC to determine a region of non-negligible posterior mass; (ii) simulate sets of parameter values and data; (iii) use the simulated sets of parameter values and data to estimate the summary statistic; and (iv) run ABC with this choice of summary statistic.
  • 16. Caveat (cont’d). Where is the assessment of the first-stage error?
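F&P's four-step recipe can be sketched end to end. Every concrete choice below (the Gaussian model, the pilot window, the quantile cut, the tolerances) is an illustrative assumption of mine, not a prescription from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
y_obs = 1.2                       # toy data: theta ~ N(0,1), y ~ N(theta, 1)
simulate = lambda theta: rng.normal(theta, 1.0)

# (i) pilot ABC run with a crude summary (here the raw datum) to locate
#     a region of non-negligible posterior mass
theta0 = rng.normal(size=50_000)
keep = np.abs(simulate(theta0) - y_obs) < 0.5
lo, hi = np.quantile(theta0[keep], [0.005, 0.995])

# (ii) simulate parameter/data pairs over the pilot region
t_train = rng.uniform(lo, hi, size=20_000)
y_train = simulate(t_train)

# (iii) estimate the summary statistic S(y) ~ E(theta | y) by linear
#       regression of the simulated parameters on features f(y) = (1, y)
X = np.column_stack([np.ones_like(y_train), y_train])
beta, *_ = np.linalg.lstsq(X, t_train, rcond=None)
S = lambda y: beta[0] + beta[1] * y

# (iv) run ABC from the prior with the fitted summary statistic
t2 = rng.normal(size=200_000)
accepted = t2[np.abs(S(simulate(t2)) - S(y_obs)) < 0.05]
post_mean = accepted.mean()       # exact posterior mean here is y_obs/2 = 0.6
```

In this conjugate toy model the target E(θ|y_obs) = 0.6 is known, so the output can be checked; in realistic settings the regression error of step (iii) propagates unassessed, which is precisely the caveat raised here.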
  • 17. Approximating the summary statistic. As in Beaumont et al. (2002) and Blum and François (2010), F&P use a linear regression to approximate E(θ|y_obs), fitted on simulated pairs (θ^(i), y^(i)): θ^(i) = β0 + β f(y^(i)) + ε_i. [Further justifications?]
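A minimal sketch of this regression step, in a conjugate toy model of my choosing where E(θ|y) = y/2 is known in closed form, so the accuracy of the fitted summary can be read directly off the coefficients:

```python
import numpy as np

rng = np.random.default_rng(3)

# Training pairs from the prior predictive: theta ~ N(0,1), y ~ N(theta,1).
theta = rng.normal(size=100_000)
y = rng.normal(theta, 1.0)

# Least-squares fit of theta_i = b0 + b1 f(y_i) + eps_i with f(y) = y
X = np.column_stack([np.ones_like(y), y])
(b0, b1), *_ = np.linalg.lstsq(X, theta, rcond=None)

S = lambda y: b0 + b1 * y     # fitted summary, approximates E(theta | y)
# Here E(theta|y) = y/2, so b1 should be near 0.5 and b0 near 0.
```

When E(θ|y) is nonlinear in y, the quality of this approximation rests entirely on the choice of features f(·), which is one place where "further justifications" seem needed.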
  • 18. Questions. Dependence on h and S(·) in the early stage; reduction of Bayesian inference to point estimation; approximation error in step (iii) not accounted for; not parameterisation invariant; practice shows that a proper approximation to the genuine posterior distribution stems from using a (much) larger number of summary statistics than the dimension of the parameter; the validity of the approximation to the optimal summary statistic depends on the quality of the pilot run; important inferential issues like model choice are not covered by this approach.
  • 19. envoi