ABC Methods for Bayesian Model Choice

                                        Christian P. Robert

                                Université Paris-Dauphine, IuF, & CREST
                               http://www.ceremade.dauphine.fr/~xian


                Joint work(s) with Jean-Marie Cornuet, Aude Grelaud,
                 Jean-Michel Marin, Natesh Pillai, & Judith Rousseau
Outline

      Approximate Bayesian computation

      ABC for model choice

      Gibbs random fields

      Generic ABC model choice

      Model choice consistency
Approximate Bayesian computation

Regular Bayesian computation issues


      When faced with a non-standard posterior distribution

                                        π(θ|y) ∝ π(θ)L(θ|y)

      the standard solution is to use simulation (Monte Carlo) to
      produce a sample
                                   θ1 , . . . , θT
      from π(θ|y) (or approximately by Markov chain Monte Carlo
      methods)
                                              [Robert & Casella, 2004]
Intractable likelihoods



      Cases when the likelihood function f(y|θ) is unavailable and when
      the completion step

                          f(y|θ) = ∫_Z f(y, z|θ) dz

      is impossible or too costly because of the dimension of z
      ⇒ MCMC cannot be implemented!
The ABC method

      Bayesian setting: target is π(θ)f(x|θ)
      When likelihood f(x|θ) not in closed form, likelihood-free rejection
      technique:
      ABC algorithm
      For an observation y ∼ f(y|θ), under the prior π(θ), keep jointly
      simulating
                           θ′ ∼ π(θ) ,   z ∼ f(z|θ′) ,
      until the auxiliary variable z is equal to the observed value, z = y.

                                        [Rubin, 1984; Tavaré et al., 1997]
A as approximative

      When y is a continuous random variable, equality z = y is replaced
      with a tolerance condition,

                              ρ(y, z) ≤ ε

      where ρ is a distance and ε > 0 a tolerance level
      Output distributed from

                π(θ) Pθ {ρ(y, z) < ε} ∝ π(θ | ρ(y, z) < ε)
ABC algorithm

      Algorithm 1 Likelihood-free rejection sampler
        for i = 1 to N do
          repeat
             generate θ′ from the prior distribution π(·)
             generate z from the likelihood f(·|θ′)
          until ρ{η(z), η(y)} ≤ ε
          set θi = θ′
        end for

      where η(y) defines a (possibly insufficient) statistic
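
      A minimal likelihood-free rejection sampler in the spirit of Algorithm 1 is
      sketched below; the Normal toy model, the N(0, 10^2) prior and the choice of
      the sample mean as η(·) are illustrative assumptions, not part of the slides.

import numpy as np

def abc_rejection(y, prior_rvs, simulate, summary, distance, eps, N, rng):
    """Collect N draws theta' ~ pi(.) whose pseudo-data z ~ f(.|theta')
    satisfy rho{eta(z), eta(y)} <= eps (Algorithm 1)."""
    eta_y = summary(y)
    accepted = []
    while len(accepted) < N:
        theta = prior_rvs(rng)                # theta' ~ pi(.)
        z = simulate(theta, rng, y.size)      # z ~ f(.|theta')
        if distance(summary(z), eta_y) <= eps:
            accepted.append(theta)
    return np.array(accepted)

# Illustrative toy run: N(theta, 1) data, N(0, 10^2) prior, eta = sample mean.
rng = np.random.default_rng(0)
y = rng.normal(1.5, 1.0, size=50)
abc_sample = abc_rejection(
    y,
    prior_rvs=lambda r: r.normal(0.0, 10.0),
    simulate=lambda th, r, n: r.normal(th, 1.0, size=n),
    summary=np.mean,
    distance=lambda a, b: abs(a - b),
    eps=0.1, N=200, rng=rng)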
Output

      The likelihood-free algorithm samples from the marginal in z of:

            π_ε(θ, z|y) = π(θ) f(z|θ) I_{A_{ε,y}}(z) / ∫_{A_{ε,y} × Θ} π(θ) f(z|θ) dz dθ ,

      where A_{ε,y} = {z ∈ D | ρ(η(z), η(y)) < ε}.
      The idea behind ABC is that the summary statistics coupled with a
      small tolerance should provide a good approximation of the
      posterior distribution:

            π_ε(θ|y) = ∫ π_ε(θ, z|y) dz ≈ π(θ|η(y)) .

                                                              [Not guaranteed!]
ABC for model choice


Bayesian model choice

      Principle
      Several models
                                         M1 , M2 , . . .
      are considered simultaneously for dataset y and model index M
      central to inference.
      Use of a prior π(M = m), plus a prior distribution on the
      parameter conditional on the value m of the model index, πm (θ m )
      Goal is to derive the posterior distribution of M,

                                        π(M = m|data)

      a challenging computational target when models are complex.
Generic ABC for model choice

      Algorithm 2 Likelihood-free model choice sampler (ABC-MC)
        for t = 1 to T do
          repeat
             Generate m from the prior π(M = m)
             Generate θm from the prior πm (θm )
             Generate z from the model fm (z|θm )
          until ρ{η(z), η(y)} < ε
          Set m(t) = m and θ(t) = θm
        end for

                                        [Grelaud & al., 2009; Toni & al., 2009]
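
      As an illustration, here is a sketch of Algorithm 2 for the Poisson-versus-geometric
      comparison used later in the talk, with η(y) = Σ yi; the priors (Exp(1) on λ,
      U(0, 1) on p), the tolerance and the simulated dataset are assumptions for the
      example only.

import numpy as np

def abc_model_choice(y, eps, T, rng):
    """ABC-MC: return T accepted model indices (1 = Poisson, 2 = geometric)."""
    eta_y = y.sum()                            # eta(y) = sum of the observations
    n = y.size
    accepted = []
    while len(accepted) < T:
        m = rng.integers(1, 3)                 # m ~ uniform prior on {1, 2}
        if m == 1:
            lam = rng.exponential(1.0)         # lambda ~ Exp(1) under M1
            z = rng.poisson(lam, size=n)
        else:
            p = rng.uniform()                  # p ~ U(0, 1) under M2
            z = rng.geometric(p, size=n) - 1   # geometric supported on {0, 1, 2, ...}
        if abs(z.sum() - eta_y) <= eps:        # rho{eta(z), eta(y)} < eps
            accepted.append(m)
    return np.array(accepted)

rng = np.random.default_rng(1)
y = rng.poisson(2.0, size=30)
ms = abc_model_choice(y, eps=2, T=1000, rng=rng)
post_prob_M1 = np.mean(ms == 1)   # frequency-of-acceptance estimate of pi(M = 1|y)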
ABC estimates

      Posterior probability π(M = m|y) approximated by the frequency
      of acceptances from model m

                          (1/T) Σ_{t=1}^{T} I_{m(t) = m} .

      Early issues with implementation:
             should tolerances ε be the same for all models?
             should summary statistics vary across models? incl. their
             dimension?
             should the distance measure ρ vary across models?
        Extension to a weighted polychotomous logistic regression
      estimate of π(M = m|y), with non-parametric kernel weights
                                         [Cornuet et al., DIYABC, 2009]
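
      A rough sketch of such a kernel-weighted logistic correction is given below; it
      assumes the simulated pairs (m, η(z)) were stored during the ABC-MC run, and the
      Epanechnikov kernel, bandwidth and scalar summary are illustrative choices, not
      the DIYABC implementation itself.

import numpy as np
from sklearn.linear_model import LogisticRegression

def logistic_correction(models, etas, eta_y, bandwidth):
    """Local (polychotomous) logistic regression of the model index on
    eta(z) - eta(y), with Epanechnikov kernel weights, evaluated at eta(y)."""
    d = np.abs(etas - eta_y)
    w = np.clip(1.0 - (d / bandwidth) ** 2, 0.0, None)   # kernel weights
    keep = w > 0                   # requires simulations from >= 2 models kept
    X = (etas[keep] - eta_y).reshape(-1, 1)
    fit = LogisticRegression().fit(X, models[keep], sample_weight=w[keep])
    return dict(zip(fit.classes_, fit.predict_proba([[0.0]])[0]))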
Gibbs random fields

Potts model

      Potts model
      Distribution with an energy function of the form

                      θ S(y) = θ Σ_{l∼i} δ_{y_l = y_i}

      where l∼i denotes a neighbourhood structure

      In most realistic settings, summation

                      Z_θ = Σ_{x∈X} exp{θ^T S(x)}

      involves too many terms to be manageable and numerical
      approximations cannot always be trusted
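
      For concreteness, a brute-force illustration of S(x) and Z_θ on a tiny two-colour,
      4-neighbour grid (a sketch; the grid size and number of colours are assumptions):

import numpy as np
from itertools import product

def potts_S(x):
    """S(x): number of horizontal/vertical neighbour pairs sharing the same colour."""
    return int(np.sum(x[:, :-1] == x[:, 1:]) + np.sum(x[:-1, :] == x[1:, :]))

def brute_force_Z(theta, shape=(3, 3), n_colours=2):
    """Exact Z_theta by enumerating all n_colours**(rows*cols) configurations;
    already 2**9 = 512 terms here, and hopeless for realistic lattices."""
    Z = 0.0
    for cfg in product(range(n_colours), repeat=shape[0] * shape[1]):
        Z += np.exp(theta * potts_S(np.array(cfg).reshape(shape)))
    return Z

Z = brute_force_Z(theta=0.5)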
Neighbourhood relations

      Setup
      Choice to be made between M neighbourhood relations

                      i ∼_m i′        (0 ≤ m ≤ M − 1)

      with
                      S_m(x) = Σ_{i ∼_m i′} I{x_i = x_{i′}}

      driven by the posterior probabilities of the models.
Model index

      Computational target:

            P(M = m|x) ∝ ∫_{Θm} fm (x|θm ) πm (θm ) dθm  π(M = m)

      If S(x) sufficient statistic for the joint parameters
      (M, θ0 , . . . , θM −1 ),

                      P(M = m|x) = P(M = m|S(x)) .
Sufficient statistics in Gibbs random fields

      Each model m has its own sufficient statistic Sm (·) and
      S(·) = (S0 (·), . . . , SM −1 (·)) is also (model-)sufficient.
      Explanation: For Gibbs random fields,

           x|M = m ∼ fm (x|θm ) = fm^1 (x|S(x)) fm^2 (S(x)|θm )
                                = [1/n(S(x))] fm^2 (S(x)|θm )

      where
                     n(S(x)) = #{x̃ ∈ X : S(x̃) = S(x)}
      ⇒ S(x) is sufficient for the joint parameters
Generic ABC model choice

More about sufficiency


             ‘Sufficient statistics for individual models are unlikely to
             be very informative for the model probability. This is
             already well known and understood by the ABC-user
             community.’
                                          [Scott Sisson, Jan. 31, 2011, ’Og]

      If η1 (x) sufficient statistic for model m = 1 and parameter θ1 and
      η2 (x) sufficient statistic for model m = 2 and parameter θ2 ,
      (η1 (x), η2 (x)) is not always sufficient for (m, θm )
      ⇒ Potential loss of information at the testing level
Limiting behaviour of B12 (T → ∞)

      ABC approximation

         B12(y) = Σ_{t=1}^{T} I_{mt = 1} I_{ρ{η(zt), η(y)} ≤ ε}  /  Σ_{t=1}^{T} I_{mt = 2} I_{ρ{η(zt), η(y)} ≤ ε} ,

      where the (mt , zt )'s are simulated from the (joint) prior
      As T goes to infinity, limit

         B12^ε(y) = ∫ I_{ρ{η(z), η(y)} ≤ ε} π1 (θ1 ) f1 (z|θ1 ) dz dθ1  /  ∫ I_{ρ{η(z), η(y)} ≤ ε} π2 (θ2 ) f2 (z|θ2 ) dz dθ2
                  = ∫ I_{ρ{η, η(y)} ≤ ε} π1 (θ1 ) f1^η (η|θ1 ) dη dθ1  /  ∫ I_{ρ{η, η(y)} ≤ ε} π2 (θ2 ) f2^η (η|θ2 ) dη dθ2 ,

      where f1^η (η|θ1 ) and f2^η (η|θ2 ) denote the distributions of η(z)
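
      In code, this ABC approximation of B12 is just the ratio of acceptance counts;
      a sketch reusing the accepted indices ms from the ABC-MC sketch above (equal
      prior model probabilities assumed):

import numpy as np

def abc_bayes_factor(ms):
    """Ratio of acceptances from model 1 over model 2 (equal prior weights);
    undefined when no draw from model 2 is accepted -- increase T or eps."""
    n1, n2 = np.sum(ms == 1), np.sum(ms == 2)
    return n1 / n2

B12_hat = abc_bayes_factor(ms)   # Monte Carlo estimate of the epsilon-ABC Bayes factor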
Limiting behaviour of B12 (ε → 0)

      When ε goes to zero,

         B12^η(y) = ∫ π1 (θ1 ) f1^η (η(y)|θ1 ) dθ1  /  ∫ π2 (θ2 ) f2^η (η(y)|θ2 ) dθ2

      ⇒ Bayes factor based on the sole observation of η(y)
Limiting behaviour of B12 (under sufficiency)

      If η(y) sufficient statistic in both models,

                      fi (y|θi ) = gi (y) fi^η (η(y)|θi )

      Thus

         B12(y) = ∫_{Θ1} π(θ1 ) g1 (y) f1^η (η(y)|θ1 ) dθ1  /  ∫_{Θ2} π(θ2 ) g2 (y) f2^η (η(y)|θ2 ) dθ2
                = [g1 (y)/g2 (y)] · ∫ π1 (θ1 ) f1^η (η(y)|θ1 ) dθ1  /  ∫ π2 (θ2 ) f2^η (η(y)|θ2 ) dθ2
                = [g1 (y)/g2 (y)] B12^η(y) .

                                         [Didelot, Everitt, Johansen & Lawson, 2011]
      ⇒ No discrepancy only when cross-model sufficiency
Poisson/geometric example

      Sample
                          x = (x1 , . . . , xn )
      from either a Poisson P(λ) or from a geometric G(p)
      Sum
                          S = Σ_{i=1}^{n} xi = η(x)

      sufficient statistic for either model but not simultaneously
      Discrepancy ratio

               g1 (x)/g2 (x) = [S! n^{−S} / Π_i xi !]  /  [1 / C(n+S−1, S)]
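
      A numeric sketch of this ratio, using log-gamma functions for the factorials and
      the binomial coefficient (the Poisson-simulated sample is only illustrative):

import numpy as np
from math import lgamma, log, exp

def log_discrepancy_ratio(x):
    """log g1(x)/g2(x) = log[S! n^-S / prod_i x_i!] - log[1 / C(n+S-1, S)]."""
    n, S = len(x), int(sum(x))
    log_g1 = lgamma(S + 1) - S * log(n) - sum(lgamma(xi + 1) for xi in x)
    log_g2 = -(lgamma(n + S) - lgamma(S + 1) - lgamma(n))   # = -log C(n+S-1, S)
    return log_g1 - log_g2

rng = np.random.default_rng(2)
x = rng.poisson(2.0, size=20)
ratio = exp(log_discrepancy_ratio(x))   # depends on x well beyond S = eta(x)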
Poisson/geometric discrepancy

      Range of B12(x) versus B12^η(x): the values produced have
      nothing in common.
Formal recovery

      Creating an encompassing exponential family

         f(x|θ1 , θ2 , α1 , α2 ) ∝ exp{θ1^T η1 (x) + θ2^T η2 (x) + α1 t1 (x) + α2 t2 (x)}

      leads to a sufficient statistic (η1 (x), η2 (x), t1 (x), t2 (x))
                             [Didelot, Everitt, Johansen & Lawson, 2011]

      In the Poisson/geometric case, if Π_i xi ! is added to S, no
      discrepancy
      Only applies in genuine sufficiency settings...
      ⇒ Inability to evaluate loss brought by summary statistics
Meaning of the ABC-Bayes factor

             'This is also why focus on model discrimination typically
             (...) proceeds by (...) accepting that the Bayes Factor
             that one obtains is only derived from the summary
             statistics and may in no way correspond to that of the
             full model.'
                                                 [Scott Sisson, Jan. 31, 2011, 'Og]

      In the Poisson/geometric case, if E[yi ] = θ0 > 0,

                 lim_{n→∞} B12^η(y) = (θ0 + 1)^2 e^{−θ0} / θ0
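
      A quick numeric check of this limit (a sketch; the grid of θ0 values is arbitrary):
      for every fixed θ0 > 0 the limit is a finite nonzero constant, so the
      summary-based Bayes factor cannot diverge to 0 or ∞ and cannot discriminate
      between the two models asymptotically.

import numpy as np

theta0 = np.linspace(0.1, 10.0, 100)
limit = (theta0 + 1) ** 2 * np.exp(-theta0) / theta0
# Each value is a finite, strictly positive constant: the summary-based Bayes
# factor settles there instead of going to 0 or infinity as n grows.
print(limit[::10])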
MA example

      [Figure: bar plots of the frequencies of visits to models MA(1) and MA(2) at
      four tolerance levels]

      Evolution [against ε] of ABC Bayes factor, in terms of frequencies of
      visits to models MA(1) (left) and MA(2) (right) when ε equal to
      10, 1, .1, .01% quantiles on insufficient autocovariance distances. Sample
      of 50 points from a MA(2) with θ1 = 0.6, θ2 = 0.2. True Bayes factor
      equal to 17.71.
MA example

      [Figure: bar plots of the frequencies of visits to models MA(1) and MA(2) at
      four tolerance levels]

      Evolution [against ε] of ABC Bayes factor, in terms of frequencies of
      visits to models MA(1) (left) and MA(2) (right) when ε equal to
      10, 1, .1, .01% quantiles on insufficient autocovariance distances. Sample
      of 50 points from a MA(1) model with θ1 = 0.6. True Bayes factor B21
      equal to .004.
A population genetics evaluation


      Population genetics example with
             3 populations
             2 scenarios
             15 individuals
             5 loci
             single mutation parameter
             24 summary statistics
             2 million ABC proposals
             importance [tree] sampling alternative
Stability of importance sampling

      [Figure: replicate importance-sampling estimates, plotted on a 0–1 probability
      scale]
Comparison with ABC
      Use of 24 summary statistics and DIY-ABC logistic correction

      [Figure: scatter plot of ABC direct and logistic estimates (y-axis) against
      importance sampling estimates (x-axis), both on the (0, 1) scale]
Comparison with ABC
      Use of 15 summary statistics and DIY-ABC logistic correction

      [Figure: scatter plot of ABC direct estimates (y-axis) against importance
      sampling estimates (x-axis), axes from −4 to 6]
Comparison with ABC
      Use of 15 summary statistics and DIY-ABC logistic correction

      [Figure: scatter plot of ABC direct and logistic estimates (y-axis) against
      importance sampling estimates (x-axis), axes from −4 to 6]
The only safe cases???



      Besides specific models like Gibbs random fields,
      using distances over the data itself escapes the discrepancy...
                               [Toni & Stumpf, 2010; Sousa & al., 2009]

      ...and so does the use of more informal model fitting measures
                                                 [Ratmann & al., 2009]
ABC model choice consistency


Formalised framework

The starting point

      Central question to the validation of ABC for model choice:

          When is a Bayes factor based on an insufficient statistic
                            T (y) consistent?

      Note: the conclusion drawn on T (y) through B12^T (y) necessarily differs
      from the conclusion drawn on y through B12 (y)
A benchmark if toy example

      Comparison suggested by referee of PNAS paper [thanks]:
                                 [X, Cornuet, Marin, & Pillai, Aug. 2011]
      Model M1 : y ∼ N (θ1 , 1) opposed to model M2 : y ∼ L(θ2 , 1/√2),
      Laplace distribution with mean θ2 and scale parameter 1/√2
      (variance one).
      Four possible statistics (see the sketch below)
         1. sample mean ȳ (sufficient for M1 but not for M2 );
         2. sample median med(y) (insufficient);
         3. sample variance var(y) (ancillary);
         4. median absolute deviation mad(y) = med(|y − med(y)|);
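
      The four candidate statistics are straightforward to compute; a minimal sketch
      (the simulated Normal sample is purely illustrative):

import numpy as np

def candidate_statistics(y):
    med = np.median(y)
    return {
        "mean": np.mean(y),                 # sufficient under M1, not under M2
        "median": med,                      # insufficient for both models
        "variance": np.var(y, ddof=1),      # ancillary: both models have variance 1
        "mad": np.median(np.abs(y - med)),  # median absolute deviation
    }

rng = np.random.default_rng(3)
y = rng.normal(0.0, 1.0, size=100)
stats = candidate_statistics(y)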
A benchmark if toy example

      [Figure: densities of the resulting posterior probability approximations for this
      comparison (left panel x-axis: posterior probability; right panel x-axis:
      probability)]
Framework

      Starting from sample

                              y = (y1 , . . . , yn )

      the observed sample, not necessarily iid with true distribution

                                    y ∼ Pn

      Summary statistics

              T (y) = T^n = (T1 (y), T2 (y), · · · , Td (y)) ∈ R^d

      with true distribution T^n ∼ Gn .
Framework

      ⇒ Comparison of
          – under M1 , y ∼ F1,n (·|θ1 ) where θ1 ∈ Θ1 ⊂ R^{p1}
          – under M2 , y ∼ F2,n (·|θ2 ) where θ2 ∈ Θ2 ⊂ R^{p2}
      turned into
          – under M1 , T (y) ∼ G1,n (·|θ1 ), and θ1 |T (y) ∼ π1 (·|T^n )
          – under M2 , T (y) ∼ G2,n (·|θ2 ), and θ2 |T (y) ∼ π2 (·|T^n )
Consistency results

Assumptions

      A collection of asymptotic "standard" assumptions:
      [A1] There exists a sequence {vn } converging to +∞,
      an a.c. distribution Q with continuous bounded density q(·),
      a symmetric, d × d positive definite matrix V0
      and a vector µ0 ∈ R^d such that

              vn V0^{−1/2} (T^n − µ0 ) ⇝ Q   as n → ∞, under Gn

      and for all M > 0

          sup_{vn |t−µ0 | < M}  | |V0 |^{1/2} vn^{−d} gn (t) − q( vn V0^{−1/2} {t − µ0 } ) |  = o(1)
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Consistency results


Assumptions




      A collection of asymptotic “standard” assumptions:
      [A2] For i = 1, 2, there exist d × d symmetric positive definite matrices
      Vi (θi ) and vectors µi (θi ) ∈ Rd such that

                     vn Vi (θi )^(−1/2) (T^n − µi (θi )) → Q as n → ∞, under Gi,n (·|θi ) .
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Consistency results


Assumptions


      A collection of asymptotic “standard” assumptions:
      [A3] For i = 1, 2, there exist sets Fn,i ⊂ Θi and constants εi , τi , αi > 0
      such that for all τ > 0,

              sup_{θi ∈ Fn,i}  Gi,n ( |T^n − µ(θi )| > τ { |µi (θi ) − µ0 | ∧ εi }  |  θi )
                     ≲ vn^(−αi) ( |µi (θi ) − µ0 | ∧ εi )^(−αi)

       with
                            πi (Fn,i^c) = o( vn^(−τi) ) .
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Consistency results


Assumptions



      A collection of asymptotic “standard” assumptions:
      [A4] For u > 0, define

                     Sn,i (u) = { θi ∈ Fn,i ; |µ(θi ) − µ0 | ≤ u vn^(−1) } ;

      if inf{|µi (θi ) − µ0 |; θi ∈ Θi } = 0, there exist constants di < τi ∧ αi − 1
      such that
                     πi (Sn,i (u)) ∼ u^di vn^(−di) ,   ∀ u ≤ vn
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Consistency results


Assumptions

      A collection of asymptotic “standard” assumptions:
      [A5] If inf{|µi (θi ) − µ0 |; θi ∈ Θi } = 0, there exists U > 0 such that for
      any M > 0,

              sup_{vn |t−µ0| < M}  sup_{θi ∈ Sn,i (U )}  |  |Vi (θi )|^(1/2) vn^(−d) gi (t|θi )
                     − q( vn Vi (θi )^(−1/2) (t − µ(θi )) )  |  = o(1)

       and

              lim_{M →∞} lim sup_n  πi ( Sn,i (U ) ∩ { ||Vi (θi )^(−1)|| + ||Vi (θi )|| > M } ) / πi (Sn,i (U )) = 0 .
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Consistency results


Assumptions


      A collection of asymptotic “standard” assumptions:
      [A1]–[A2] are standard central limit theorems ([A1] redundant
      when one model is “true”)
      [A3] controls the large deviations of the estimator T n from the
      estimand µ(θ)
      [A4] is the standard prior mass condition found in Bayesian
      asymptotics (di effective dimension of the parameter)
      [A5] controls convergence more tightly, especially when µi is not
      one-to-one
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Consistency results


Effective dimension




      [A4] Understanding d1 , d2 : defined only when
      µ0 ∈ {µi (θi ), θi ∈ Θi },

                          πi ( θi : |µi (θi ) − µ0 | < n^(−1/2) ) = O( n^(−di /2) )

      is the effective dimension of the model Θi around µ0
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Consistency results


Asymptotic marginals

      Asymptotically, under [A1]–[A5]

                            mi (t) = ∫_{Θi} gi (t|θi ) πi (θi ) dθi

      is such that
      (i) if inf{|µi (θi ) − µ0 |; θi ∈ Θi } = 0,

                            Cl vn^(d−di) ≤ mi (T^n ) ≤ Cu vn^(d−di)

      and
      (ii) if inf{|µi (θi ) − µ0 |; θi ∈ Θi } > 0,

                            mi (T^n ) = o_{Pn} [ vn^(d−τi) + vn^(d−αi) ] .
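      A heuristic behind (i), not spelled out in the slides: by [A5] the density of T^n
      under θi behaves like gi (t|θi ) ≈ vn^d |Vi (θi )|^(−1/2) q( vn Vi (θi )^(−1/2) (t − µi (θi )) ),
      which is of order vn^d when |t − µi (θi )| = O(1/vn ) and negligible otherwise. Since
      T^n ≈ µ0 , integrating over πi only retains { θi : |µi (θi ) − µ0 | ≲ 1/vn }, a set of
      prior mass of order vn^(−di) by [A4]; hence mi (T^n ) ≍ vn^(d−di) .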
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Consistency results


Within-model consistency



      Under the same assumptions, if inf{|µi (θi ) − µ0 |; θi ∈ Θi } = 0, the
      posterior distribution of µi (θi ) given T n is consistent at rate 1/vn
      provided αi ∧ τi > di .
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Consistency results


Within-model consistency



      Under the same assumptions, if inf{|µi (θi ) − µ0 |; θi ∈ Θi } = 0, the
      posterior distribution of µi (θi ) given T n is consistent at rate 1/vn
      provided αi ∧ τi > di .
      Note: di can truly be seen as an effective dimension of the model
      under the posterior πi (.|T n ), since if µ0 ∈ {µi (θi ); θi ∈ Θi },

                                        mi (T^n ) ∼ vn^(d−di)
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Consistency results


Between-model consistency




      A consequence of the above is that the asymptotic behaviour of the Bayes
      factor is driven by the asymptotic mean value of T n under both
      models, and only by this mean value!
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Consistency results


Between-model consistency

      A consequence of the above is that the asymptotic behaviour of the Bayes
      factor is driven by the asymptotic mean value of T n under both
      models, and only by this mean value!
      Indeed, if

          inf{|µ0 − µ2 (θ2 )|; θ2 ∈ Θ2 } = inf{|µ0 − µ1 (θ1 )|; θ1 ∈ Θ1 } = 0

       then

                     Cl vn^(−(d1 −d2 )) ≤ m1 (T^n ) / m2 (T^n ) ≤ Cu vn^(−(d1 −d2 )) ,

      where Cl , Cu = O_{Pn} (1), irrespective of the true model.
      c Only depends on the difference d1 − d2
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Consistency results


Between-model consistency


      A consequence of the above is that the asymptotic behaviour of the Bayes
      factor is driven by the asymptotic mean value of T n under both
      models, and only by this mean value!
      Else, if

          inf{|µ0 − µ2 (θ2 )|; θ2 ∈ Θ2 } > inf{|µ0 − µ1 (θ1 )|; θ1 ∈ Θ1 } = 0

       then
                     m1 (T^n ) / m2 (T^n ) ≥ Cu min( vn^(−(d1 −α2 )) , vn^(−(d1 −τ2 )) ) ,
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Consistency results


Consistency theorem



      If

           inf{|µ0 − µ2 (θ2 )|; θ2 ∈ Θ2 } = inf{|µ0 − µ1 (θ1 )|; θ1 ∈ Θ1 } = 0,

      Bayes factor
                                   B12^T = O( vn^(−(d1 −d2 )) )

      irrespective of the true model. It is consistent iff Pn is within the
      model with the smallest dimension
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Consistency results


Consistency theorem



      If Pn belongs to one of the two models and if µ0 cannot be
      attained by the other one :

                     0 = min ( inf{|µ0 − µi (θi )|; θi ∈ Θi }, i = 1, 2 )
                       < max ( inf{|µ0 − µi (θi )|; θi ∈ Θi }, i = 1, 2 ) ,

      then the Bayes factor B12^T is consistent
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Summary statistics


Consequences on summary statistics



      Bayes factor driven by the means µi (θi ) and the relative position of
      µ0 wrt both sets {µi (θi ); θi ∈ Θi }, i = 1, 2.
      For ABC, this implies the most likely statistics T n are ancillary
      statistics with different mean values under both models
      Else, if T n asymptotically depends on some of the parameters of
      the models, it is quite likely that there exists θi ∈ Θi such that
      µi (θi ) = µ0 even though model M1 is misspecified
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Summary statistics


Toy example: Laplace versus Gauss [1]


      If
              T^n = n^(−1) Σ_{i=1}^n Xi^4 ,   µ1 (θ) = 3 + θ^4 + 6θ^2 ,   µ2 (θ) = 6 + · · ·

      and the true distribution is Laplace with mean θ0 = 1; under the
      Gaussian model the value θ∗ = 2√3 − 3 leads to µ0 = µ(θ∗ )
                                                  [here d1 = d2 = d = 1]
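      As a quick numerical check of the two mean maps (not part of the slides; sample
      sizes and seed are arbitrary), E[X^4] = 3 + 6θ^2 + θ^4 under N (θ, 1) and
      E[X^4] = 6 + 6θ^2 + θ^4 under the Laplace with mean θ and variance one, e.g. in Python:

import numpy as np

def mu1(theta):
    # fourth moment under N(theta, 1): 3 + 6 theta^2 + theta^4
    return 3 + 6 * theta ** 2 + theta ** 4

def mu2(theta):
    # fourth moment under Laplace(theta, scale 1/sqrt(2)), i.e. variance one
    return 6 + 6 * theta ** 2 + theta ** 4

rng = np.random.default_rng(1)
theta = 1.0
x_norm = rng.normal(theta, 1.0, size=10 ** 6)
x_lap = rng.laplace(theta, 1.0 / np.sqrt(2.0), size=10 ** 6)
print(np.mean(x_norm ** 4), mu1(theta))   # both close to 10
print(np.mean(x_lap ** 4), mu2(theta))    # both close to 13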
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Summary statistics


Toy example: Laplace versus Gauss [1]


      If
              T^n = n^(−1) Σ_{i=1}^n Xi^4 ,   µ1 (θ) = 3 + θ^4 + 6θ^2 ,   µ2 (θ) = 6 + · · ·

      and the true distribution is Laplace with mean θ0 = 1; under the
      Gaussian model the value θ∗ = 2√3 − 3 leads to µ0 = µ(θ∗ )
                                                  [here d1 = d2 = d = 1]
      c a Bayes factor associated with such a statistic is inconsistent
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Summary statistics


Toy example: Laplace versus Gauss [1]
      If
              T^n = n^(−1) Σ_{i=1}^n Xi^4 ,   µ1 (θ) = 3 + θ^4 + 6θ^2 ,   µ2 (θ) = 6 + · · ·
      [Figure: density of the ABC posterior probabilities based on the fourth moment
      (x-axis: probability)]
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Summary statistics


Toy example: Laplace versus Gauss [1]
      If
              T^n = n^(−1) Σ_{i=1}^n Xi^4 ,   µ1 (θ) = 3 + θ^4 + 6θ^2 ,   µ2 (θ) = 6 + · · ·
      [Figure: boxplots of the posterior probabilities under M1 and M2, n = 1000]
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Summary statistics


Toy example: Laplace versus Gauss [1]

      If
              T^n = n^(−1) Σ_{i=1}^n Xi^4 ,   µ1 (θ) = 3 + θ^4 + 6θ^2 ,   µ2 (θ) = 6 + · · ·

      Caption: Comparison of the distributions of the posterior
      probabilities that the data is from a normal model (as opposed to a
      Laplace model) with unknown mean when the data is made of
      n = 1000 observations either from a normal (M1) or Laplace (M2)
      distribution with mean one and when the summary statistic in the
      ABC algorithm is restricted to the empirical fourth moment. The
      ABC algorithm uses proposals from the prior N (0, 4) and selects
      the tolerance as the 1% distance quantile.
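      A minimal sketch of this experiment (not the authors' code; names are ad hoc and a
      single dataset is processed rather than the full replication study):

import numpy as np

rng = np.random.default_rng(2)

def stat(x):
    # summary statistic: empirical fourth moment
    return np.mean(x ** 4)

def abc_model_choice(y, n_prop=10_000, tol_quantile=0.01):
    n, s_obs = y.size, stat(y)
    models = rng.integers(0, 2, size=n_prop)        # 0 = normal (M1), 1 = Laplace (M2)
    thetas = rng.normal(0.0, 2.0, size=n_prop)      # prior N(0, 4) on the mean
    dist = np.empty(n_prop)
    for t in range(n_prop):
        if models[t] == 0:
            z = rng.normal(thetas[t], 1.0, size=n)
        else:
            z = rng.laplace(thetas[t], 1.0 / np.sqrt(2.0), size=n)
        dist[t] = abs(stat(z) - s_obs)
    eps = np.quantile(dist, tol_quantile)           # tolerance = 1% distance quantile
    kept = models[dist <= eps]
    return np.mean(kept == 0)                       # ABC estimate of P(M = M1 | y)

y = rng.normal(1.0, 1.0, size=1000)                 # data from M1 with mean one
print(abc_model_choice(y))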
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Summary statistics


Toy example: Laplace versus Gauss [0]

      When
                            T (y) = ( ȳn^(4) , ȳn^(6) )

      and the true distribution is Laplace with mean θ0 = 0, then
      µ0 = 6, µ1 (θ1∗ ) = 6 with θ1∗ = 2√3 − 3
                                                  [d1 = 1 and d2 = 1/2]
      thus
                            B12 ∼ n^(−1/4) → 0 :   consistent
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Summary statistics


Toy example: Laplace versus Gauss [0]

      When
                            T (y) = ( ȳn^(4) , ȳn^(6) )

      and the true distribution is Laplace with mean θ0 = 0, then
      µ0 = 6, µ1 (θ1∗ ) = 6 with θ1∗ = 2√3 − 3
                                                  [d1 = 1 and d2 = 1/2]
      thus
                            B12 ∼ n^(−1/4) → 0 :   consistent

      Under the Gaussian model, µ0 = 3 while µ2 (θ2 ) ≥ 6 > 3 ∀ θ2 , hence

                            B12 → +∞ :   consistent
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Summary statistics


Toy example: Laplace versus Gauss [0]
      When
                            T (y) = ( ȳn^(4) , ȳn^(6) )

      [Figure: density of the ABC posterior probabilities based on the fourth and
      sixth moments (x-axis: posterior probabilities)]
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Summary statistics


Embedded models




      When M1 submodel of M2 , and if the true distribution belongs to
      the smaller model M1 , Bayes factor is of order

                                        vn^(−(d1 −d2 ))
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Summary statistics


Embedded models



      When M1 submodel of M2 , and if the true distribution belongs to
      the smaller model M1 , Bayes factor is of order

                                        vn^(−(d1 −d2 ))


      If summary statistic only informative on a parameter that is the
      same under both models, i.e. if d1 = d2 , then
       c the Bayes factor is not consistent
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Summary statistics


Embedded models



      When M1 submodel of M2 , and if the true distribution belongs to
      the smaller model M1 , Bayes factor is of order

                                        vn^(−(d1 −d2 ))


      Else, d1 < d2 and Bayes factor is consistent under M1 . If true
      distribution not in M1 , then
       c Bayes factor is consistent only if µ1 = µ2 = µ0
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Summary statistics


Another toy example: Quantile distribution




      Q(p; A, B, g, k) = A + B [ 1 + (1 − exp{−g z(p)}) / (1 + exp{−g z(p)}) ] (1 + z(p)^2 )^k z(p)

      A, B, g and k: location, scale, skewness and kurtosis parameters
      Embedded models:
              M1 : g = 0 and k ∼ U[−1/2, 5]
              M2 : g ∼ U[0, 4] and k ∼ U[−1/2, 5].
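      A minimal simulation sketch (assumptions: z(p) denotes the standard normal
      p-quantile, the usual g-and-k convention, and A, B are fixed to 0 and 1 purely
      for illustration since the slide only specifies priors on g and k):

import numpy as np

def gk_quantile(z, A, B, g, k):
    # Q(p; A, B, g, k) evaluated at z = z(p), with the formula exactly as on the slide
    return A + B * (1 + (1 - np.exp(-g * z)) / (1 + np.exp(-g * z))) * (1 + z ** 2) ** k * z

def gk_sample(n, A, B, g, k, rng):
    # inverse-transform sampling: z(U) ~ N(0, 1) when U ~ U(0, 1)
    return gk_quantile(rng.standard_normal(n), A, B, g, k)

rng = np.random.default_rng(3)
x1 = gk_sample(1000, A=0.0, B=1.0, g=0.0, k=0.5, rng=rng)   # one draw from M1 (g = 0)
x2 = gk_sample(1000, A=0.0, B=1.0, g=2.0, k=0.5, rng=rng)   # one draw from M2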
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Summary statistics


Consistency [or not]


      [Figure: boxplots of the ABC posterior probabilities of M1, under data from M1 and
      from M2, for three sample sizes and three sets of empirical-quantile summary
      statistics; see the caption on the next slide]
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Summary statistics


Consistency [or not]


      Caption: Comparison of the distributions of the posterior
      probabilities that the data is from model M1 when the data is
      made of 100 observations (left column), 1000 observations (central
      column) and 10,000 observations (right column) either from M1
      (M1) or M2 (M2) when the summary statistics in the ABC
      algorithm are made of the empirical quantile at level 10% (first
      row), the empirical quantiles at levels 10% and 90% (second row),
      and the empirical quantiles at levels 10%, 40%, 60% and 90%
      (third row), respectively. The boxplots rely on 100 replicas and the
      ABC algorithms are based on 10^4 proposals from the prior, with
      the tolerance being chosen as the 1% quantile on the distances.
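      A minimal sketch of one cell of this experiment (not the authors' code; A and B are
      fixed for illustration and only the four-quantile summary of the third row is used):

import numpy as np

rng = np.random.default_rng(4)
levels = np.array([0.1, 0.4, 0.6, 0.9])          # empirical quantile levels used as summaries

def gk_quantile(z, A, B, g, k):
    return A + B * (1 + (1 - np.exp(-g * z)) / (1 + np.exp(-g * z))) * (1 + z ** 2) ** k * z

def stat(x):
    return np.quantile(x, levels)

def abc_gk_choice(y, n_prop=10_000, tol_quantile=0.01):
    n, s_obs = y.size, stat(y)
    models = rng.integers(0, 2, size=n_prop)     # 0 = M1 (g = 0), 1 = M2
    dist = np.empty(n_prop)
    for t in range(n_prop):
        g = 0.0 if models[t] == 0 else rng.uniform(0.0, 4.0)
        k = rng.uniform(-0.5, 5.0)
        z = gk_quantile(rng.standard_normal(n), 0.0, 1.0, g, k)
        dist[t] = np.linalg.norm(stat(z) - s_obs)
    eps = np.quantile(dist, tol_quantile)        # tolerance = 1% quantile of the distances
    return np.mean(models[dist <= eps] == 0)     # ABC estimate of P(M = M1 | y)

y = gk_quantile(rng.standard_normal(1000), 0.0, 1.0, 0.0, 2.0)   # data from M1
print(abc_gk_choice(y))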
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Conclusions


Conclusions




      • Model selection feasible with ABC
      • Choice of summary statistics is paramount
      • At best, ABC → π(. | T (y)) which concentrates around µ0
ABC Methods for Bayesian Model Choice
  Model choice consistency
     Conclusions


Conclusions




      • Model selection feasible with ABC
      • Choice of summary statistics is paramount
      • At best, ABC → π(. | T (y)) which concentrates around µ0
      • For estimation: {θ; µ(θ) = µ0 } = {θ0 }
      • For testing {µ1 (θ1 ), θ1 ∈ Θ1 } ∩ {µ2 (θ2 ), θ2 ∈ Θ2 } = ∅

More Related Content

What's hot

Convergence of ABC methods
Convergence of ABC methodsConvergence of ABC methods
Convergence of ABC methodsChristian Robert
 
random forests for ABC model choice and parameter estimation
random forests for ABC model choice and parameter estimationrandom forests for ABC model choice and parameter estimation
random forests for ABC model choice and parameter estimationChristian Robert
 
ABC short course: survey chapter
ABC short course: survey chapterABC short course: survey chapter
ABC short course: survey chapterChristian Robert
 
Approximate Bayesian model choice via random forests
Approximate Bayesian model choice via random forestsApproximate Bayesian model choice via random forests
Approximate Bayesian model choice via random forestsChristian Robert
 
ABC short course: model choice chapter
ABC short course: model choice chapterABC short course: model choice chapter
ABC short course: model choice chapterChristian Robert
 
Bayesian inference on mixtures
Bayesian inference on mixturesBayesian inference on mixtures
Bayesian inference on mixturesChristian Robert
 
Inference in generative models using the Wasserstein distance [[INI]
Inference in generative models using the Wasserstein distance [[INI]Inference in generative models using the Wasserstein distance [[INI]
Inference in generative models using the Wasserstein distance [[INI]Christian Robert
 
ABC short course: introduction chapters
ABC short course: introduction chaptersABC short course: introduction chapters
ABC short course: introduction chaptersChristian Robert
 
ABC short course: final chapters
ABC short course: final chaptersABC short course: final chapters
ABC short course: final chaptersChristian Robert
 
Multiple estimators for Monte Carlo approximations
Multiple estimators for Monte Carlo approximationsMultiple estimators for Monte Carlo approximations
Multiple estimators for Monte Carlo approximationsChristian Robert
 
An overview of Bayesian testing
An overview of Bayesian testingAn overview of Bayesian testing
An overview of Bayesian testingChristian Robert
 
Monte Carlo in Montréal 2017
Monte Carlo in Montréal 2017Monte Carlo in Montréal 2017
Monte Carlo in Montréal 2017Christian Robert
 
Coordinate sampler : A non-reversible Gibbs-like sampler
Coordinate sampler : A non-reversible Gibbs-like samplerCoordinate sampler : A non-reversible Gibbs-like sampler
Coordinate sampler : A non-reversible Gibbs-like samplerChristian Robert
 
Considerate Approaches to ABC Model Selection
Considerate Approaches to ABC Model SelectionConsiderate Approaches to ABC Model Selection
Considerate Approaches to ABC Model SelectionMichael Stumpf
 
Reliable ABC model choice via random forests
Reliable ABC model choice via random forestsReliable ABC model choice via random forests
Reliable ABC model choice via random forestsChristian Robert
 
from model uncertainty to ABC
from model uncertainty to ABCfrom model uncertainty to ABC
from model uncertainty to ABCChristian Robert
 
Approximating Bayes Factors
Approximating Bayes FactorsApproximating Bayes Factors
Approximating Bayes FactorsChristian Robert
 

What's hot (20)

Convergence of ABC methods
Convergence of ABC methodsConvergence of ABC methods
Convergence of ABC methods
 
random forests for ABC model choice and parameter estimation
random forests for ABC model choice and parameter estimationrandom forests for ABC model choice and parameter estimation
random forests for ABC model choice and parameter estimation
 
ABC short course: survey chapter
ABC short course: survey chapterABC short course: survey chapter
ABC short course: survey chapter
 
Approximate Bayesian model choice via random forests
Approximate Bayesian model choice via random forestsApproximate Bayesian model choice via random forests
Approximate Bayesian model choice via random forests
 
ABC workshop: 17w5025
ABC workshop: 17w5025ABC workshop: 17w5025
ABC workshop: 17w5025
 
ABC short course: model choice chapter
ABC short course: model choice chapterABC short course: model choice chapter
ABC short course: model choice chapter
 
Bayesian inference on mixtures
Bayesian inference on mixturesBayesian inference on mixtures
Bayesian inference on mixtures
 
ABC-Gibbs
ABC-GibbsABC-Gibbs
ABC-Gibbs
 
Inference in generative models using the Wasserstein distance [[INI]
Inference in generative models using the Wasserstein distance [[INI]Inference in generative models using the Wasserstein distance [[INI]
Inference in generative models using the Wasserstein distance [[INI]
 
ABC short course: introduction chapters
ABC short course: introduction chaptersABC short course: introduction chapters
ABC short course: introduction chapters
 
ABC-Gibbs
ABC-GibbsABC-Gibbs
ABC-Gibbs
 
ABC short course: final chapters
ABC short course: final chaptersABC short course: final chapters
ABC short course: final chapters
 
Multiple estimators for Monte Carlo approximations
Multiple estimators for Monte Carlo approximationsMultiple estimators for Monte Carlo approximations
Multiple estimators for Monte Carlo approximations
 
An overview of Bayesian testing
An overview of Bayesian testingAn overview of Bayesian testing
An overview of Bayesian testing
 
Monte Carlo in Montréal 2017
Monte Carlo in Montréal 2017Monte Carlo in Montréal 2017
Monte Carlo in Montréal 2017
 
Coordinate sampler : A non-reversible Gibbs-like sampler
Coordinate sampler : A non-reversible Gibbs-like samplerCoordinate sampler : A non-reversible Gibbs-like sampler
Coordinate sampler : A non-reversible Gibbs-like sampler
 
Considerate Approaches to ABC Model Selection
Considerate Approaches to ABC Model SelectionConsiderate Approaches to ABC Model Selection
Considerate Approaches to ABC Model Selection
 
Reliable ABC model choice via random forests
Reliable ABC model choice via random forestsReliable ABC model choice via random forests
Reliable ABC model choice via random forests
 
from model uncertainty to ABC
from model uncertainty to ABCfrom model uncertainty to ABC
from model uncertainty to ABC
 
Approximating Bayes Factors
Approximating Bayes FactorsApproximating Bayes Factors
Approximating Bayes Factors
 

Similar to ABC model choice

Discussion of Fearnhead and Prangle, RSS&lt; Dec. 14, 2011
Discussion of Fearnhead and Prangle, RSS&lt; Dec. 14, 2011Discussion of Fearnhead and Prangle, RSS&lt; Dec. 14, 2011
Discussion of Fearnhead and Prangle, RSS&lt; Dec. 14, 2011Christian Robert
 
4th joint Warwick Oxford Statistics Seminar
4th joint Warwick Oxford Statistics Seminar4th joint Warwick Oxford Statistics Seminar
4th joint Warwick Oxford Statistics SeminarChristian Robert
 
Semi-automatic ABC: a discussion
Semi-automatic ABC: a discussionSemi-automatic ABC: a discussion
Semi-automatic ABC: a discussionChristian Robert
 
Yes III: Computational methods for model choice
Yes III: Computational methods for model choiceYes III: Computational methods for model choice
Yes III: Computational methods for model choiceChristian Robert
 
Likelihood free computational statistics
Likelihood free computational statisticsLikelihood free computational statistics
Likelihood free computational statisticsPierre Pudlo
 
NBBC15, Reyjavik, June 08, 2015
NBBC15, Reyjavik, June 08, 2015NBBC15, Reyjavik, June 08, 2015
NBBC15, Reyjavik, June 08, 2015Christian Robert
 
(Approximate) Bayesian computation as a new empirical Bayes (something)?
(Approximate) Bayesian computation as a new empirical Bayes (something)?(Approximate) Bayesian computation as a new empirical Bayes (something)?
(Approximate) Bayesian computation as a new empirical Bayes (something)?Christian Robert
 
Laplace's Demon: seminar #1
Laplace's Demon: seminar #1Laplace's Demon: seminar #1
Laplace's Demon: seminar #1Christian Robert
 
Is ABC a new empirical Bayes approach?
Is ABC a new empirical Bayes approach?Is ABC a new empirical Bayes approach?
Is ABC a new empirical Bayes approach?Christian Robert
 
Approximate Bayesian computation for the Ising/Potts model
Approximate Bayesian computation for the Ising/Potts modelApproximate Bayesian computation for the Ising/Potts model
Approximate Bayesian computation for the Ising/Potts modelMatt Moores
 
3rd NIPS Workshop on PROBABILISTIC PROGRAMMING
3rd NIPS Workshop on PROBABILISTIC PROGRAMMING3rd NIPS Workshop on PROBABILISTIC PROGRAMMING
3rd NIPS Workshop on PROBABILISTIC PROGRAMMINGChristian Robert
 
45th SIS Meeting, Padova, Italy
45th SIS Meeting, Padova, Italy45th SIS Meeting, Padova, Italy
45th SIS Meeting, Padova, ItalyChristian Robert
 

Similar to ABC model choice (20)

Jsm09 talk
Jsm09 talkJsm09 talk
Jsm09 talk
 
Boston talk
Boston talkBoston talk
Boston talk
 
Discussion of Fearnhead and Prangle, RSS&lt; Dec. 14, 2011
Discussion of Fearnhead and Prangle, RSS&lt; Dec. 14, 2011Discussion of Fearnhead and Prangle, RSS&lt; Dec. 14, 2011
Discussion of Fearnhead and Prangle, RSS&lt; Dec. 14, 2011
 
4th joint Warwick Oxford Statistics Seminar
4th joint Warwick Oxford Statistics Seminar4th joint Warwick Oxford Statistics Seminar
4th joint Warwick Oxford Statistics Seminar
 
Intro to ABC
Intro to ABCIntro to ABC
Intro to ABC
 
Semi-automatic ABC: a discussion
Semi-automatic ABC: a discussionSemi-automatic ABC: a discussion
Semi-automatic ABC: a discussion
 
Yes III: Computational methods for model choice
Yes III: Computational methods for model choiceYes III: Computational methods for model choice
Yes III: Computational methods for model choice
 
MaxEnt 2009 talk
MaxEnt 2009 talkMaxEnt 2009 talk
MaxEnt 2009 talk
 
Desy
DesyDesy
Desy
 
Intractable likelihoods
Intractable likelihoodsIntractable likelihoods
Intractable likelihoods
 
Likelihood free computational statistics
Likelihood free computational statisticsLikelihood free computational statistics
Likelihood free computational statistics
 
NBBC15, Reyjavik, June 08, 2015
NBBC15, Reyjavik, June 08, 2015NBBC15, Reyjavik, June 08, 2015
NBBC15, Reyjavik, June 08, 2015
 
ABC-Gibbs
ABC-GibbsABC-Gibbs
ABC-Gibbs
 
(Approximate) Bayesian computation as a new empirical Bayes (something)?
(Approximate) Bayesian computation as a new empirical Bayes (something)?(Approximate) Bayesian computation as a new empirical Bayes (something)?
(Approximate) Bayesian computation as a new empirical Bayes (something)?
 
ABC in Varanasi
ABC in VaranasiABC in Varanasi
ABC in Varanasi
 
Laplace's Demon: seminar #1
Laplace's Demon: seminar #1Laplace's Demon: seminar #1
Laplace's Demon: seminar #1
 
Is ABC a new empirical Bayes approach?
Is ABC a new empirical Bayes approach?Is ABC a new empirical Bayes approach?
Is ABC a new empirical Bayes approach?
 
Approximate Bayesian computation for the Ising/Potts model
Approximate Bayesian computation for the Ising/Potts modelApproximate Bayesian computation for the Ising/Potts model
Approximate Bayesian computation for the Ising/Potts model
 
3rd NIPS Workshop on PROBABILISTIC PROGRAMMING
3rd NIPS Workshop on PROBABILISTIC PROGRAMMING3rd NIPS Workshop on PROBABILISTIC PROGRAMMING
3rd NIPS Workshop on PROBABILISTIC PROGRAMMING
 
45th SIS Meeting, Padova, Italy
45th SIS Meeting, Padova, Italy45th SIS Meeting, Padova, Italy
45th SIS Meeting, Padova, Italy
 

More from Christian Robert

Asymptotics of ABC, lecture, Collège de France
Asymptotics of ABC, lecture, Collège de FranceAsymptotics of ABC, lecture, Collège de France
Asymptotics of ABC, lecture, Collège de FranceChristian Robert
 
Workshop in honour of Don Poskitt and Gael Martin
Workshop in honour of Don Poskitt and Gael MartinWorkshop in honour of Don Poskitt and Gael Martin
Workshop in honour of Don Poskitt and Gael MartinChristian Robert
 
How many components in a mixture?
How many components in a mixture?How many components in a mixture?
How many components in a mixture?Christian Robert
 
Testing for mixtures at BNP 13
Testing for mixtures at BNP 13Testing for mixtures at BNP 13
Testing for mixtures at BNP 13Christian Robert
 
Inferring the number of components: dream or reality?
Inferring the number of components: dream or reality?Inferring the number of components: dream or reality?
Inferring the number of components: dream or reality?Christian Robert
 
Testing for mixtures by seeking components
Testing for mixtures by seeking componentsTesting for mixtures by seeking components
Testing for mixtures by seeking componentsChristian Robert
 
discussion on Bayesian restricted likelihood
discussion on Bayesian restricted likelihooddiscussion on Bayesian restricted likelihood
discussion on Bayesian restricted likelihoodChristian Robert
 
NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)Christian Robert
 
Likelihood-free Design: a discussion
Likelihood-free Design: a discussionLikelihood-free Design: a discussion
Likelihood-free Design: a discussionChristian Robert
 
CISEA 2019: ABC consistency and convergence
CISEA 2019: ABC consistency and convergenceCISEA 2019: ABC consistency and convergence
CISEA 2019: ABC consistency and convergenceChristian Robert
 
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment models
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment modelsa discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment models
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment modelsChristian Robert
 
ABC based on Wasserstein distances
ABC based on Wasserstein distancesABC based on Wasserstein distances
ABC based on Wasserstein distancesChristian Robert
 
Poster for Bayesian Statistics in the Big Data Era conference
Poster for Bayesian Statistics in the Big Data Era conferencePoster for Bayesian Statistics in the Big Data Era conference
Poster for Bayesian Statistics in the Big Data Era conferenceChristian Robert
 
short course at CIRM, Bayesian Masterclass, October 2018
short course at CIRM, Bayesian Masterclass, October 2018short course at CIRM, Bayesian Masterclass, October 2018
short course at CIRM, Bayesian Masterclass, October 2018Christian Robert
 

More from Christian Robert (20)

Asymptotics of ABC, lecture, Collège de France
Asymptotics of ABC, lecture, Collège de FranceAsymptotics of ABC, lecture, Collège de France
Asymptotics of ABC, lecture, Collège de France
 
Workshop in honour of Don Poskitt and Gael Martin
Workshop in honour of Don Poskitt and Gael MartinWorkshop in honour of Don Poskitt and Gael Martin
Workshop in honour of Don Poskitt and Gael Martin
 
discussion of ICML23.pdf
discussion of ICML23.pdfdiscussion of ICML23.pdf
discussion of ICML23.pdf
 
How many components in a mixture?
How many components in a mixture?How many components in a mixture?
How many components in a mixture?
 
restore.pdf
restore.pdfrestore.pdf
restore.pdf
 
Testing for mixtures at BNP 13
Testing for mixtures at BNP 13Testing for mixtures at BNP 13
Testing for mixtures at BNP 13
 
Inferring the number of components: dream or reality?
Inferring the number of components: dream or reality?Inferring the number of components: dream or reality?
Inferring the number of components: dream or reality?
 
CDT 22 slides.pdf
CDT 22 slides.pdfCDT 22 slides.pdf
CDT 22 slides.pdf
 
Testing for mixtures by seeking components
Testing for mixtures by seeking componentsTesting for mixtures by seeking components
Testing for mixtures by seeking components
 
discussion on Bayesian restricted likelihood
discussion on Bayesian restricted likelihooddiscussion on Bayesian restricted likelihood
discussion on Bayesian restricted likelihood
 
NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)
 
eugenics and statistics
eugenics and statisticseugenics and statistics
eugenics and statistics
 
asymptotics of ABC
asymptotics of ABCasymptotics of ABC
asymptotics of ABC
 
Likelihood-free Design: a discussion
Likelihood-free Design: a discussionLikelihood-free Design: a discussion
Likelihood-free Design: a discussion
 
the ABC of ABC
the ABC of ABCthe ABC of ABC
the ABC of ABC
 
CISEA 2019: ABC consistency and convergence
CISEA 2019: ABC consistency and convergenceCISEA 2019: ABC consistency and convergence
CISEA 2019: ABC consistency and convergence
 
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment models
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment modelsa discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment models
a discussion of Chib, Shin, and Simoni (2017-8) Bayesian moment models
 
ABC based on Wasserstein distances
ABC based on Wasserstein distancesABC based on Wasserstein distances
ABC based on Wasserstein distances
 
Poster for Bayesian Statistics in the Big Data Era conference
Poster for Bayesian Statistics in the Big Data Era conferencePoster for Bayesian Statistics in the Big Data Era conference
Poster for Bayesian Statistics in the Big Data Era conference
 
short course at CIRM, Bayesian Masterclass, October 2018
short course at CIRM, Bayesian Masterclass, October 2018short course at CIRM, Bayesian Masterclass, October 2018
short course at CIRM, Bayesian Masterclass, October 2018
 

Recently uploaded

Story boards and shot lists for my a level piece
Story boards and shot lists for my a level pieceStory boards and shot lists for my a level piece
Story boards and shot lists for my a level piececharlottematthew16
 
costume and set research powerpoint presentation
costume and set research powerpoint presentationcostume and set research powerpoint presentation
costume and set research powerpoint presentationphoebematthew05
 
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr LapshynFwdays
 
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 3652toLead Limited
 
WordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your BrandWordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your Brandgvaughan
 
Commit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyCommit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyAlfredo García Lavilla
 
Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Commit University
 
Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Enterprise Knowledge
 
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationSafe Software
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationRidwan Fadjar
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsMark Billinghurst
 
Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Mattias Andersson
 
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024BookNet Canada
 
"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr BaganFwdays
 
Unleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubUnleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubKalema Edgar
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):comworks
 
Artificial intelligence in cctv survelliance.pptx
Artificial intelligence in cctv survelliance.pptxArtificial intelligence in cctv survelliance.pptx
Artificial intelligence in cctv survelliance.pptxhariprasad279825
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupFlorian Wilhelm
 
Pigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions
 
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)Wonjun Hwang
 

Recently uploaded (20)

Story boards and shot lists for my a level piece
Story boards and shot lists for my a level pieceStory boards and shot lists for my a level piece
Story boards and shot lists for my a level piece
 
costume and set research powerpoint presentation
costume and set research powerpoint presentationcostume and set research powerpoint presentation
costume and set research powerpoint presentation
 
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
 
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
 
WordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your BrandWordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your Brand
 
Commit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyCommit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easy
 
Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!
 
Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024
 
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 Presentation
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?
 
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
 
"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan
 
Unleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubUnleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding Club
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):
 
Artificial intelligence in cctv survelliance.pptx
Artificial intelligence in cctv survelliance.pptxArtificial intelligence in cctv survelliance.pptx
Artificial intelligence in cctv survelliance.pptx
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project Setup
 
Pigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food Manufacturing
 
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
 

ABC model choice

  • 1. ABC Methods for Bayesian Model Choice ABC Methods for Bayesian Model Choice Christian P. Robert Universit´ Paris-Dauphine, IuF, & CREST e http://www.ceremade.dauphine.fr/~xian Joint work(s) with Jean-Marie Cornuet, Aude Grelaud, Jean-Michel Marin, Natesh Pillai, & Judith Rousseau
  • 2. ABC Methods for Bayesian Model Choice Approximate Bayesian computation Approximate Bayesian computation Approximate Bayesian computation ABC for model choice Gibbs random fields Generic ABC model choice Model choice consistency
  • 3. ABC Methods for Bayesian Model Choice Approximate Bayesian computation Regular Bayesian computation issues When faced with a non-standard posterior distribution π(θ|y) ∝ π(θ)L(θ|y) the standard solution is to use simulation (Monte Carlo) to produce a sample θ1 , . . . , θ T from π(θ|y) (or approximately by Markov chain Monte Carlo methods) [Robert & Casella, 2004]
  • 4. ABC Methods for Bayesian Model Choice Approximate Bayesian computation Untractable likelihoods Cases when the likelihood function f (y|θ) is unavailable and when the completion step f (y|θ) = f (y, z|θ) dz Z is impossible or too costly because of the dimension of z c MCMC cannot be implemented!
  • 5. ABC Methods for Bayesian Model Choice Approximate Bayesian computation Untractable likelihoods c MCMC cannot be implemented!
  • 6. ABC Methods for Bayesian Model Choice Approximate Bayesian computation The ABC method Bayesian setting: target is π(θ)f (x|θ)
  • 7. ABC Methods for Bayesian Model Choice Approximate Bayesian computation The ABC method Bayesian setting: target is π(θ)f (x|θ) When likelihood f (x|θ) not in closed form, likelihood-free rejection technique:
  • 8. ABC Methods for Bayesian Model Choice Approximate Bayesian computation The ABC method Bayesian setting: target is π(θ)f (x|θ) When likelihood f (x|θ) not in closed form, likelihood-free rejection technique: ABC algorithm For an observation y ∼ f (y|θ), under the prior π(θ), keep jointly simulating θ ∼ π(θ) , z ∼ f (z|θ ) , until the auxiliary variable z is equal to the observed value, z = y. [Rubin, 1984; Tavar´ et al., 1997] e
  • 9. ABC Methods for Bayesian Model Choice Approximate Bayesian computation A as approximative When y is a continuous random variable, equality z = y is replaced with a tolerance condition, (y, z) ≤ where is a distance
  • 10. ABC Methods for Bayesian Model Choice Approximate Bayesian computation A as approximative When y is a continuous random variable, equality z = y is replaced with a tolerance condition, (y, z) ≤ where is a distance Output distributed from π(θ) Pθ { (y, z) < } ∝ π(θ| (y, z) < )
  • 11. ABC Methods for Bayesian Model Choice Approximate Bayesian computation ABC algorithm Algorithm 1 Likelihood-free rejection sampler for i = 1 to N do repeat generate θ from the prior distribution π(·) generate z from the likelihood f (·|θ ) until ρ{η(z), η(y)} ≤ set θi = θ end for where η(y) defines a (maybe in-sufficient) statistic
  • 12. ABC Methods for Bayesian Model Choice Approximate Bayesian computation Output The likelihood-free algorithm samples from the marginal in z of: π(θ)f (z|θ)IA ,y (z) π (θ, z|y) = , A ,y ×Θ π(θ)f (z|θ)dzdθ where A ,y = {z ∈ D|ρ(η(z), η(y)) < }.
  • 13. ABC Methods for Bayesian Model Choice Approximate Bayesian computation Output The likelihood-free algorithm samples from the marginal in z of: π(θ)f (z|θ)IA ,y (z) π (θ, z|y) = , A ,y ×Θ π(θ)f (z|θ)dzdθ where A ,y = {z ∈ D|ρ(η(z), η(y)) < }. The idea behind ABC is that the summary statistics coupled with a small tolerance should provide a good approximation of the posterior distribution: π (θ|y) = π (θ, z|y)dz ≈ π(θ|η(y)) . [Not garanteed!]
  • 14. ABC Methods for Bayesian Model Choice ABC for model choice ABC for model choice Approximate Bayesian computation ABC for model choice Gibbs random fields Generic ABC model choice Model choice consistency
  • 15. ABC Methods for Bayesian Model Choice ABC for model choice Bayesian model choice Principle Several models M1 , M2 , . . . are considered simultaneously for dataset y and model index M central to inference. Use of a prior π(M = m), plus a prior distribution on the parameter conditional on the value m of the model index, πm (θ m ) Goal is to derive the posterior distribution of M, π(M = m|data) a challenging computational target when models are complex.
  • 16. ABC Methods for Bayesian Model Choice ABC for model choice Generic ABC for model choice Algorithm 2 Likelihood-free model choice sampler (ABC-MC) for t = 1 to T do repeat Generate m from the prior π(M = m) Generate θ m from the prior πm (θ m ) Generate z from the model fm (z|θ m ) until ρ{η(z), η(y)} < Set m(t) = m and θ (t) = θ m end for [Grelaud & al., 2009; Toni & al., 2009]
  • 17. ABC Methods for Bayesian Model Choice ABC for model choice ABC estimates Posterior probability π(M = m|y) approximated by the frequency of acceptances from model m T 1 Im(t) =m . T t=1 Early issues with implementation: should tolerances be the same for all models? should summary statistics vary across models? incl. their dimension? should the distance measure ρ vary across models?
  • 18. ABC Methods for Bayesian Model Choice ABC for model choice ABC estimates Posterior probability π(M = m|y) approximated by the frequency of acceptances from model m T 1 Im(t) =m . T t=1 Extension to a weighted polychotomous logistic regression estimate of π(M = m|y), with non-parametric kernel weights [Cornuet et al., DIYABC, 2009]
  • 19. ABC Methods for Bayesian Model Choice Gibbs random fields Potts model Potts model Distribution with an energy function of the form θS(y) = θ δyl =yi l∼i where l∼i denotes a neighbourhood structure
  • 20. ABC Methods for Bayesian Model Choice Gibbs random fields Potts model Potts model Distribution with an energy function of the form θS(y) = θ δyl =yi l∼i where l∼i denotes a neighbourhood structure In most realistic settings, summation Zθ = exp{θ T S(x)} x∈X involves too many terms to be manageable and numerical approximations cannot always be trusted
  • 21. ABC Methods for Bayesian Model Choice Gibbs random fields Neighbourhood relations Setup Choice to be made between M neighbourhood relations m i∼i (0 ≤ m ≤ M − 1) with Sm (x) = I{xi =xi } m i∼i driven by the posterior probabilities of the models.
  • 22. ABC Methods for Bayesian Model Choice Gibbs random fields Model index Computational target: P(M = m|x) ∝ fm (x|θm )πm (θm ) dθm π(M = m) Θm
  • 23. ABC Methods for Bayesian Model Choice Gibbs random fields Model index Computational target: P(M = m|x) ∝ fm (x|θm )πm (θm ) dθm π(M = m) Θm If S(x) sufficient statistic for the joint parameters (M, θ0 , . . . , θM −1 ), P(M = m|x) = P(M = m|S(x)) .
  • 24. ABC Methods for Bayesian Model Choice Gibbs random fields Sufficient statistics in Gibbs random fields
  • 25. ABC Methods for Bayesian Model Choice Gibbs random fields Sufficient statistics in Gibbs random fields Each model m has its own sufficient statistic Sm (·) and S(·) = (S0 (·), . . . , SM −1 (·)) is also (model-)sufficient.
  • 26. ABC Methods for Bayesian Model Choice Gibbs random fields Sufficient statistics in Gibbs random fields Each model m has its own sufficient statistic Sm (·) and S(·) = (S0 (·), . . . , SM −1 (·)) is also (model-)sufficient. Explanation: For Gibbs random fields, 1 2 x|M = m ∼ fm (x|θm ) = fm (x|S(x))fm (S(x)|θm ) 1 = f 2 (S(x)|θm ) n(S(x)) m where n(S(x)) = {˜ ∈ X : S(˜ ) = S(x)} x x c S(x) is sufficient for the joint parameters
  • 27. ABC Methods for Bayesian Model Choice Generic ABC model choice More about sufficiency ‘Sufficient statistics for individual models are unlikely to be very informative for the model probability. This is already well known and understood by the ABC-user community.’ [Scott Sisson, Jan. 31, 2011, ’Og]
  • 28. ABC Methods for Bayesian Model Choice Generic ABC model choice More about sufficiency ‘Sufficient statistics for individual models are unlikely to be very informative for the model probability. This is already well known and understood by the ABC-user community.’ [Scott Sisson, Jan. 31, 2011, ’Og] If η1 (x) sufficient statistic for model m = 1 and parameter θ1 and η2 (x) sufficient statistic for model m = 2 and parameter θ2 , (η1 (x), η2 (x)) is not always sufficient for (m, θm )
  • 29. ABC Methods for Bayesian Model Choice Generic ABC model choice More about sufficiency ‘Sufficient statistics for individual models are unlikely to be very informative for the model probability. This is already well known and understood by the ABC-user community.’ [Scott Sisson, Jan. 31, 2011, ’Og] If η1 (x) sufficient statistic for model m = 1 and parameter θ1 and η2 (x) sufficient statistic for model m = 2 and parameter θ2 , (η1 (x), η2 (x)) is not always sufficient for (m, θm ) c Potential loss of information at the testing level
• 30. ABC Methods for Bayesian Model Choice Generic ABC model choice Limiting behaviour of B12 (T → ∞)
ABC approximation
  B12(y) = [ Σ_{t=1}^T I{mt = 1} I{ρ(η(zt), η(y)) ≤ ε} ] / [ Σ_{t=1}^T I{mt = 2} I{ρ(η(zt), η(y)) ≤ ε} ] ,
where the (mt, zt)'s are simulated from the (joint) prior
• 31. ABC Methods for Bayesian Model Choice Generic ABC model choice Limiting behaviour of B12 (T → ∞)
ABC approximation
  B12(y) = [ Σ_{t=1}^T I{mt = 1} I{ρ(η(zt), η(y)) ≤ ε} ] / [ Σ_{t=1}^T I{mt = 2} I{ρ(η(zt), η(y)) ≤ ε} ] ,
where the (mt, zt)'s are simulated from the (joint) prior
As T goes to infinity, limit
  B12^ε(y) = ∫ I{ρ(η(z), η(y)) ≤ ε} π1(θ1) f1(z|θ1) dz dθ1 / ∫ I{ρ(η(z), η(y)) ≤ ε} π2(θ2) f2(z|θ2) dz dθ2
           = ∫ I{ρ(η, η(y)) ≤ ε} π1(θ1) f1^η(η|θ1) dη dθ1 / ∫ I{ρ(η, η(y)) ≤ ε} π2(θ2) f2^η(η|θ2) dη dθ2 ,
where f1^η(η|θ1) and f2^η(η|θ2) distributions of η(z)
• 32. ABC Methods for Bayesian Model Choice Generic ABC model choice Limiting behaviour of B12 (ε → 0)
When ε goes to zero,
  B12^η(y) = ∫ π1(θ1) f1^η(η(y)|θ1) dθ1 / ∫ π2(θ2) f2^η(η(y)|θ2) dθ2
• 33. ABC Methods for Bayesian Model Choice Generic ABC model choice Limiting behaviour of B12 (ε → 0)
When ε goes to zero,
  B12^η(y) = ∫ π1(θ1) f1^η(η(y)|θ1) dθ1 / ∫ π2(θ2) f2^η(η(y)|θ2) dθ2
c Bayes factor based on the sole observation of η(y)
• 34. ABC Methods for Bayesian Model Choice Generic ABC model choice Limiting behaviour of B12 (under sufficiency)
If η(y) sufficient statistic in both models,
  fi(y|θi) = gi(y) fi^η(η(y)|θi)
Thus
  B12(y) = ∫_{Θ1} π(θ1) g1(y) f1^η(η(y)|θ1) dθ1 / ∫_{Θ2} π(θ2) g2(y) f2^η(η(y)|θ2) dθ2
         = [g1(y)/g2(y)] ∫ π1(θ1) f1^η(η(y)|θ1) dθ1 / ∫ π2(θ2) f2^η(η(y)|θ2) dθ2
         = [g1(y)/g2(y)] B12^η(y) .
[Didelot, Everitt, Johansen & Lawson, 2011]
• 35. ABC Methods for Bayesian Model Choice Generic ABC model choice Limiting behaviour of B12 (under sufficiency)
If η(y) sufficient statistic in both models,
  fi(y|θi) = gi(y) fi^η(η(y)|θi)
Thus
  B12(y) = ∫_{Θ1} π(θ1) g1(y) f1^η(η(y)|θ1) dθ1 / ∫_{Θ2} π(θ2) g2(y) f2^η(η(y)|θ2) dθ2
         = [g1(y)/g2(y)] ∫ π1(θ1) f1^η(η(y)|θ1) dθ1 / ∫ π2(θ2) f2^η(η(y)|θ2) dθ2
         = [g1(y)/g2(y)] B12^η(y) .
[Didelot, Everitt, Johansen & Lawson, 2011]
c No discrepancy only when cross-model sufficiency
• 36. ABC Methods for Bayesian Model Choice Generic ABC model choice Poisson/geometric example
Sample x = (x1, . . . , xn) from either a Poisson P(λ) or from a geometric G(p)
Sum
  S = Σ_{i=1}^n xi = η(x)
sufficient statistic for either model but not simultaneously
Discrepancy ratio
  g1(x)/g2(x) = [ S! n^{−S} / Π_i xi! ] / [ 1 / C(n+S−1, S) ]
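A quick check of this discrepancy ratio, computed on the log scale for numerical stability; the helper names are ours.

```python
import numpy as np
from scipy.special import gammaln

def log_g1(x):
    """log g1(x) = log S! - S log n - sum_i log(x_i!)   (Poisson model)."""
    x = np.asarray(x)
    n, S = x.size, x.sum()
    return gammaln(S + 1) - S * np.log(n) - gammaln(x + 1).sum()

def log_g2(x):
    """log g2(x) = -log C(n+S-1, S)   (geometric model)."""
    x = np.asarray(x)
    n, S = x.size, x.sum()
    return -(gammaln(n + S) - gammaln(S + 1) - gammaln(n))

x = np.random.default_rng(1).poisson(2.0, size=50)
print("log discrepancy ratio log{g1(x)/g2(x)}:", log_g1(x) - log_g2(x))
```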
• 37. ABC Methods for Bayesian Model Choice Generic ABC model choice Poisson/geometric discrepancy
Range of B12^η(x) versus B12(x): the values produced have nothing in common.
• 38. ABC Methods for Bayesian Model Choice Generic ABC model choice Formal recovery
Creating an encompassing exponential family
  f(x|θ1, θ2, α1, α2) ∝ exp{θ1ᵀη1(x) + θ2ᵀη2(x) + α1 t1(x) + α2 t2(x)}
leads to a sufficient statistic (η1(x), η2(x), t1(x), t2(x))
[Didelot, Everitt, Johansen & Lawson, 2011]
• 39. ABC Methods for Bayesian Model Choice Generic ABC model choice Formal recovery
Creating an encompassing exponential family
  f(x|θ1, θ2, α1, α2) ∝ exp{θ1ᵀη1(x) + θ2ᵀη2(x) + α1 t1(x) + α2 t2(x)}
leads to a sufficient statistic (η1(x), η2(x), t1(x), t2(x))
[Didelot, Everitt, Johansen & Lawson, 2011]
In the Poisson/geometric case, if Π_i xi! is added to S, no discrepancy
• 40. ABC Methods for Bayesian Model Choice Generic ABC model choice Formal recovery
Creating an encompassing exponential family
  f(x|θ1, θ2, α1, α2) ∝ exp{θ1ᵀη1(x) + θ2ᵀη2(x) + α1 t1(x) + α2 t2(x)}
leads to a sufficient statistic (η1(x), η2(x), t1(x), t2(x))
[Didelot, Everitt, Johansen & Lawson, 2011]
Only applies in genuine sufficiency settings...
c Inability to evaluate loss brought by summary statistics
  • 41. ABC Methods for Bayesian Model Choice Generic ABC model choice Meaning of the ABC-Bayes factor ‘This is also why focus on model discrimination typically (...) proceeds by (...) accepting that the Bayes Factor that one obtains is only derived from the summary statistics and may in no way correspond to that of the full model.’ [Scott Sisson, Jan. 31, 2011, ’Og]
• 42. ABC Methods for Bayesian Model Choice Generic ABC model choice Meaning of the ABC-Bayes factor
‘This is also why focus on model discrimination typically (...) proceeds by (...) accepting that the Bayes Factor that one obtains is only derived from the summary statistics and may in no way correspond to that of the full model.’
[Scott Sisson, Jan. 31, 2011, ’Og]
In the Poisson/geometric case, if E[yi] = θ0 > 0,
  lim_{n→∞} B12^η(y) = [(θ0 + 1)² / θ0] e^{−θ0}
• 43. ABC Methods for Bayesian Model Choice Generic ABC model choice MA example
[Figure: barplots of model frequencies across four tolerance levels]
Evolution [against ε] of ABC Bayes factor, in terms of frequencies of visits to models MA(1) (left) and MA(2) (right) when ε equal to 10, 1, .1, .01% quantiles on insufficient autocovariance distances. Sample of 50 points from a MA(2) with θ1 = 0.6, θ2 = 0.2. True Bayes factor equal to 17.71.
• 44. ABC Methods for Bayesian Model Choice Generic ABC model choice MA example
[Figure: barplots of model frequencies across four tolerance levels]
Evolution [against ε] of ABC Bayes factor, in terms of frequencies of visits to models MA(1) (left) and MA(2) (right) when ε equal to 10, 1, .1, .01% quantiles on insufficient autocovariance distances. Sample of 50 points from a MA(1) model with θ1 = 0.6. True Bayes factor B21 equal to .004.
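A compact sketch of this MA(1)/MA(2) comparison; the uniform priors over the invertibility regions, the first two sample autocovariances as summaries and the quantile-based tolerance are our assumptions, chosen to match the captions above.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_ma(theta, n):
    """Simulate an MA(q) series y_t = e_t + theta_1 e_{t-1} + ... with N(0,1) noise."""
    q = len(theta)
    e = rng.normal(size=n + q)
    return e[q:] + sum(theta[j] * e[q - 1 - j: n + q - 1 - j] for j in range(q))

def autocov(y, lags=(1, 2)):
    """First two sample autocovariances, used as (insufficient) summaries."""
    return np.array([np.sum(y[l:] * y[:-l]) / len(y) for l in lags])

def prior_draw(m):
    if m == 1:                                     # MA(1): |theta1| < 1
        return np.array([rng.uniform(-1, 1)])
    while True:                                    # MA(2): invertibility triangle
        t1, t2 = rng.uniform(-2, 2), rng.uniform(-1, 1)
        if t1 + t2 > -1 and t1 - t2 < 1:
            return np.array([t1, t2])

def abc_ma_choice(y, T=20_000, quantile=0.01):
    s_obs = autocov(y)
    models = rng.integers(1, 3, size=T)            # equal prior weights on MA(1), MA(2)
    dists = np.empty(T)
    for i, m in enumerate(models):
        z = simulate_ma(prior_draw(m), len(y))
        dists[i] = np.linalg.norm(autocov(z) - s_obs)
    eps = np.quantile(dists, quantile)             # tolerance = distance quantile
    kept = models[dists <= eps]
    return np.mean(kept == 1), np.mean(kept == 2)  # frequencies of visits

y = simulate_ma(np.array([0.6, 0.2]), n=50)        # data from an MA(2)
print(abc_ma_choice(y))
```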
• 45. ABC Methods for Bayesian Model Choice Generic ABC model choice A population genetics evaluation
Population genetics example with
  3 populations
  2 scenarios
  15 individuals
  5 loci
  single mutation parameter
• 46. ABC Methods for Bayesian Model Choice Generic ABC model choice A population genetics evaluation
Population genetics example with
  3 populations
  2 scenarios
  15 individuals
  5 loci
  single mutation parameter
  24 summary statistics
  2 million ABC proposals
  importance [tree] sampling alternative
• 47. ABC Methods for Bayesian Model Choice Generic ABC model choice Stability of importance sampling
[Figure: boxplots of importance sampling estimates of the posterior probabilities, on a 0–1 scale]
• 48. ABC Methods for Bayesian Model Choice Generic ABC model choice Comparison with ABC
Use of 24 summary statistics and DIY-ABC logistic correction
[Figure: scatterplot of ABC direct and logistic estimates against importance sampling estimates]
• 49. ABC Methods for Bayesian Model Choice Generic ABC model choice Comparison with ABC
Use of 15 summary statistics and DIY-ABC logistic correction
[Figure: scatterplot of ABC direct estimates against importance sampling estimates]
• 50. ABC Methods for Bayesian Model Choice Generic ABC model choice Comparison with ABC
Use of 15 summary statistics and DIY-ABC logistic correction
[Figure: scatterplot of ABC direct and logistic estimates against importance sampling estimates]
  • 51. ABC Methods for Bayesian Model Choice Generic ABC model choice The only safe cases??? Besides specific models like Gibbs random fields, using distances over the data itself escapes the discrepancy... [Toni & Stumpf, 2010; Sousa & al., 2009]
  • 52. ABC Methods for Bayesian Model Choice Generic ABC model choice The only safe cases??? Besides specific models like Gibbs random fields, using distances over the data itself escapes the discrepancy... [Toni & Stumpf, 2010; Sousa & al., 2009] ...and so does the use of more informal model fitting measures [Ratmann & al., 2009]
  • 53. ABC Methods for Bayesian Model Choice Model choice consistency ABC model choice consistency Approximate Bayesian computation ABC for model choice Gibbs random fields Generic ABC model choice Model choice consistency
  • 54. ABC Methods for Bayesian Model Choice Model choice consistency Formalised framework The starting point Central question to the validation of ABC for model choice: When is a Bayes factor based on an insufficient statistic T (y) consistent?
• 55. ABC Methods for Bayesian Model Choice Model choice consistency Formalised framework
The starting point
Central question to the validation of ABC for model choice:
When is a Bayes factor based on an insufficient statistic T(y) consistent?
Note: the conclusion drawn on T(y) through B12^T(y) necessarily differs from the conclusion drawn on y through B12(y)
• 56. ABC Methods for Bayesian Model Choice Model choice consistency Formalised framework
A benchmark toy example
Comparison suggested by referee of PNAS paper [thanks]:
[X, Cornuet, Marin, & Pillai, Aug. 2011]
Model M1: y ∼ N(θ1, 1) opposed to model M2: y ∼ L(θ2, 1/√2), Laplace distribution with mean θ2 and scale parameter 1/√2 (variance one).
• 57. ABC Methods for Bayesian Model Choice Model choice consistency Formalised framework
A benchmark toy example
Comparison suggested by referee of PNAS paper [thanks]:
[X, Cornuet, Marin, & Pillai, Aug. 2011]
Model M1: y ∼ N(θ1, 1) opposed to model M2: y ∼ L(θ2, 1/√2), Laplace distribution with mean θ2 and scale parameter 1/√2 (variance one).
Four possible statistics
1. sample mean y (sufficient for M1 if not M2);
2. sample median med(y) (insufficient);
3. sample variance var(y) (ancillary);
4. median absolute deviation mad(y) = med(|y − med(y)|);
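The four candidate statistics written out in code (with the absolute value in the median absolute deviation that the slide layout drops); a plain helper, names ours.

```python
import numpy as np

def candidate_summaries(y):
    """Four candidate summary statistics for the Normal / Laplace comparison."""
    med = np.median(y)
    return {
        "mean":     np.mean(y),                  # sufficient under M1, not M2
        "median":   med,                         # insufficient
        "variance": np.var(y, ddof=1),           # ancillary: both models have variance one
        "mad":      np.median(np.abs(y - med)),  # median absolute deviation
    }
```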
• 58. ABC Methods for Bayesian Model Choice Model choice consistency Formalised framework
A benchmark toy example
Comparison suggested by referee of PNAS paper [thanks]:
[X, Cornuet, Marin, & Pillai, Aug. 2011]
Model M1: y ∼ N(θ1, 1) opposed to model M2: y ∼ L(θ2, 1/√2), Laplace distribution with mean θ2 and scale parameter 1/√2 (variance one).
[Figure: density of the posterior probability of M1]
• 59. ABC Methods for Bayesian Model Choice Model choice consistency Formalised framework
A benchmark toy example
Comparison suggested by referee of PNAS paper [thanks]:
[X, Cornuet, Marin, & Pillai, Aug. 2011]
Model M1: y ∼ N(θ1, 1) opposed to model M2: y ∼ L(θ2, 1/√2), Laplace distribution with mean θ2 and scale parameter 1/√2 (variance one).
[Figure: density plots of the posterior probability (left panel) and the probability (right panel)]
• 60. ABC Methods for Bayesian Model Choice Model choice consistency Formalised framework
Framework
Starting point: the observed sample y = (y1, . . . , yn), not necessarily iid, with true distribution y ∼ Pn
Summary statistics
  T(y) = Tn = (T1(y), T2(y), · · · , Td(y)) ∈ Rd
with true distribution Tn ∼ Gn.
• 61. ABC Methods for Bayesian Model Choice Model choice consistency Formalised framework
Framework
c Comparison of
– under M1, y ∼ F1,n(·|θ1) where θ1 ∈ Θ1 ⊂ R^{p1}
– under M2, y ∼ F2,n(·|θ2) where θ2 ∈ Θ2 ⊂ R^{p2}
turned into
– under M1, T(y) ∼ G1,n(·|θ1), and θ1|T(y) ∼ π1(·|Tn)
– under M2, T(y) ∼ G2,n(·|θ2), and θ2|T(y) ∼ π2(·|Tn)
• 62. ABC Methods for Bayesian Model Choice Model choice consistency Consistency results
Assumptions
A collection of asymptotic “standard” assumptions:
[A1] There exist a sequence {vn} converging to +∞, an a.c. distribution Q with continuous bounded density q(·), a symmetric, d × d positive definite matrix V0 and a vector µ0 ∈ Rd such that
  vn V0^{−1/2}(Tn − µ0) → Q in distribution, under Gn, as n → ∞,
and for all M > 0
  sup_{vn|t−µ0|<M} | |V0|^{1/2} vn^{−d} gn(t) − q(vn V0^{−1/2}{t − µ0}) | = o(1)
• 63. ABC Methods for Bayesian Model Choice Model choice consistency Consistency results
Assumptions
A collection of asymptotic “standard” assumptions:
[A2] For i = 1, 2, there exist d × d symmetric positive definite matrices Vi(θi) and µi(θi) ∈ Rd such that
  vn Vi(θi)^{−1/2}(Tn − µi(θi)) → Q in distribution, under Gi,n(·|θi), as n → ∞.
• 64. ABC Methods for Bayesian Model Choice Model choice consistency Consistency results
Assumptions
A collection of asymptotic “standard” assumptions:
[A3] For i = 1, 2, there exist sets Fn,i ⊂ Θi and constants εi, τi, αi > 0 such that for all τ > 0,
  sup_{θi∈Fn,i} Gi,n( |Tn − µi(θi)| > τ |µi(θi) − µ0| ∧ εi | θi ) ≲ vn^{−αi} (|µi(θi) − µ0| ∧ εi)^{−αi}
with πi(Fn,i^c) = o(vn^{−τi}).
• 65. ABC Methods for Bayesian Model Choice Model choice consistency Consistency results
Assumptions
A collection of asymptotic “standard” assumptions:
[A4] For u > 0, let
  Sn,i(u) = {θi ∈ Fn,i ; |µi(θi) − µ0| ≤ u vn^{−1}} .
If inf{|µi(θi) − µ0|; θi ∈ Θi} = 0, there exist constants di < τi ∧ αi − 1 such that
  πi(Sn,i(u)) ∼ u^{di} vn^{−di} ,  ∀ u ≲ vn
• 66. ABC Methods for Bayesian Model Choice Model choice consistency Consistency results
Assumptions
A collection of asymptotic “standard” assumptions:
[A5] If inf{|µi(θi) − µ0|; θi ∈ Θi} = 0, there exists U > 0 such that for any M > 0,
  sup_{vn|t−µ0|<M} sup_{θi∈Sn,i(U)} | |Vi(θi)|^{1/2} vn^{−d} gi(t|θi) − q(vn Vi(θi)^{−1/2}(t − µi(θi))) | = o(1)
and
  lim_{M→∞} lim sup_n  πi( Sn,i(U) ∩ {‖Vi(θi)^{−1}‖ + ‖Vi(θi)‖ > M} ) / πi(Sn,i(U)) = 0 .
• 67. ABC Methods for Bayesian Model Choice Model choice consistency Consistency results
Assumptions
A collection of asymptotic “standard” assumptions:
[A1]–[A2] are standard central limit theorems ([A1] redundant when one model is “true”)
[A3] controls the large deviations of the estimator Tn from the estimand µ(θ)
[A4] is the standard prior mass condition found in Bayesian asymptotics (di effective dimension of the parameter)
[A5] controls more tightly convergence esp. when µi is not one-to-one
• 68. ABC Methods for Bayesian Model Choice Model choice consistency Consistency results
Effective dimension
[A4] Understanding d1, d2: defined only when µ0 ∈ {µi(θi), θi ∈ Θi},
  πi(θi : |µi(θi) − µ0| < n^{−1/2}) = O(n^{−di/2})
is the effective dimension of the model Θi around µ0
• 69. ABC Methods for Bayesian Model Choice Model choice consistency Consistency results
Asymptotic marginals
Asymptotically, under [A1]–[A5],
  mi(t) = ∫_{Θi} gi(t|θi) πi(θi) dθi
is such that
(i) if inf{|µi(θi) − µ0|; θi ∈ Θi} = 0,
  Cl vn^{d−di} ≤ mi(Tn) ≤ Cu vn^{d−di}
and
(ii) if inf{|µi(θi) − µ0|; θi ∈ Θi} > 0,
  mi(Tn) = oPn[ vn^{d−τi} + vn^{d−αi} ].
  • 70. ABC Methods for Bayesian Model Choice Model choice consistency Consistency results Within-model consistency Under same assumptions, if inf{|µi (θi ) − µ0 |; θi ∈ Θi } = 0, the posterior distribution of µi (θi ) given T n is consistent at rate 1/vn provided αi ∧ τi > di .
• 71. ABC Methods for Bayesian Model Choice Model choice consistency Consistency results
Within-model consistency
Under same assumptions, if inf{|µi(θi) − µ0|; θi ∈ Θi} = 0, the posterior distribution of µi(θi) given Tn is consistent at rate 1/vn provided αi ∧ τi > di.
Note: di can truly be seen as an effective dimension of the model under the posterior πi(·|Tn), since if µ0 ∈ {µi(θi); θi ∈ Θi},
  mi(Tn) ∼ vn^{d−di}
  • 72. ABC Methods for Bayesian Model Choice Model choice consistency Consistency results Between-model consistency Consequence of above is that asymptotic behaviour of the Bayes factor is driven by the asymptotic mean value of T n under both models. And only by this mean value!
• 73. ABC Methods for Bayesian Model Choice Model choice consistency Consistency results
Between-model consistency
Consequence of above is that asymptotic behaviour of the Bayes factor is driven by the asymptotic mean value of Tn under both models. And only by this mean value!
Indeed, if
  inf{|µ0 − µ2(θ2)|; θ2 ∈ Θ2} = inf{|µ0 − µ1(θ1)|; θ1 ∈ Θ1} = 0
then
  Cl vn^{−(d1−d2)} ≤ m1(Tn)/m2(Tn) ≤ Cu vn^{−(d1−d2)} ,
where Cl, Cu = OPn(1), irrespective of the true model.
c Only depends on the difference d1 − d2
• 74. ABC Methods for Bayesian Model Choice Model choice consistency Consistency results
Between-model consistency
Consequence of above is that asymptotic behaviour of the Bayes factor is driven by the asymptotic mean value of Tn under both models. And only by this mean value!
Else, if
  inf{|µ0 − µ2(θ2)|; θ2 ∈ Θ2} > inf{|µ0 − µ1(θ1)|; θ1 ∈ Θ1} = 0
then
  m1(Tn)/m2(Tn) ≥ Cu min( vn^{−(d1−α2)} , vn^{−(d1−τ2)} )
• 75. ABC Methods for Bayesian Model Choice Model choice consistency Consistency results
Consistency theorem
If
  inf{|µ0 − µ2(θ2)|; θ2 ∈ Θ2} = inf{|µ0 − µ1(θ1)|; θ1 ∈ Θ1} = 0 ,
the Bayes factor
  B12^T = O(vn^{−(d1−d2)})
irrespective of the true model. It is consistent iff Pn is within the model with the smallest dimension
• 76. ABC Methods for Bayesian Model Choice Model choice consistency Consistency results
Consistency theorem
If Pn belongs to one of the two models and if µ0 cannot be attained by the other one:
  0 = min( inf{|µ0 − µi(θi)|; θi ∈ Θi}, i = 1, 2 ) < max( inf{|µ0 − µi(θi)|; θi ∈ Θi}, i = 1, 2 ) ,
then the Bayes factor B12^T is consistent
• 77. ABC Methods for Bayesian Model Choice Model choice consistency Summary statistics
Consequences on summary statistics
Bayes factor driven by the means µi(θi) and the relative position of µ0 wrt both sets {µi(θi); θi ∈ Θi}, i = 1, 2.
For ABC, this implies the most likely statistics Tn are ancillary statistics with different mean values under both models.
Else, if Tn asymptotically depends on some of the parameters of the models, it is quite likely that there exists θi ∈ Θi such that µi(θi) = µ0 even though model M1 is misspecified
• 78. ABC Methods for Bayesian Model Choice Model choice consistency Summary statistics
Toy example: Laplace versus Gauss [1]
If
  Tn = n^{−1} Σ_{i=1}^n Xi^4 ,  µ1(θ) = 3 + θ^4 + 6θ² ,  µ2(θ) = 6 + · · ·
and the true distribution is Laplace with mean θ0 = 1, under the Gaussian model the value θ∗ = 2√3 − 3 leads to µ0 = µ(θ∗)
[here d1 = d2 = d = 1]
• 79. ABC Methods for Bayesian Model Choice Model choice consistency Summary statistics
Toy example: Laplace versus Gauss [1]
If
  Tn = n^{−1} Σ_{i=1}^n Xi^4 ,  µ1(θ) = 3 + θ^4 + 6θ² ,  µ2(θ) = 6 + · · ·
and the true distribution is Laplace with mean θ0 = 1, under the Gaussian model the value θ∗ = 2√3 − 3 leads to µ0 = µ(θ∗)
[here d1 = d2 = d = 1]
c a Bayes factor associated with such a statistic is inconsistent
• 80. ABC Methods for Bayesian Model Choice Model choice consistency Summary statistics
Toy example: Laplace versus Gauss [1]
If
  Tn = n^{−1} Σ_{i=1}^n Xi^4 ,  µ1(θ) = 3 + θ^4 + 6θ² ,  µ2(θ) = 6 + · · ·
[Figure: density of probabilities; caption: Fourth moment]
• 81. ABC Methods for Bayesian Model Choice Model choice consistency Summary statistics
Toy example: Laplace versus Gauss [1]
If
  Tn = n^{−1} Σ_{i=1}^n Xi^4 ,  µ1(θ) = 3 + θ^4 + 6θ² ,  µ2(θ) = 6 + · · ·
[Figure: boxplots of posterior probabilities under M1 and M2, n = 1000]
• 82. ABC Methods for Bayesian Model Choice Model choice consistency Summary statistics
Toy example: Laplace versus Gauss [1]
If
  Tn = n^{−1} Σ_{i=1}^n Xi^4 ,  µ1(θ) = 3 + θ^4 + 6θ² ,  µ2(θ) = 6 + · · ·
Caption: Comparison of the distributions of the posterior probabilities that the data is from a normal model (as opposed to a Laplace model) with unknown mean when the data is made of n = 1000 observations either from a normal (M1) or Laplace (M2) distribution with mean one and when the summary statistic in the ABC algorithm is restricted to the empirical fourth moment. The ABC algorithm uses proposals from the prior N(0, 4) and selects the tolerance as the 1% distance quantile.
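A sketch reproducing one replica of the experiment described in the caption; the equal model prior weights and the Laplace scale 1/√2 (to keep variance one, as earlier in the slides) are assumptions on our part.

```python
import numpy as np

rng = np.random.default_rng(3)
n, T = 1000, 20_000

def simulate(m, theta, n):
    if m == 1:
        return rng.normal(theta, 1.0, size=n)                 # M1: N(theta, 1)
    return rng.laplace(theta, 1.0 / np.sqrt(2), size=n)       # M2: Laplace, variance one

def fourth_moment(y):
    return np.mean(y ** 4)

def abc_normal_vs_laplace(y):
    s_obs = fourth_moment(y)
    models = rng.integers(1, 3, size=T)                       # assumed equal prior weights
    thetas = rng.normal(0.0, 2.0, size=T)                     # proposals from the prior N(0, 4)
    dists = np.array([abs(fourth_moment(simulate(m, th, n)) - s_obs)
                      for m, th in zip(models, thetas)])
    eps = np.quantile(dists, 0.01)                            # tolerance = 1% distance quantile
    kept = models[dists <= eps]
    return np.mean(kept == 1)                                 # estimate of P(M1 | y)

y = rng.normal(1.0, 1.0, size=n)          # data from the normal model with mean one
print("estimated P(M1 | y):", abc_normal_vs_laplace(y))
```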
• 83. ABC Methods for Bayesian Model Choice Model choice consistency Summary statistics
Toy example: Laplace versus Gauss [0]
When
  T(y) = (ȳn^{(4)}, ȳn^{(6)})
and the true distribution is Laplace with mean θ0 = 0, then
  µ0 = 6 ,  µ1(θ1∗) = 6 with θ1∗² = 2√3 − 3
[d1 = 1 and d2 = 1/2]
thus
  B12 ∼ n^{−1/4} → 0 : consistent
• 84. ABC Methods for Bayesian Model Choice Model choice consistency Summary statistics
Toy example: Laplace versus Gauss [0]
When
  T(y) = (ȳn^{(4)}, ȳn^{(6)})
and the true distribution is Laplace with mean θ0 = 0, then
  µ0 = 6 ,  µ1(θ1∗) = 6 with θ1∗² = 2√3 − 3
[d1 = 1 and d2 = 1/2]
thus
  B12 ∼ n^{−1/4} → 0 : consistent
Under the Gaussian model,
  µ0 = 3 ,  µ2(θ2) ≥ 6 > 3 ∀θ2 ,
so B12 → +∞ : consistent
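A quick numerical check (ours) of the values used above, with µ1(θ) = 3 + θ⁴ + 6θ² the Gaussian fourth moment and 6 the Laplace(0, 1/√2) fourth moment.

```python
import numpy as np

theta_star = np.sqrt(2 * np.sqrt(3) - 3)        # theta_1*^2 = 2*sqrt(3) - 3
mu1 = 3 + theta_star ** 4 + 6 * theta_star ** 2  # Gaussian fourth moment at theta_1*
print(mu1)                                       # = 6, the Laplace(0) fourth moment

# Monte Carlo confirmation of the two population fourth moments
rng = np.random.default_rng(4)
print(np.mean(rng.normal(theta_star, 1, 10**6) ** 4))           # approx 6
print(np.mean(rng.laplace(0, 1 / np.sqrt(2), 10**6) ** 4))      # approx 6
```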
• 85. ABC Methods for Bayesian Model Choice Model choice consistency Summary statistics
Toy example: Laplace versus Gauss [0]
When
  T(y) = (ȳn^{(4)}, ȳn^{(6)})
[Figure: density of posterior probabilities; caption: Fourth and sixth moments]
• 86. ABC Methods for Bayesian Model Choice Model choice consistency Summary statistics
Embedded models
When M1 submodel of M2, and if the true distribution belongs to the smaller model M1, Bayes factor is of order
  vn^{−(d1−d2)}
• 87. ABC Methods for Bayesian Model Choice Model choice consistency Summary statistics
Embedded models
When M1 submodel of M2, and if the true distribution belongs to the smaller model M1, Bayes factor is of order
  vn^{−(d1−d2)}
If summary statistic only informative on a parameter that is the same under both models, i.e. if d1 = d2, then
c the Bayes factor is not consistent
• 88. ABC Methods for Bayesian Model Choice Model choice consistency Summary statistics
Embedded models
When M1 submodel of M2, and if the true distribution belongs to the smaller model M1, Bayes factor is of order
  vn^{−(d1−d2)}
Else, d1 < d2 and Bayes factor is consistent under M1. If true distribution not in M1, then
c Bayes factor is consistent only if µ1 = µ2 = µ0
• 89. ABC Methods for Bayesian Model Choice Model choice consistency Summary statistics
Another toy example: Quantile distribution
  Q(p; A, B, g, k) = A + B [ 1 + (1 − exp{−g z(p)}) / (1 + exp{−g z(p)}) ] (1 + z(p)²)^k z(p) ,
with z(p) the standard normal quantile at level p;
A, B, g and k, location, scale, skewness and kurtosis parameters
Embedded models:
  M1: g = 0 and k ∼ U[−1/2, 5]
  M2: g ∼ U[0, 4] and k ∼ U[−1/2, 5].
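A sketch of the quantile function and an inversion sampler for this family (the skewness term is sometimes written with an extra multiplicative constant; here it is coded as written on the slide), together with the empirical-quantile summaries used below.

```python
import numpy as np
from scipy.stats import norm

def gk_quantile(p, A, B, g, k):
    """Quantile function Q(p; A, B, g, k) as written on the slide."""
    z = norm.ppf(p)
    return A + B * (1 + (1 - np.exp(-g * z)) / (1 + np.exp(-g * z))) \
             * (1 + z ** 2) ** k * z

def gk_sample(n, A, B, g, k, rng):
    """Simulate by inversion: Q(U) with U ~ Uniform(0, 1)."""
    return gk_quantile(rng.uniform(size=n), A, B, g, k)

def quantile_summaries(y, levels=(0.10, 0.40, 0.60, 0.90)):
    """Empirical-quantile summary statistics used in the comparison below."""
    return np.quantile(y, levels)

rng = np.random.default_rng(5)
y = gk_sample(1000, A=0.0, B=1.0, g=0.0, k=0.5, rng=rng)   # an M1-type draw (g = 0)
print(quantile_summaries(y))
```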
• 90. ABC Methods for Bayesian Model Choice Model choice consistency Summary statistics
Consistency [or not]
[Figure: boxplots of posterior probabilities of M1 under M1 and M2, across sample sizes and summary-statistic choices; see the caption on the next slide]
• 91. ABC Methods for Bayesian Model Choice Model choice consistency Summary statistics
Consistency [or not]
Caption: Comparison of the distributions of the posterior probabilities that the data is from model M1 when the data is made of 100 observations (left column), 1000 observations (central column) and 10,000 observations (right column) either from M1 (M1) or M2 (M2) when the summary statistics in the ABC algorithm are made of the empirical quantile at level 10% (first row), the empirical quantiles at levels 10% and 90% (second row), and the empirical quantiles at levels 10%, 40%, 60% and 90% (third row), respectively. The boxplots rely on 100 replicas and the ABC algorithms are based on 10⁴ proposals from the prior, with the tolerance being chosen as the 1% quantile on the distances.
• 92. ABC Methods for Bayesian Model Choice Model choice consistency Conclusions
Conclusions
• Model selection feasible with ABC
• Choice of summary statistics is paramount
• At best, ABC → π(· | T(y)) which concentrates around µ0
• 93. ABC Methods for Bayesian Model Choice Model choice consistency Conclusions
Conclusions
• Model selection feasible with ABC
• Choice of summary statistics is paramount
• At best, ABC → π(· | T(y)) which concentrates around µ0
• For estimation: {θ; µ(θ) = µ0} = θ0
• For testing: {µ1(θ1), θ1 ∈ Θ1} ∩ {µ2(θ2), θ2 ∈ Θ2} = ∅