This document discusses approximate Bayesian computation (ABC) methods for Bayesian model choice. ABC methods allow Bayesian inference when the likelihood function is intractable or unavailable. The ABC algorithm samples parameter values from the prior that produce simulated data close to the observed data. This provides an approximation to the posterior distribution. ABC can also be used for model choice by sampling models from their prior and parameters from each model's prior. The frequency with which each model is sampled approximates the posterior model probabilities. The document provides an example of using ABC for choosing between neighborhood structures in Gibbs random field models.
1. ABC Methods for Bayesian Model Choice
Christian P. Robert
Université Paris-Dauphine, IUF, & CREST
http://www.ceremade.dauphine.fr/~xian
Computational Methods in Applied Sciences, Columbia University, New York, September 24, 2011
Joint work(s) with Jean-Marie Cornuet, Aude Grelaud, Jean-Michel Marin, Natesh Pillai, Judith Rousseau
2. ABC Methods for Bayesian Model Choice
Approximate Bayesian computation
ABC for model choice
Gibbs random fields
Generic ABC model choice
Model choice consistency
3. ABC Methods for Bayesian Model Choice
Approximate Bayesian computation
Regular Bayesian computation issues
When faced with a non-standard posterior distribution
π(θ|y) ∝ π(θ)L(θ|y)
the standard solution is to use simulation (Monte Carlo) to produce a sample
θ1, . . . , θT
from π(θ|y) (or approximately, by Markov chain Monte Carlo methods)
[Robert & Casella, 2004]
4. ABC Methods for Bayesian Model Choice
Approximate Bayesian computation
Intractable likelihoods
Cases when the likelihood function f(y|θ) is unavailable and when the completion step
f(y|θ) = ∫_Z f(y, z|θ) dz
is impossible or too costly because of the dimension of z
⇒ MCMC cannot be implemented!
8. ABC Methods for Bayesian Model Choice
Approximate Bayesian computation
The ABC method
Bayesian setting: target is π(θ)f (x|θ)
When likelihood f (x|θ) not in closed form, likelihood-free rejection
technique:
ABC algorithm
For an observation y ∼ f(y|θ), under the prior π(θ), keep jointly simulating
θ′ ∼ π(θ) , z ∼ f(z|θ′) ,
until the auxiliary variable z is equal to the observed value, z = y.
[Tavaré et al., 1997]
10. ABC Methods for Bayesian Model Choice
Approximate Bayesian computation
A as approximative
When y is a continuous random variable, equality z = y is replaced with a tolerance condition,
ρ(y, z) ≤ ε
where ρ is a distance
Output distributed from
π(θ) Pθ{ρ(y, z) < ε} ∝ π(θ | ρ(y, z) < ε)
11. ABC Methods for Bayesian Model Choice
Approximate Bayesian computation
ABC algorithm
Algorithm 1 Likelihood-free rejection sampler
  for i = 1 to N do
    repeat
      generate θ′ from the prior distribution π(·)
      generate z from the likelihood f(·|θ′)
    until ρ{η(z), η(y)} ≤ ε
    set θi = θ′
  end for
where η(y) defines a (possibly insufficient) statistic
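As a minimal sketch of the likelihood-free rejection sampler above, the following Python code accepts prior draws whose simulated summary statistic lands within ε of the observed one. The toy model (normal mean with a uniform prior and the sample mean as summary) is a hypothetical illustration, not an example from the talk.

```python
import numpy as np

def abc_rejection(y_obs, prior_sample, simulate, summary, distance, eps, n_samples):
    """Likelihood-free rejection sampler (Algorithm 1): keep proposing
    theta' ~ pi(.) and pseudo-data z ~ f(.|theta') until
    rho{eta(z), eta(y)} <= eps, for each of n_samples accepted draws."""
    eta_y = summary(y_obs)
    accepted = []
    while len(accepted) < n_samples:
        theta = prior_sample()      # theta' ~ pi(.)
        z = simulate(theta)         # z ~ f(.|theta')
        if distance(summary(z), eta_y) <= eps:
            accepted.append(theta)
    return np.array(accepted)

# Hypothetical toy model: y_i ~ N(theta, 1), theta ~ U(-5, 5),
# summary = sample mean (sufficient here).
rng = np.random.default_rng(0)
y = rng.normal(1.5, 1.0, size=100)
post = abc_rejection(
    y_obs=y,
    prior_sample=lambda: rng.uniform(-5, 5),
    simulate=lambda th: rng.normal(th, 1.0, size=100),
    summary=np.mean,
    distance=lambda a, b: abs(a - b),
    eps=0.05,
    n_samples=200,
)
print(post.mean())  # concentrates near the sample mean of y
```

With a sufficient statistic and small ε, the accepted θ's approximate the exact posterior; enlarging ε trades accuracy for acceptance rate.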
13. ABC Methods for Bayesian Model Choice
Approximate Bayesian computation
Output
The likelihood-free algorithm samples from the marginal in z of:
πε(θ, z|y) = π(θ) f(z|θ) I_{Aε,y}(z) / ∫_{Aε,y×Θ} π(θ) f(z|θ) dz dθ ,
where Aε,y = {z ∈ D : ρ(η(z), η(y)) < ε}.
The idea behind ABC is that the summary statistics coupled with a small tolerance should provide a good approximation of the posterior distribution:
πε(θ|y) = ∫ πε(θ, z|y) dz ≈ π(θ|y) .
14. ABC Methods for Bayesian Model Choice
Approximate Bayesian computation
MA example
Consider the MA(q) model
xt = εt + Σ_{i=1}^{q} ϑi εt−i
Simple prior: uniform prior over the identifiability zone, e.g. triangle for MA(2)
16. ABC Methods for Bayesian Model Choice
Approximate Bayesian computation
MA example (2)
ABC algorithm thus made of
1. picking a new value (ϑ1, ϑ2) in the triangle
2. generating an iid sequence (εt)_{−q<t≤T}
3. producing a simulated series (x′t)_{1≤t≤T}
Distance: basic distance between the series
ρ((xt)_{1≤t≤T}, (x′t)_{1≤t≤T}) = Σ_{t=1}^{T} (xt − x′t)²
or between summary statistics like the first q autocorrelations
τj = Σ_{t=j+1}^{T} xt xt−j
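The two candidate distances above can be sketched in Python as follows; the MA(2) simulator and the parameter values are illustrative stand-ins for the experiment in the slides.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 100

def simulate_ma2(theta1, theta2, T):
    # x_t = e_t + theta1*e_{t-1} + theta2*e_{t-2}, with iid N(0,1) noise
    e = rng.normal(size=T + 2)
    return e[2:] + theta1 * e[1:-1] + theta2 * e[:-2]

def raw_distance(x, y):
    # basic distance between the series: sum_t (x_t - y_t)^2
    return float(np.sum((x - y) ** 2))

def autocov_distance(x, y, q=2):
    # distance between the first q statistics tau_j = sum_{t>j} x_t x_{t-j}
    tx = np.array([np.sum(x[j:] * x[:len(x) - j]) for j in range(1, q + 1)])
    ty = np.array([np.sum(y[j:] * y[:len(y) - j]) for j in range(1, q + 1)])
    return float(np.sum((tx - ty) ** 2))

x_obs = simulate_ma2(0.6, 0.2, T)   # "observed" series
z = simulate_ma2(0.5, 0.1, T)       # one pseudo-data series
print(raw_distance(x_obs, z), autocov_distance(x_obs, z))
```

The raw distance compares noise realisations as much as parameters, while the autocovariance summaries discard the noise path; this is why the tolerance quantiles behave so differently for the two choices.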
17. ABC Methods for Bayesian Model Choice
Approximate Bayesian computation
Comparison of distance impact
[Figure: ABC posterior samples of θ1 and θ2 for an MA(2) model]
Evaluation of the tolerance on the ABC sample against both distances (ε = 100%, 10%, 1%, 0.1%) for an MA(2) model
20. ABC Methods for Bayesian Model Choice
ABC for model choice
Approximate Bayesian computation
ABC for model choice
Gibbs random fields
Generic ABC model choice
Model choice consistency
21. ABC Methods for Bayesian Model Choice
ABC for model choice
Bayesian model choice
Principle
Several models M1 , M2 , . . . are considered simultaneously for
dataset y and model index M central to inference.
Use of a prior π(M = m), plus a prior distribution on the
parameter conditional on the value m of the model index, πm (θ m )
Goal is to derive the posterior distribution of M, a challenging
computational target when models are complex.
22. ABC Methods for Bayesian Model Choice
ABC for model choice
Generic ABC for model choice
Algorithm 2 Likelihood-free model choice sampler (ABC-MC)
for t = 1 to T do
repeat
Generate m from the prior π(M = m)
Generate θ m from the prior πm (θ m )
Generate z from the model fm (z|θ m )
until ρ{η(z), η(y)} <
Set m(t) = m and θ (t) = θ m
end for
[Toni, Welch, Strelkowa, Ipsen & Stumpf, 2009]
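A minimal Python sketch of the ABC-MC sampler above, instantiated on a hypothetical Poisson-versus-geometric pair (the example revisited later in the talk), with the sum of the observations as summary statistic; the priors here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def abc_model_choice(eta_y, n, T, eps):
    """ABC-MC (Algorithm 2) for Poisson(lambda) vs geometric(p):
    sample m from the model prior, theta_m from its prior, z from the model,
    and keep m whenever |eta(z) - eta(y)| <= eps."""
    kept = []
    for _ in range(T):
        m = int(rng.integers(2))               # m ~ uniform model prior
        if m == 0:
            lam = rng.exponential(2.0)         # hypothetical prior on lambda
            z = rng.poisson(lam, size=n)
        else:
            p = rng.uniform(0.01, 1.0)         # hypothetical prior on p
            z = rng.geometric(p, size=n) - 1   # support {0, 1, ...}
        if abs(int(z.sum()) - eta_y) <= eps:
            kept.append(m)
    kept = np.array(kept)
    # acceptance frequencies estimate pi(M = m | y)
    return float((kept == 0).mean()), float((kept == 1).mean())

p0, p1 = abc_model_choice(eta_y=20, n=20, T=20000, eps=2)
print(p0, p1)
```

The returned frequencies are exactly the estimator discussed on the next slide; the choice of ε and of the summary statistic drives everything that follows.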
23. ABC Methods for Bayesian Model Choice
ABC for model choice
ABC estimates
Posterior probability π(M = m|y) approximated by the frequency of acceptances from model m:
(1/T) Σ_{t=1}^{T} I_{m(t) = m} .
Early issues with implementation:
should tolerances be the same for all models?
should summary statistics vary across models? incl. their
dimension?
should the distance measure ρ vary across models?
24. ABC Methods for Bayesian Model Choice
ABC for model choice
ABC estimates
Posterior probability π(M = m|y) approximated by the frequency of acceptances from model m:
(1/T) Σ_{t=1}^{T} I_{m(t) = m} .
Early issues with implementation:
ε then needs to become part of the model
Varying statistics incompatible with Bayesian model choice proper
ρ then part of the model
Extension to a weighted polychotomous logistic regression estimate
of π(M = m|y), with non-parametric kernel weights
[Cornuet et al., DIYABC, 2009]
25. ABC Methods for Bayesian Model Choice
ABC for model choice
The great ABC controversy
On-going controversy in phylogeographic genetics about the validity of using ABC for testing
Against: Templeton, 2008, 2009, 2010a, 2010b, 2010c, &tc, argues that nested hypotheses cannot have higher probabilities than nesting hypotheses (!)
Replies: Fagundes et al., 2008, Beaumont et al., 2010, Berger et al., 2010, Csillèry et al., 2010 point out that the criticisms are addressed at [Bayesian] model-based inference and have nothing to do with ABC...
28. ABC Methods for Bayesian Model Choice
Gibbs random fields
Potts model
Σ_{c∈C} Vc(y) is of the form
Σ_{c∈C} Vc(y) = θ S(y) = θ Σ_{l∼i} δ_{yl = yi}
where l∼i denotes a neighbourhood structure
In most realistic settings, the summation
Zθ = Σ_{x∈X} exp{θᵀ S(x)}
involves too many terms to be manageable and numerical approximations cannot always be trusted
[Cucala et al., 2009]
29. ABC Methods for Bayesian Model Choice
Gibbs random fields
Neighbourhood relations
Setup
Choice to be made between M neighbourhood relations
i ∼m i′ (0 ≤ m ≤ M − 1)
with
Sm(x) = Σ_{i ∼m i′} I{xi = xi′}
driven by the posterior probabilities of the models.
31. ABC Methods for Bayesian Model Choice
Gibbs random fields
Model index
Computational target:
P(M = m|x) ∝ ∫_{Θm} fm(x|θm) πm(θm) dθm π(M = m)
If S(x) sufficient statistic for the joint parameters (M, θ0, . . . , θM−1),
P(M = m|x) = P(M = m|S(x)) .
34. ABC Methods for Bayesian Model Choice
Gibbs random fields
Sufficient statistics in Gibbs random fields
Each model m has its own sufficient statistic Sm(·) and S(·) = (S0(·), . . . , SM−1(·)) is also (model-)sufficient.
Explanation: For Gibbs random fields,
x|M = m ∼ fm(x|θm) = fm¹(x|S(x)) fm²(S(x)|θm)
                   = (1/n(S(x))) fm²(S(x)|θm)
where
n(S(x)) = #{x̃ ∈ X : S(x̃) = S(x)}
⇒ S(x) is therefore also sufficient for the joint parameters
35. ABC Methods for Bayesian Model Choice
Gibbs random fields
Toy example
iid Bernoulli model versus two-state first-order Markov chain, i.e.
f0(x|θ0) = exp(θ0 Σ_{i=1}^{n} I{xi = 1}) / {1 + exp(θ0)}^n ,
versus
f1(x|θ1) = (1/2) exp(θ1 Σ_{i=2}^{n} I{xi = xi−1}) / {1 + exp(θ1)}^{n−1} ,
with priors θ0 ∼ U(−5, 5) and θ1 ∼ U(0, 6) (inspired by “phase transition” boundaries).
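The model-wise sufficient statistics of this toy example are simple counts, as the short Python sketch below shows (the function name is ours, for illustration only).

```python
import numpy as np

def toy_statistics(x):
    """Sufficient statistics for the toy example:
    S0 = #{i : x_i = 1}        (iid Bernoulli model),
    S1 = #{i >= 2 : x_i = x_{i-1}}  (two-state Markov chain model)."""
    x = np.asarray(x)
    s0 = int(np.sum(x == 1))
    s1 = int(np.sum(x[1:] == x[:-1]))
    return s0, s1

print(toy_statistics([1, 1, 0, 0, 1]))  # (3, 2)
```

Running ABC-MC with the pair (S0, S1) as summary is then exact in the ε → 0 limit, since the concatenated statistic is sufficient across both models, per the Gibbs random field argument above.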
36. ABC Methods for Bayesian Model Choice
Gibbs random fields
Toy example (2)
[Figure: true versus ABC-estimated log Bayes factors]
(left) Comparison of the true BF_{m0/m1}(x0) with the ABC estimate (in logs) over 2,000 simulations and 4·10⁶ proposals from the prior. (right) Same when using a tolerance ε corresponding to the 1% quantile on the distances.
39. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Back to sufficiency
‘Sufficient statistics for individual models are unlikely to
be very informative for the model probability. This is
already well known and understood by the ABC-user
community.’
[Scott Sisson, Jan. 31, 2011, ’Og]
If η1 (x) sufficient statistic for model m = 1 and parameter θ1 and
η2 (x) sufficient statistic for model m = 2 and parameter θ2 ,
(η1 (x), η2 (x)) is not always sufficient for (m, θm )
⇒ Potential loss of information at the testing level
41. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Limiting behaviour of B12 (T → ∞)
ABC approximation
B12^ε(y) = Σ_{t=1}^{T} I_{mt=1} I_{ρ{η(zt),η(y)}≤ε} / Σ_{t=1}^{T} I_{mt=2} I_{ρ{η(zt),η(y)}≤ε} ,
where the (mt, zt)'s are simulated from the (joint) prior
As T goes to infinity, limit
B12^ε(y) = ∫ I_{ρ{η(z),η(y)}≤ε} π1(θ1) f1(z|θ1) dz dθ1 / ∫ I_{ρ{η(z),η(y)}≤ε} π2(θ2) f2(z|θ2) dz dθ2
         = ∫ I_{ρ{η,η(y)}≤ε} π1(θ1) f1^η(η|θ1) dη dθ1 / ∫ I_{ρ{η,η(y)}≤ε} π2(θ2) f2^η(η|θ2) dη dθ2 ,
where f1^η(η|θ1) and f2^η(η|θ2) distributions of η(z)
43. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Limiting behaviour of B12 ( → 0)
When ε goes to zero,
B12^ε(y) → B12^η(y) = ∫ π1(θ1) f1^η(η(y)|θ1) dθ1 / ∫ π2(θ2) f2^η(η(y)|θ2) dθ2 ,
Bayes factor based on the sole observation of η(y)
45. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Limiting behaviour of B12 (under sufficiency)
If η(y) sufficient statistic in both models,
fi(y|θi) = gi(y) fi^η(η(y)|θi)
Thus
B12(y) = ∫_{Θ1} π(θ1) g1(y) f1^η(η(y)|θ1) dθ1 / ∫_{Θ2} π(θ2) g2(y) f2^η(η(y)|θ2) dθ2
       = [g1(y)/g2(y)] ∫ π1(θ1) f1^η(η(y)|θ1) dθ1 / ∫ π2(θ2) f2^η(η(y)|θ2) dθ2
       = [g1(y)/g2(y)] B12^η(y) .
[Didelot, Everitt, Johansen & Lawson, 2011]
⇒ No discrepancy only when cross-model sufficiency
46. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Poisson/geometric example
Sample
x = (x1, . . . , xn)
from either a Poisson P(λ) or from a geometric G(p)
Sum
S = Σ_{i=1}^{n} xi = η(x)
sufficient statistic for either model but not simultaneously
Discrepancy ratio
g1(x)/g2(x) = [S! n^{−S} / Πi xi!] / [1 / C(n+S−1, S)]
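The discrepancy ratio above is easy to evaluate exactly; the following Python sketch implements it term by term (g1 from the multinomial conditional under the Poisson model, g2 from the uniform conditional under the geometric model).

```python
from math import comb, factorial, prod

def discrepancy_ratio(x):
    """g1(x)/g2(x) for the Poisson vs geometric example:
    g1(x) = S! n^{-S} / prod_i x_i!   (Poisson: x | S is multinomial),
    g2(x) = 1 / C(n+S-1, S)           (geometric: x | S is uniform),
    with S = sum_i x_i and n = len(x)."""
    n, S = len(x), sum(x)
    g1 = factorial(S) * n ** (-S) / prod(factorial(xi) for xi in x)
    g2 = 1 / comb(n + S - 1, S)
    return g1 / g2

print(discrepancy_ratio([1, 1]))  # → 1.5
```

Since this ratio varies with x, the summary-based Bayes factor B12^η(x) can differ arbitrarily from B12(x), which is exactly the point of the next slide.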
47. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Poisson/geometric discrepancy
Range of B12^η(x) versus B12(x): the values produced have nothing in common.
49. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Formal recovery
Creating an encompassing exponential family
f(x|θ1, θ2, α1, α2) ∝ exp{θ1ᵀ η1(x) + θ2ᵀ η2(x) + α1 t1(x) + α2 t2(x)}
leads to a sufficient statistic (η1(x), η2(x), t1(x), t2(x))
[Didelot, Everitt, Johansen & Lawson, 2011]
In the Poisson/geometric case, if Πi xi! is added to S, no discrepancy
50. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Formal recovery
Only applies in genuine sufficiency settings...
⇒ Inability to evaluate loss brought by summary statistics
52. ABC Methods for Bayesian Model Choice
Generic ABC model choice
The Pitman–Koopman lemma
Efficient sufficiency is not such a common occurrence:
Lemma
A necessary and sufficient condition for the existence of a sufficient
statistic with fixed dimension whatever the sample size is that the
sampling distribution belongs to an exponential family.
[Pitman, 1933; Koopman, 1933]
Provision of fixed support (consider U(0, θ) counterexample)
54. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Meaning of the ABC-Bayes factor
‘This is also why focus on model discrimination typically
(...) proceeds by (...) accepting that the Bayes Factor
that one obtains is only derived from the summary
statistics and may in no way correspond to that of the
full model.’
[Scott Sisson, Jan. 31, 2011, ’Og]
In the Poisson/geometric case, if E[yi] = θ0 > 0,
lim_{n→∞} B12^η(y) = [(θ0 + 1)² / θ0] e^{−θ0}
55. ABC Methods for Bayesian Model Choice
Generic ABC model choice
MA example
[Figure: barplots of ABC model frequencies]
Evolution [against ε] of ABC Bayes factor, in terms of frequencies of visits to models MA(1) (left) and MA(2) (right) when ε equal to 10, 1, .1, .01% quantiles on insufficient autocovariance distances. Sample of 50 points from a MA(2) with θ1 = 0.6, θ2 = 0.2. True Bayes factor equal to 17.71.
56. ABC Methods for Bayesian Model Choice
Generic ABC model choice
MA example
[Figure: barplots of ABC model frequencies]
Evolution [against ε] of ABC Bayes factor, in terms of frequencies of visits to models MA(1) (left) and MA(2) (right) when ε equal to 10, 1, .1, .01% quantiles on insufficient autocovariance distances. Sample of 50 points from a MA(1) model with θ1 = 0.6. True Bayes factor B21 equal to .004.
58. ABC Methods for Bayesian Model Choice
Generic ABC model choice
A population genetics evaluation
Population genetics example with
3 populations
2 scenarios
15 individuals
5 loci
single mutation parameter
24 summary statistics
2 million ABC proposals
importance [tree] sampling alternative
64. ABC Methods for Bayesian Model Choice
Generic ABC model choice
A second population genetics experiment
three populations, two divergent 100 gen. ago
two scenarios [third pop. recent admixture between first two pop. / diverging from pop. 1 5 gen. ago]
In scenario 1, admixture rate 0.7 from pop. 1
100 datasets with 100 diploid individuals per population, 50 independent microsatellite loci.
Effective population size of 1000 and mutation rates of 0.0005.
6 parameters: admixture rate (U[0.1, 0.9]), three effective population sizes (U[200, 2000]), the time of admixture/second divergence (U[1, 10]) and time of first divergence (U[50, 500]).
65. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Results
IS algorithm performed with 100 coalescent trees per particle and
50,000 particles [12 calendar days using 376 processors]
Using ten times as many loci and seven times as many individuals degrades the confidence in the importance sampling approximation because of an increased variability in the likelihood.
[Figure: scatterplot of importance sampling estimates of P(M=1|y) with 1,000 particles]
66. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Results
Blurs potential divergence between ABC and genuine posterior probabilities because both are overwhelmingly close to one, due to the high information content of the data.
[Figure: importance sampling estimates of P(M=1|y) with 50,000 particles against ABC estimates of P(M=1|y)]
68. ABC Methods for Bayesian Model Choice
Generic ABC model choice
The only safe cases???
Besides specific models like Gibbs random fields,
using distances over the data itself escapes the discrepancy...
[Toni & Stumpf, 2010; Sousa et al., 2009]
...and so does the use of more informal model fitting measures
[Ratmann, Andrieu, Richardson and Wiuf, 2009]
69. ABC Methods for Bayesian Model Choice
Model choice consistency
ABC model choice consistency
Approximate Bayesian computation
ABC for model choice
Gibbs random fields
Generic ABC model choice
Model choice consistency
71. ABC Methods for Bayesian Model Choice
Model choice consistency
The starting point
Central question to the validation of ABC for model choice:
When is a Bayes factor based on an insufficient statistic
T (y) consistent?
Note: conclusion drawn on T(y) through B12^T(y) necessarily differs from the conclusion drawn on y through B12(y)
73. ABC Methods for Bayesian Model Choice
Model choice consistency
A benchmark, if toy, example
Comparison suggested by referee of PNAS paper:
[X, Cornuet, Marin, & Pillai, Aug. 2011]
Model M1: y ∼ N(θ1, 1) opposed to model M2: y ∼ L(θ2, 1/√2), Laplace distribution with mean θ2 and scale parameter 1/√2 (variance one).
Four possible statistics
1. sample mean y̅ (sufficient for both M1 and M2);
2. sample median med(y) (insufficient);
3. sample variance var(y) (ancillary);
4. median absolute deviation mad(y) = med(|y − med(y)|);
74. ABC Methods for Bayesian Model Choice
Model choice consistency
A benchmark, if toy, example
Comparison suggested by referee of PNAS paper:
[X, Cornuet, Marin, & Pillai, Aug. 2011]
Model M1: y ∼ N(θ1, 1) opposed to model M2: y ∼ L(θ2, 1/√2), Laplace distribution with mean θ2 and scale parameter 1/√2 (variance one).
[Figure: density of the posterior probability of model M1]
75. ABC Methods for Bayesian Model Choice
Model choice consistency
A benchmark, if toy, example
Comparison suggested by referee of PNAS paper:
[X, Cornuet, Marin, & Pillai, Aug. 2011]
Model M1: y ∼ N(θ1, 1) opposed to model M2: y ∼ L(θ2, 1/√2), Laplace distribution with mean θ2 and scale parameter 1/√2 (variance one).
[Figure: densities of the posterior probability (left) and probability (right)]
76. ABC Methods for Bayesian Model Choice
Model choice consistency
Framework
Starting from sample y = (y1, . . . , yn), the observed sample, not necessarily iid, with true distribution y ∼ Pⁿ
Summary statistics T(y) = Tⁿ = (T1(y), T2(y), · · · , Td(y)) ∈ R^d with true distribution Tⁿ ∼ Gn.
77. ABC Methods for Bayesian Model Choice
Model choice consistency
Framework
Comparison of
– under M1 , y ∼ F1,n (·|θ1 ) where θ1 ∈ Θ1 ⊂ Rp1
– under M2 , y ∼ F2,n (·|θ2 ) where θ2 ∈ Θ2 ⊂ Rp2
turned into
– under M1 , T (y) ∼ G1,n (·|θ1 ), and θ1 |T (y) ∼ π1 (·|T n )
– under M2 , T (y) ∼ G2,n (·|θ2 ), and θ2 |T (y) ∼ π2 (·|T n )
78. ABC Methods for Bayesian Model Choice
Model choice consistency
Assumptions
A collection of asymptotic “standard” assumptions:
[A1] There exist a sequence {vn} converging to +∞, an a.c. distribution Q with continuous bounded density q(·), a symmetric d × d positive definite matrix V0 and a vector µ0 ∈ R^d such that
vn V0^{−1/2}(Tⁿ − µ0) ⇝ Q under Gn, as n → ∞,
and for all M > 0,
sup_{vn|t−µ0|<M} | |V0|^{1/2} vn^{−d} gn(t) − q(vn V0^{−1/2}{t − µ0}) | = o(1)
79. ABC Methods for Bayesian Model Choice
Model choice consistency
Assumptions
A collection of asymptotic “standard” assumptions:
[A2] For i = 1, 2, there exist d × d symmetric positive definite matrices Vi(θi) and vectors µi(θi) ∈ R^d such that

\[ v_n V_i(\theta_i)^{-1/2}\,(T^n - \mu_i(\theta_i)) \xrightarrow{\ n\to\infty\ } Q \quad\text{under } G_{i,n}(\cdot\,|\,\theta_i). \]
80. ABC Methods for Bayesian Model Choice
Model choice consistency
Assumptions
A collection of asymptotic “standard” assumptions:
[A3] For i = 1, 2, there exist sets Fn,i ⊂ Θi and constants εi, τi, αi > 0 such that for all τ > 0,

\[ \sup_{\theta_i \in F_{n,i}} G_{i,n}\big(|T^n - \mu_i(\theta_i)| > \tau\,|\mu_i(\theta_i) - \mu_0| \wedge \epsilon_i \,\big|\, \theta_i\big) \;\lesssim\; v_n^{-\alpha_i}\,\big(|\mu_i(\theta_i) - \mu_0| \wedge \epsilon_i\big)^{-\alpha_i} \]

with

\[ \pi_i\big(F_{n,i}^{\,c}\big) = o\big(v_n^{-\tau_i}\big). \]
81. ABC Methods for Bayesian Model Choice
Model choice consistency
Assumptions
A collection of asymptotic “standard” assumptions:
[A4] For u > 0, let

\[ S_{n,i}(u) = \big\{\theta_i \in F_{n,i}\,;\ |\mu_i(\theta_i) - \mu_0| \le u\, v_n^{-1}\big\}. \]

If inf{|µi(θi) − µ0|; θi ∈ Θi} = 0, there exist constants di < τi ∧ αi − 1 such that

\[ \pi_i\big(S_{n,i}(u)\big) \sim u^{d_i}\, v_n^{-d_i}, \qquad \forall\, u \lesssim v_n \]
82. ABC Methods for Bayesian Model Choice
Model choice consistency
Assumptions
A collection of asymptotic “standard” assumptions:
[A5] If inf{|µi(θi) − µ0|; θi ∈ Θi} = 0, there exists U > 0 such that for any M > 0,

\[ \sup_{v_n|t-\mu_0|<M}\ \sup_{\theta_i \in S_{n,i}(U)} \Big|\, |V_i(\theta_i)|^{1/2}\, v_n^{-d}\, g_i(t|\theta_i) - q\big(v_n V_i(\theta_i)^{-1/2}(t - \mu_i(\theta_i))\big) \Big| = o(1) \]

and

\[ \lim_{M\to\infty}\ \limsup_n\ \frac{\pi_i\big(S_{n,i}(U) \cap \{\|V_i(\theta_i)^{-1}\| + \|V_i(\theta_i)\| > M\}\big)}{\pi_i\big(S_{n,i}(U)\big)} = 0\,. \]
83. ABC Methods for Bayesian Model Choice
Model choice consistency
Assumptions
A collection of asymptotic “standard” assumptions:
[A1]–[A2] are standard central limit theorems;
[A3] controls the large deviations of the estimator T^n from the estimand µ(θ);
[A4] is the standard prior mass condition found in Bayesian asymptotics;
[A5] controls convergence more tightly, especially when µi is not one-to-one.
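The CLT-type assumptions [A1]–[A2] can be checked empirically in simple cases. A Monte Carlo sketch (an illustration added here, not part of the talk) with T^n the sample mean of n standard normal draws, v_n = √n and µ0 = 0: the spread of v_n(T^n − µ0) should stabilise (here near 1) as n grows, reflecting convergence to a fixed limit law Q:

```python
import math
import random
import statistics

def spread_of_scaled_stat(n, reps=2000, seed=0):
    """Std. dev. of v_n (T^n - mu_0) over `reps` replications,
    with T^n the sample mean of n N(0,1) draws, v_n = sqrt(n), mu_0 = 0."""
    rng = random.Random(seed)
    vals = [math.sqrt(n) * statistics.fmean(rng.gauss(0.0, 1.0) for _ in range(n))
            for _ in range(reps)]
    return statistics.stdev(vals)

for n in (10, 100, 1000):
    # roughly constant (close to 1) for every n: the sqrt(n) scaling is right
    print(n, round(spread_of_scaled_stat(n), 3))
```

With a wrong rate (say v_n = n) the printed spread would grow with n instead of stabilising.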
84. ABC Methods for Bayesian Model Choice
Model choice consistency
Asymptotic marginals
Asymptotically, under [A1]–[A5], the marginal

\[ m_i(t) = \int_{\Theta_i} g_i(t|\theta_i)\, \pi_i(\theta_i)\, d\theta_i \]

is such that

(i) if inf{|µi(θi) − µ0|; θi ∈ Θi} = 0,

\[ C_l\, v_n^{d-d_i} \le m_i(T^n) \le C_u\, v_n^{d-d_i} \]

and

(ii) if inf{|µi(θi) − µ0|; θi ∈ Θi} > 0,

\[ m_i(T^n) = o_{P_n}\big[v_n^{d-\tau_i} + v_n^{d-\alpha_i}\big]. \]
85. ABC Methods for Bayesian Model Choice
Model choice consistency
Within-model consistency
Under the same assumptions, if inf{|µi(θi) − µ0|; θi ∈ Θi} = 0, the posterior distribution of µi(θi) given T^n is consistent at rate 1/vn, provided αi ∧ τi > di.
86. ABC Methods for Bayesian Model Choice
Model choice consistency
Within-model consistency
Note: di can be seen as an effective dimension of the model under the posterior πi(·|T^n), since if µ0 ∈ {µi(θi); θi ∈ Θi},

\[ m_i(T^n) \sim v_n^{d-d_i} \quad\text{and}\quad g_n(T^n) \sim v_n^{d} \]
87. ABC Methods for Bayesian Model Choice
Model choice consistency
Between-model consistency
Consequence of above is that asymptotic behaviour of the Bayes
factor is driven by the asymptotic mean value of T n under both
models. And only by this mean value!
Indeed, if

\[ \inf\{|\mu_0 - \mu_2(\theta_2)|;\ \theta_2 \in \Theta_2\} = \inf\{|\mu_0 - \mu_1(\theta_1)|;\ \theta_1 \in \Theta_1\} = 0, \]

then

\[ C_l\, v_n^{-(d_1-d_2)} \le \frac{m_1(T^n)}{m_2(T^n)} \le C_u\, v_n^{-(d_1-d_2)}, \]

where Cl, Cu = OPn(1), irrespective of the true model. The ratio only depends on the difference d1 − d2.
88. ABC Methods for Bayesian Model Choice
Model choice consistency
Between-model consistency
Else, if

\[ \inf\{|\mu_0 - \mu_2(\theta_2)|;\ \theta_2 \in \Theta_2\} > \inf\{|\mu_0 - \mu_1(\theta_1)|;\ \theta_1 \in \Theta_1\} = 0, \]

then

\[ \frac{m_1(T^n)}{m_2(T^n)} \ge C_u\, \min\big(v_n^{-(d_1-\alpha_2)},\ v_n^{-(d_1-\tau_2)}\big). \]
89. ABC Methods for Bayesian Model Choice
Model choice consistency
Consistency theorem
If

\[ \inf\{|\mu_0 - \mu_2(\theta_2)|;\ \theta_2 \in \Theta_2\} = \inf\{|\mu_0 - \mu_1(\theta_1)|;\ \theta_1 \in \Theta_1\} = 0, \]

the Bayes factor B12^T is of order

\[ O\big(v_n^{-(d_1-d_2)}\big), \]

irrespective of the true model. It is consistent iff Pn is within the model with the smallest dimension.
90. ABC Methods for Bayesian Model Choice
Model choice consistency
Consistency theorem
If Pn belongs to one of the two models and if µ0 cannot be attained by the other one, i.e.

\[ 0 = \min\big(\inf\{|\mu_0 - \mu_i(\theta_i)|;\ \theta_i \in \Theta_i\},\ i = 1, 2\big) < \max\big(\inf\{|\mu_0 - \mu_i(\theta_i)|;\ \theta_i \in \Theta_i\},\ i = 1, 2\big), \]

then the Bayes factor B12^T is consistent.
91. ABC Methods for Bayesian Model Choice
Model choice consistency
Consequences on summary statistics
Bayes factor driven by the means µi (θi ) and the relative position of
µ0 wrt both sets {µi (θi ); θi ∈ Θi }, i = 1, 2.
For ABC, this implies the most likely statistics T n are ancillary
statistics with different mean values under both models
Else, if T n asymptotically depends on some of the parameters of
the models, it is quite likely that there exists θi ∈ Θi such that
µi (θi ) = µ0 even though model M1 is misspecified
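This requirement can be checked numerically for the Gauss-versus-Laplace example. The sketch below (illustrative, not from the talk) estimates the large-sample mean of two candidate statistics under each model: the sample variance has the same limit (one) under both models and so cannot separate them, whereas mad(y) has different limits (Φ⁻¹(3/4) ≈ 0.675 under the Gaussian, ln 2/√2 ≈ 0.490 under the Laplace), whatever the location θ:

```python
import math
import random
import statistics

def mad(xs):
    """Median absolute deviation: med(|y - med(y)|)."""
    m = statistics.median(xs)
    return statistics.median([abs(x - m) for x in xs])

def rlaplace(theta, b, n, rng):
    """Laplace(theta, b) sample via inversion."""
    out = []
    for _ in range(n):
        u = rng.random() - 0.5
        out.append(theta - b * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u)))
    return out

def mean_stat(sampler, stat, theta, n=400, reps=300, seed=0):
    """Monte Carlo estimate of E[stat] for samples of size n."""
    rng = random.Random(seed)
    return statistics.fmean(stat(sampler(theta, n, rng)) for _ in range(reps))

gauss = lambda theta, n, rng: [rng.gauss(theta, 1.0) for _ in range(n)]
laplace = lambda theta, n, rng: rlaplace(theta, 1.0 / math.sqrt(2.0), n, rng)

for theta in (0.0, 1.0):
    print(theta,
          round(mean_stat(gauss, mad, theta), 3),      # ~0.675 under M1
          round(mean_stat(laplace, mad, theta), 3),    # ~0.490 under M2
          round(mean_stat(gauss, statistics.variance, theta), 3),    # ~1 under M1
          round(mean_stat(laplace, statistics.variance, theta), 3))  # ~1 under M2
```

Because mad is location-invariant, its limits do not depend on θ: its asymptotic mean values are separated under the two models for every parameter value, which is exactly the condition the Bayes factor needs.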
92. ABC Methods for Bayesian Model Choice
Model choice consistency
Consequences on summary statistics
Toy example, Laplace versus Gauss: take

\[ T^n = n^{-1} \sum_{i=1}^{n} X_i^4 \]

and let the true distribution be Laplace with mean 0, so that µ0 = 6. Since under the Gaussian model

\[ \mu(\theta) = 3 + \theta^4 + 6\theta^2, \]

the value

\[ \theta^* = \sqrt{2\sqrt{3} - 3} \]

leads to µ0 = µ(θ*), and a Bayes factor associated with such a statistic is not consistent (here d1 = d2 = d = 1).
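The algebra behind θ* can be verified directly: solving θ⁴ + 6θ² + 3 = 6 gives θ² = 2√3 − 3, and for the Laplace scale b = 1/√2 the fourth moment E[X⁴] = 24b⁴ equals 6. A quick numerical check:

```python
import math

# Gaussian N(theta, 1): E[X^4] = theta^4 + 6*theta^2 + 3
theta_star = math.sqrt(2.0 * math.sqrt(3.0) - 3.0)
mu_star = theta_star**4 + 6.0 * theta_star**2 + 3.0
print(mu_star)  # 6.0 up to rounding: the Gaussian model attains mu_0

# Laplace(0, b) with b = 1/sqrt(2): E[X^4] = 24 b^4
b = 1.0 / math.sqrt(2.0)
print(24.0 * b**4)  # 6.0 up to rounding: the true value mu_0
```

So both models can produce the same asymptotic mean of the fourth-moment statistic, which is why this statistic cannot discriminate between them.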
93. ABC Methods for Bayesian Model Choice
Model choice consistency
Consequences on summary statistics
[Figure: density over the probability scale, for the fourth moment statistic]
94. ABC Methods for Bayesian Model Choice
Model choice consistency
Consequences on summary statistics
[Figure: densities over the probability scale, for the fourth and sixth moment statistics]
95. ABC Methods for Bayesian Model Choice
Model choice consistency
Embedded models
When M1 is a submodel of M2, and if the true distribution belongs to the smaller model M1, the Bayes factor is of order

\[ v_n^{-(d_1-d_2)}. \]

If the summary statistic is only informative on a parameter that is the same under both models, i.e. if d1 = d2, then the Bayes factor is not consistent.
Else, d1 < d2 and the Bayes factor is consistent under M1. If the true distribution is not in M1, then the Bayes factor is consistent only if µ1 ≠ µ2 = µ0.