This document discusses approximate Bayesian computation (ABC) methods for Bayesian model choice when likelihoods are intractable. It introduces ABC, which approximates the posterior by simulating data under parameter values drawn from the prior and accepting those simulations that are close to the observed data according to a distance measure. It then discusses applying ABC to model choice by simulating model indices, parameters and data from the candidate models. The document illustrates ABC on an MA time series model, compares distance measures, and outlines a generic ABC algorithm for Bayesian model choice.
Edinburgh, Bayes-250
1. ABC Methods for Bayesian Model Choice
ABC Methods for Bayesian Model Choice
Christian P. Robert
Université Paris-Dauphine, IUF, & CREST
http://www.ceremade.dauphine.fr/~xian
Bayes-250, Edinburgh, September 6, 2011
2. ABC Methods for Bayesian Model Choice
Approximate Bayesian computation
Approximate Bayesian computation
Approximate Bayesian computation
ABC for model choice
Gibbs random fields
Generic ABC model choice
3. ABC Methods for Bayesian Model Choice
Approximate Bayesian computation
Regular Bayesian computation issues
When faced with a non-standard posterior distribution
π(θ|y) ∝ π(θ)L(θ|y)
the standard solution is to use simulation (Monte Carlo) to
produce a sample
θ1 , . . . , θT
from π(θ|y) (or approximately by Markov chain Monte Carlo methods)
[Robert & Casella, 2004]
4. ABC Methods for Bayesian Model Choice
Approximate Bayesian computation
Intractable likelihoods
Cases when the likelihood function f (y|θ) is unavailable and when the completion step
f (y|θ) = ∫_Z f (y, z|θ) dz
is impossible or too costly because of the dimension of z
⇒ MCMC cannot be implemented!
6. ABC Methods for Bayesian Model Choice
Approximate Bayesian computation
The ABC method
Bayesian setting: target is π(θ)f (x|θ)
When the likelihood f (x|θ) is not in closed form, a likelihood-free rejection technique applies:
ABC algorithm
For an observation y ∼ f (y|θ), under the prior π(θ), keep jointly simulating
θ′ ∼ π(θ) , z ∼ f (z|θ′) ,
until the auxiliary variable z is equal to the observed value, z = y.
[Tavaré et al., 1997]
9. ABC Methods for Bayesian Model Choice
Approximate Bayesian computation
A as approximative
When y is a continuous random variable, equality z = y is replaced with a tolerance condition,
ρ(y, z) ≤ ε
where ρ is a distance
Output distributed from
π(θ) Pθ {ρ(y, z) < ε} ∝ π(θ | ρ(y, z) < ε)
11. ABC Methods for Bayesian Model Choice
Approximate Bayesian computation
ABC algorithm
Algorithm 1 Likelihood-free rejection sampler
for i = 1 to N do
  repeat
    generate θ′ from the prior distribution π(·)
    generate z from the likelihood f (·|θ′)
  until ρ{η(z), η(y)} ≤ ε
  set θi = θ′
end for
where η(y) defines a (maybe in-sufficient) statistic
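Algorithm 1 can be sketched in a few lines of generic code. The following minimal illustration uses a conjugate Normal toy model (an assumption for demonstration only, chosen so the exact posterior is known) rather than a genuinely intractable likelihood; all function names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def abc_rejection(y_obs, prior_sample, simulate, stat, eps, n_keep):
    """Likelihood-free rejection sampler (Algorithm 1): keep theta' ~ prior
    whenever the simulated summary eta(z) is within eps of eta(y)."""
    accepted = []
    while len(accepted) < n_keep:
        theta = prior_sample()          # generate theta' from the prior
        z = simulate(theta)             # generate z from the likelihood
        if abs(stat(z) - stat(y_obs)) <= eps:
            accepted.append(theta)      # set theta_i = theta'
    return np.array(accepted)

# Hypothetical conjugate toy model: theta ~ N(0, 1), y | theta ~ N(theta, 1)
# with n = 10 observations, and eta = sample mean (here a sufficient statistic).
n = 10
y_obs = rng.normal(1.0, 1.0, size=n)
post = abc_rejection(
    y_obs,
    prior_sample=lambda: rng.normal(0.0, 1.0),
    simulate=lambda th: rng.normal(th, 1.0, size=n),
    stat=lambda x: x.mean(),
    eps=0.05,
    n_keep=500,
)
# The exact posterior is N(n * ybar / (n + 1), 1 / (n + 1)), so the ABC
# sample mean should sit close to n * y_obs.mean() / (n + 1).
print(post.mean(), post.std())
```

Shrinking eps trades acceptance rate for accuracy, which is the tension the next slides formalise.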
12. ABC Methods for Bayesian Model Choice
Approximate Bayesian computation
Output
The likelihood-free algorithm samples from the marginal in z of:
π_ε(θ, z|y) = π(θ) f (z|θ) I_{A_{ε,y}}(z) / ∫_{A_{ε,y}×Θ} π(θ) f (z|θ) dz dθ ,
where A_{ε,y} = {z ∈ D : ρ(η(z), η(y)) < ε}.
The idea behind ABC is that the summary statistics coupled with a
small tolerance should provide a good approximation of the
posterior distribution:
π_ε(θ|y) = ∫ π_ε(θ, z|y) dz ≈ π(θ|y) .
14. ABC Methods for Bayesian Model Choice
Approximate Bayesian computation
MA example
Consider the MA(q) model
x_t = ε_t + Σ_{i=1}^{q} ϑ_i ε_{t−i}
Simple prior: uniform prior over the identifiability zone, e.g. the triangle for MA(2)
15. ABC Methods for Bayesian Model Choice
Approximate Bayesian computation
MA example (2)
The ABC algorithm is thus made of
1. picking a new value (ϑ1 , ϑ2 ) in the triangle
2. generating an iid sequence (ε_t)_{−q<t≤T}
3. producing a simulated series (x′_t)_{1≤t≤T}
Distance: basic distance between the series
ρ((x_t)_{1≤t≤T} , (x′_t)_{1≤t≤T}) = Σ_{t=1}^{T} (x_t − x′_t)²
or between summary statistics like the first q autocorrelations
τ_j = Σ_{t=j+1}^{T} x_t x_{t−j}
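The simulation step and both distances can be sketched as follows (the parameter values, series length, and function names are illustrative assumptions, not taken from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_ma2(theta1, theta2, T):
    """Simulate (x_t)_{1<=t<=T} from an MA(2): x_t = e_t + theta1*e_{t-1} + theta2*e_{t-2}."""
    e = rng.normal(size=T + 2)              # iid sequence (e_t) for t = -1, 0, ..., T
    return e[2:] + theta1 * e[1:-1] + theta2 * e[:-2]

def raw_distance(x, z):
    """Basic squared distance between the two series."""
    return float(np.sum((x - z) ** 2))

def autocov_distance(x, z, q=2):
    """Squared distance between the first q summary statistics tau_j = sum_t x_t x_{t-j}."""
    tau = lambda s, j: float(np.sum(s[j:] * s[:-j]))
    return sum((tau(x, j) - tau(z, j)) ** 2 for j in range(1, q + 1))

x_obs = simulate_ma2(0.6, 0.2, T=100)   # pretend-observed series
z_sim = simulate_ma2(0.5, 0.1, T=100)   # series simulated at a candidate value
print(raw_distance(x_obs, z_sim), autocov_distance(x_obs, z_sim))
```

The raw distance is dominated by the (unobservable) noise realisation, which is why the summary-statistic distance behaves differently in the comparison that follows.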
17. ABC Methods for Bayesian Model Choice
Approximate Bayesian computation
Comparison of distance impact
Evaluation of the tolerance ε on the ABC sample against both distances (ε = 100%, 10%, 1%, 0.1%) for an MA(2) model
[Histograms of the ABC samples of θ1 (left) and θ2 (right) for each tolerance level ε = 100%, 10%, 1%, 0.1%]
20. ABC Methods for Bayesian Model Choice
ABC for model choice
ABC for model choice
Approximate Bayesian computation
ABC for model choice
Gibbs random fields
Generic ABC model choice
21. ABC Methods for Bayesian Model Choice
ABC for model choice
Bayesian model choice
Principle
Several models M1 , M2 , . . . are considered simultaneously for the dataset y, with the model index M central to inference.
Use of a prior π(M = m), plus a prior distribution on the
parameter conditional on the value m of the model index, πm (θ m )
Goal is to derive the posterior distribution of M, a challenging
computational target when models are complex.
22. ABC Methods for Bayesian Model Choice
ABC for model choice
Generic ABC for model choice
Algorithm 2 Likelihood-free model choice sampler (ABC-MC)
for t = 1 to T do
repeat
Generate m from the prior π(M = m)
Generate θ m from the prior πm (θ m )
Generate z from the model fm (z|θ m )
until ρ{η(z), η(y)} < ε
Set m(t) = m and θ (t) = θ m
end for
[Toni, Welch, Strelkowa, Ipsen & Stumpf, 2009]
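A minimal sketch of the ABC-MC sampler, using the Poisson versus geometric pair that reappears later in the talk; the priors (λ ∼ Exp(1), p ∼ U(0, 1)), the summary η(x) = sample mean, and the tolerance are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

def abc_model_choice(y_obs, T=2000, eps=0.5):
    """Likelihood-free model choice sampler (ABC-MC), sketched for a
    Poisson P(lambda) (m = 1) versus geometric G(p) (m = 2) pair.

    Assumed priors: pi(M = 1) = pi(M = 2) = 1/2, lambda ~ Exp(1), p ~ U(0, 1);
    assumed summary statistic: eta(x) = sample mean."""
    n = len(y_obs)
    eta_obs = y_obs.mean()
    models = []
    while len(models) < T:
        m = rng.integers(1, 3)                 # generate m from pi(M = m)
        if m == 1:
            lam = rng.exponential(1.0)         # theta_m from pi_m
            z = rng.poisson(lam, size=n)       # z from f_m(z | theta_m)
        else:
            p = rng.uniform()
            z = rng.geometric(p, size=n) - 1   # shifted to support {0, 1, 2, ...}
        if abs(z.mean() - eta_obs) <= eps:     # rho{eta(z), eta(y)} <= eps
            models.append(m)
    models = np.array(models)
    # pi(M = m | y) estimated by the frequency of acceptances from model m
    return {m: float(np.mean(models == m)) for m in (1, 2)}

y_obs = rng.poisson(2.0, size=30)
print(abc_model_choice(y_obs))
```

The acceptance frequencies are exactly the posterior-probability estimates discussed on the next slide.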
23. ABC Methods for Bayesian Model Choice
ABC for model choice
ABC estimates
Posterior probability π(M = m|y) approximated by the frequency
of acceptances from model m
(1/T) Σ_{t=1}^{T} I_{m(t) = m} .
Early issues with implementation:
should tolerances be the same for all models?
should summary statistics vary across models? incl. their dimension?
should the distance measure ρ vary across models?
Answers:
ε then needs to become part of the model
varying statistics are incompatible with Bayesian model choice proper
ρ is then part of the model
Extension to a weighted polychotomous logistic regression estimate
of π(M = m|y), with non-parametric kernel weights
[Cornuet et al., DIYABC, 2009]
25. ABC Methods for Bayesian Model Choice
ABC for model choice
The great ABC controversy
Ongoing controversy in phylogeographic genetics about the validity of using ABC for testing
Against: Templeton, 2008, 2009, 2010a, 2010b, 2010c, &tc argues that nested hypotheses cannot have higher probabilities than nesting hypotheses (!)
Replies: Fagundes et al., 2008, Beaumont et al., 2010, Berger et al., 2010, Csilléry et al., 2010 point out that the criticisms are addressed at [Bayesian] model-based inference and have nothing to do with ABC...
27. ABC Methods for Bayesian Model Choice
Gibbs random fields
Potts model
Potts model
The potential Σ_{c∈C} Vc (y) is of the form
Σ_{c∈C} Vc (y) = θ S(y) = θ Σ_{l∼i} δ_{y_l = y_i}
where l∼i denotes a neighbourhood structure
In most realistic settings, the summation
Zθ = Σ_{x∈X} exp{θᵀ S(x)}
involves too many terms to be manageable and numerical approximations cannot always be trusted
[Cucala et al., 2009]
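Although Zθ is intractable, the sufficient statistic S(y) itself is cheap to compute. A minimal sketch for a first-order (4-neighbour) lattice; the grid size and number of colours are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def potts_stat(y):
    """S(y) = sum of delta_{y_l = y_i} over neighbouring pairs l~i,
    here for a first-order (4-neighbour) lattice structure."""
    horiz = np.sum(y[:, 1:] == y[:, :-1])   # east-west neighbour pairs
    vert = np.sum(y[1:, :] == y[:-1, :])    # north-south neighbour pairs
    return int(horiz + vert)

# A 4x4 image with two colours: S(y) counts concordant neighbour pairs,
# while Z_theta would require summing over all 2^16 configurations.
y = rng.integers(0, 2, size=(4, 4))
print(potts_stat(y))
```

This asymmetry — easy statistic, impossible normalising constant — is what makes Gibbs random fields a natural target for ABC.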
29. ABC Methods for Bayesian Model Choice
Gibbs random fields
Neighbourhood relations
Setup
Choice to be made between M neighbourhood relations
i ∼_m i′ (0 ≤ m ≤ M − 1)
with
S_m(x) = Σ_{i ∼_m i′} I_{x_i = x_{i′}}
driven by the posterior probabilities of the models.
30. ABC Methods for Bayesian Model Choice
Gibbs random fields
Model index
Computational target:
P(M = m|x) ∝ ∫_{Θm} f_m(x|θ_m) π_m(θ_m) dθ_m × π(M = m)
If S(x) is a sufficient statistic for the joint parameters (M, θ_0 , . . . , θ_{M−1} ), then
P(M = m|x) = P(M = m|S(x)) .
32. ABC Methods for Bayesian Model Choice
Gibbs random fields
Sufficient statistics in Gibbs random fields
Each model m has its own sufficient statistic Sm (·) and
S(·) = (S0 (·), . . . , SM −1 (·)) is also (model-)sufficient.
Explanation: For Gibbs random fields,
x|M = m ∼ f_m(x|θ_m) = f_m¹(x|S(x)) f_m²(S(x)|θ_m)
= [1/n(S(x))] f_m²(S(x)|θ_m)
where
n(S(x)) = #{x̃ ∈ X : S(x̃) = S(x)}
⇒ S(x) is therefore also sufficient for the joint parameters
35. ABC Methods for Bayesian Model Choice
Gibbs random fields
Toy example
iid Bernoulli model versus two-state first-order Markov chain, i.e.
f_0(x|θ_0) = exp(θ_0 Σ_{i=1}^{n} I_{x_i=1}) / {1 + exp(θ_0)}^n ,
versus
f_1(x|θ_1) = (1/2) exp(θ_1 Σ_{i=2}^{n} I_{x_i=x_{i−1}}) / {1 + exp(θ_1)}^{n−1} ,
with priors θ_0 ∼ U(−5, 5) and θ_1 ∼ U(0, 6) (inspired by “phase transition” boundaries).
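Both likelihoods are available in closed form here, which is what makes this a useful toy benchmark. A direct transcription (the test sequence x is an arbitrary illustration):

```python
import numpy as np

def f0(x, theta0):
    """iid Bernoulli model: f0(x|theta0) = exp(theta0 * sum_i I{x_i = 1}) / (1 + exp(theta0))^n."""
    n, s0 = len(x), int(np.sum(x == 1))
    return np.exp(theta0 * s0) / (1 + np.exp(theta0)) ** n

def f1(x, theta1):
    """Two-state first-order Markov chain (uniform initial state):
    f1(x|theta1) = (1/2) exp(theta1 * sum_{i>=2} I{x_i = x_{i-1}}) / (1 + exp(theta1))^(n-1)."""
    n, s1 = len(x), int(np.sum(x[1:] == x[:-1]))
    return 0.5 * np.exp(theta1 * s1) / (1 + np.exp(theta1)) ** (n - 1)

# At theta = 0 both models reduce to fair coin flipping (given the uniform
# start), so both likelihoods equal 2^(-n) for any binary sequence of length n.
x = np.array([1, 1, 0, 1, 1, 1, 0, 0, 1, 1])
print(f0(x, 0.0), f1(x, 0.0))
```

Having exact likelihoods means the true Bayes factor can be computed and compared with its ABC approximation, as in the next slide.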
36. ABC Methods for Bayesian Model Choice
Gibbs random fields
Toy example (2)
^
BF01
−5 0 5
−40
−20
BF01
0
10
^
BF01
−10 −5 0 5 10
−40
−20
BF01
0
10
(left) Comparison of the true BF m0 /m1 (x0 ) with BF m0 /m1 (x0 )
(in logs) over 2, 000 simulations and 4.106 proposals from the
prior. (right) Same when using tolerance corresponding to the
1% quantile on the distances.
37. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Back to sufficiency
‘Sufficient statistics for individual models are unlikely to
be very informative for the model probability. This is
already well known and understood by the ABC-user
community.’
[Scott Sisson, Jan. 31, 2011, ’Og]
If η1 (x) is a sufficient statistic for model m = 1 and parameter θ1 , and η2 (x) a sufficient statistic for model m = 2 and parameter θ2 , then (η1 (x), η2 (x)) is not always sufficient for (m, θm )
⇒ Potential loss of information at the testing level
40. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Limiting behaviour of B12 (T → ∞)
ABC approximation
B̂12(y) = [Σ_{t=1}^{T} I_{m_t=1} I_{ρ{η(z_t),η(y)}≤ε}] / [Σ_{t=1}^{T} I_{m_t=2} I_{ρ{η(z_t),η(y)}≤ε}] ,
where the (m_t , z_t )’s are simulated from the (joint) prior
As T goes to infinity, the limit is
B12^ε(y) = ∫ I_{ρ{η(z),η(y)}≤ε} π1(θ1) f1(z|θ1) dz dθ1 / ∫ I_{ρ{η(z),η(y)}≤ε} π2(θ2) f2(z|θ2) dz dθ2
= ∫ I_{ρ{η,η(y)}≤ε} π1(θ1) f1^η(η|θ1) dη dθ1 / ∫ I_{ρ{η,η(y)}≤ε} π2(θ2) f2^η(η|θ2) dη dθ2 ,
where f1^η(η|θ1) and f2^η(η|θ2) are the distributions of η(z)
42. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Limiting behaviour of B12 (ε → 0)
When ε goes to zero,
B12^η(y) = ∫ π1(θ1) f1^η(η(y)|θ1) dθ1 / ∫ π2(θ2) f2^η(η(y)|θ2) dθ2
Bayes factor based on the sole observation of η(y)
44. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Limiting behaviour of B12 (under sufficiency)
If η(y) is a sufficient statistic in both models,
f_i(y|θ_i) = g_i(y) f_i^η(η(y)|θ_i)
Thus
B12(y) = ∫_{Θ1} π(θ1) g1(y) f1^η(η(y)|θ1) dθ1 / ∫_{Θ2} π(θ2) g2(y) f2^η(η(y)|θ2) dθ2
= [g1(y)/g2(y)] · ∫ π1(θ1) f1^η(η(y)|θ1) dθ1 / ∫ π2(θ2) f2^η(η(y)|θ2) dθ2
= [g1(y)/g2(y)] · B12^η(y) .
[Didelot, Everitt, Johansen & Lawson, 2011]
⇒ No discrepancy only in the case of cross-model sufficiency
46. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Poisson/geometric example
Sample
x = (x1 , . . . , xn )
from either a Poisson P(λ) or from a geometric G(p) distribution.
The sum
S = Σ_{i=1}^{n} xi = η(x)
is a sufficient statistic for either model, but not simultaneously.
Discrepancy ratio
g1(x)/g2(x) = [S! n^{−S} / Π_i xi!] / [1 / (n+S−1 choose S)]
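The discrepancy ratio can be evaluated directly; a small sketch (the sample x is an arbitrary illustration):

```python
import math
import numpy as np

def discrepancy_ratio(x):
    """g1(x)/g2(x) for the Poisson (model 1) versus geometric (model 2) pair,
    with S = sum_i x_i the shared summary statistic and n = len(x)."""
    n, S = len(x), int(np.sum(x))
    # g1(x) = S! n^(-S) / prod_i x_i!   (Poisson conditional on S)
    g1 = math.factorial(S) / (n ** S * math.prod(math.factorial(xi) for xi in x))
    # g2(x) = 1 / C(n+S-1, S)           (geometric conditional on S)
    g2 = 1 / math.comb(n + S - 1, S)
    return g1 / g2

x = [2, 0, 1, 3, 1]
print(discrepancy_ratio(x))
```

The ratio depends on x beyond S, confirming that S-based ABC Bayes factors and full-data Bayes factors need not agree.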
47. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Poisson/geometric discrepancy
Range of B12(x) versus B12^η(x): the values produced have nothing in common.
48. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Formal recovery
Creating an encompassing exponential family
f (x|θ1 , θ2 , α1 , α2 ) ∝ exp{θ1ᵀ η1 (x) + θ2ᵀ η2 (x) + α1 t1 (x) + α2 t2 (x)}
leads to a sufficient statistic (η1 (x), η2 (x), t1 (x), t2 (x))
[Didelot, Everitt, Johansen & Lawson, 2011]
In the Poisson/geometric case, if Π_i xi! is added to S, there is no discrepancy
Only applies in genuine sufficiency settings...
⇒ Inability to evaluate the loss brought by summary statistics
51. ABC Methods for Bayesian Model Choice
Generic ABC model choice
The Pitman–Koopman lemma
Efficient sufficiency is not such a common occurrence:
Lemma
A necessary and sufficient condition for the existence of a sufficient
statistic with fixed dimension whatever the sample size is that the
sampling distribution belongs to an exponential family.
[Pitman, 1933; Koopman, 1933]
Provision of fixed support (consider U(0, θ) counterexample)
53. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Meaning of the ABC-Bayes factor
‘This is also why focus on model discrimination typically
(...) proceeds by (...) accepting that the Bayes Factor
that one obtains is only derived from the summary
statistics and may in no way correspond to that of the
full model.’
[Scott Sisson, Jan. 31, 2011, ’Og]
In the Poisson/geometric case, if E[yi ] = θ0 > 0,
lim_{n→∞} B12^η(y) = [(θ0 + 1)² / θ0] e^{−θ0}
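This limit is a fixed positive constant for each θ0, so the η-based Bayes factor converges to a constant rather than accumulating evidence with n. A quick numerical evaluation:

```python
import numpy as np

def limiting_bf(theta0):
    """Large-n limit of B12^eta(y) in the Poisson/geometric case
    when E[y_i] = theta0 > 0: (theta0 + 1)^2 / theta0 * exp(-theta0)."""
    return (theta0 + 1) ** 2 / theta0 * np.exp(-theta0)

# The limit stays bounded away from 0 and infinity for moderate theta0,
# so eta = S alone cannot consistently separate the two models.
for t in (0.5, 1.0, 2.0, 5.0):
    print(t, limiting_bf(t))
```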
55. ABC Methods for Bayesian Model Choice
Generic ABC model choice
MA example
[Barplots of model visit frequencies]
Evolution [against ε] of the ABC Bayes factor, in terms of frequencies of visits to models MA(1) (left) and MA(2) (right), when ε equals the 10, 1, .1, .01% quantiles on insufficient autocovariance distances. Sample of 50 points from an MA(2) model with θ1 = 0.6, θ2 = 0.2. True Bayes factor equal to 17.71.
56. ABC Methods for Bayesian Model Choice
Generic ABC model choice
MA example
[Barplots of model visit frequencies]
Evolution [against ε] of the ABC Bayes factor, in terms of frequencies of visits to models MA(1) (left) and MA(2) (right), when ε equals the 10, 1, .1, .01% quantiles on insufficient autocovariance distances. Sample of 50 points from an MA(1) model with θ1 = 0.6. True Bayes factor B21 equal to .004.
57. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Further comments
‘There should be the possibility that for the same model, but different (non-minimal) [summary] statistics (so different η’s: η1 and η1∗ ) the ratio of evidences may no longer be equal to one.’
[Michael Stumpf, Jan. 28, 2011, ’Og]
Using different summary statistics [on different models] may indicate the loss of information brought by each set, but agreement does not lead to trustworthy approximations.
58. ABC Methods for Bayesian Model Choice
Generic ABC model choice
A population genetics evaluation
Population genetics example with
3 populations
2 scenarios
15 individuals
5 loci
single mutation parameter
24 summary statistics
2 million ABC proposals
importance [tree] sampling alternative
65. ABC Methods for Bayesian Model Choice
Generic ABC model choice
A second population genetics experiment
three populations, with two diverging 100 gen. ago
two scenarios [third pop. either resulting from a recent admixture between the first two pops, or diverging from pop. 1 five gen. ago]
In scenario 1, admixture rate 0.7 from pop. 1
100 datasets with 100 diploid individuals per population, 50 independent microsatellite loci
Effective population size of 1000 and mutation rates of 0.0005
6 parameters: admixture rate (U[0.1, 0.9]), three effective population sizes (U[200, 2000]), the time of admixture/second divergence (U[1, 10]) and the time of first divergence (U[50, 500])
66. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Results
IS algorithm performed with 100 coalescent trees per particle and 50,000 particles [12 calendar days using 376 processors]
Using ten times as many loci and seven times as many individuals degrades the confidence in the importance sampling approximation because of an increased variability in the likelihood.
[Scatterplot of importance sampling estimates of P(M = 1|y) with 1,000 particles]
67. ABC Methods for Bayesian Model Choice
Generic ABC model choice
Results
This blurs the potential divergence between ABC and genuine posterior probabilities, because both are overwhelmingly close to one due to the high information content of the data.
[Scatterplot of importance sampling estimates of P(M = 1|y) with 50,000 particles against ABC estimates of P(M = 1|y)]
68. ABC Methods for Bayesian Model Choice
Generic ABC model choice
The only safe cases???
Besides specific models like Gibbs random fields,
using distances over the data itself escapes the discrepancy...
[Toni & Stumpf, 2010; Sousa et al., 2009]
...and so does the use of more informal model fitting measures
[Ratmann, Andrieu, Richardson and Wiuf, 2009]