Slides bayes-london-2014
Transcript

  • 1. Arthur CHARPENTIER - Bayesian Techniques & Actuarial Sciences
    Getting into Bayesian Wizardry... (with the eyes of a muggle actuary)
    Arthur Charpentier, charpentier.arthur@uqam.ca, http://freakonometrics.hypotheses.org/
    R in Insurance, London, July 2014
    Professor of Actuarial Sciences, Mathematics Department, UQàM (previously Economics Department, Univ. Rennes 1 & ENSAE; actuary in Hong Kong, IT & Stats FFSA)
    PhD in Statistics (KU Leuven), Fellow of the Institute of Actuaries
    MSc in Financial Mathematics (Paris Dauphine) & ENSAE
    Editor of the freakonometrics.hypotheses.org blog
    Editor of the forthcoming Computational Actuarial Science, CRC
  • 2. Getting into Bayesian Wizardry... (with the eyes of a frequentist actuary)
    Arthur Charpentier, charpentier.arthur@uqam.ca, http://freakonometrics.hypotheses.org/
    R in Insurance, London, July 2014
    Professor of Actuarial Sciences, Mathematics Department, UQàM (previously Economics Department, Univ. Rennes 1 & ENSAE Paristech; actuary in Hong Kong, IT & Stats FFSA)
    PhD in Statistics (KU Leuven), Fellow of the Institute of Actuaries
    MSc in Financial Mathematics (Paris Dauphine) & ENSAE
    Editor of the freakonometrics.hypotheses.org blog
    Editor of the forthcoming Computational Actuarial Science, CRC
  • 3. "it's time to adopt modern Bayesian data analysis as standard procedure in our scientific practice and in our educational curriculum. Three reasons:
    1. Scientific disciplines from astronomy to zoology are moving to Bayesian analysis. We should be leaders of the move, not followers.
    2. Modern Bayesian methods provide richer information, with greater flexibility and broader applicability than 20th century methods. Bayesian methods are intellectually coherent and intuitive. Bayesian analyses are readily computed with modern software and hardware.
    3. Null-hypothesis significance testing (NHST), with its reliance on p values, has many problems. There is little reason to persist with NHST now that Bayesian methods are accessible to everyone.
    My conclusion from those points is that we should do whatever we can to encourage the move to Bayesian data analysis." John Kruschke (quoted in Meyers & Guszcza (2013))
  • 4. Bayes vs. Frequentist, inference on heads/tails
    Consider some Bernoulli sample x = {x_1, x_2, ..., x_n}, where x_i ∈ {0, 1}. The X_i's are i.i.d. B(p) variables, with
    f_X(x) = p^x [1 - p]^{1-x}, \quad x \in \{0, 1\}.
    Standard frequentist approach:
    \hat{p} = \frac{1}{n}\sum_{i=1}^{n} x_i = \underset{p \in (0,1)}{\operatorname{argmax}} \underbrace{\prod_{i=1}^{n} f_X(x_i)}_{\mathcal{L}(p;\,x)}
    From the central limit theorem,
    \sqrt{n}\,\frac{\hat{p} - p}{\sqrt{p(1-p)}} \xrightarrow{\mathcal{L}} \mathcal{N}(0, 1) \text{ as } n \to \infty,
    we can derive an approximated 95% confidence interval
    \hat{p} \pm \frac{1.96}{\sqrt{n}} \sqrt{\hat{p}(1 - \hat{p})}.
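As a quick numerical illustration (a Python sketch of my own, not from the slides), the estimator and interval above can be computed for the claim counts that appear on the next slide:

```python
import math

def bernoulli_mle_ci(successes, n, z=1.96):
    """MLE p-hat for i.i.d. B(p) data and the CLT-based confidence interval."""
    p_hat = successes / n
    half_width = z * math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat, (p_hat - half_width, p_hat + half_width)

# 159 claims out of 1,047 contracts
p_hat, (lo, hi) = bernoulli_mle_ci(159, 1047)
```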
  • 5. Bayes vs. Frequentist, inference on heads/tails
    Example: out of 1,047 contracts, 159 claimed a loss.
    [Figure: probability of the number of insured claiming a loss: the (true) Binomial distribution, with its Poisson and Gaussian approximations]
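The three curves compared on this slide can be checked numerically at the observed count; a Python sketch of my own (log-gamma is used to keep the Binomial pmf numerically stable):

```python
import math

n, k = 1047, 159
p = k / n  # binomial parameter estimated from the data

# exact Binomial(n, p) pmf at k, computed on the log scale
log_binom = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
             + k * math.log(p) + (n - k) * math.log(1 - p))
binom_pmf = math.exp(log_binom)

# Poisson approximation with lambda = np
lam = n * p
poisson_pmf = math.exp(k * math.log(lam) - lam - math.lgamma(k + 1))

# Gaussian approximation with mean np and variance np(1-p)
var = n * p * (1 - p)
gauss_pdf = math.exp(-(k - lam) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)
```

The Gaussian curve tracks the Binomial closely here, while the Poisson one (whose variance λ exceeds np(1−p)) is slightly flatter, which is what the slide's figure shows.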
  • 6. Small Data and Black Swans
    Example [Operational risk]: What if our sample is x = {0, 0, 0, 0, 0}? How would we derive a confidence interval for p?
    "INA's chief executive officer, dressed as Santa Claus, asked an unthinkable question: Could anyone predict the probability of two planes colliding in midair? Santa was asking his chief actuary, L. H. Longley-Cook, to make a prediction based on no experience at all. There had never been a serious midair collision of commercial planes. Without any past experience or repetitive experimentation, any orthodox statistician had to answer Santa's question with a resounding no."
  • 7. Bayes, the theory that would not die
    Liu et al. (1996) claim that "Statistical methods with a Bayesian flavor [...] have long been used in the insurance industry".
    From the history of Bayesian statistics, The Theory That Would Not Die by Sharon Bertsch McGrayne:
    "[Arthur] Bailey spent his first year in New York [in 1918] trying to prove to himself that 'all of the fancy actuarial [Bayesian] procedures of the casualty business were mathematically unsound.' After a year of intense mental struggle, however, he realized to his consternation that actuarial sledgehammering worked" [...]
  • 8. Bayes, the theory that would not die
    [...] "He even preferred it to the elegance of frequentism. He positively liked formulae that described 'actual data . . . I realized that the hard-shelled underwriters were recognizing certain facts of life neglected by the statistical theorists.' He wanted to give more weight to a large volume of data than to the frequentists' small sample; doing so felt surprisingly 'logical and reasonable'. He concluded that only a 'suicidal' actuary would use Fisher's method of maximum likelihood, which assigned a zero probability to nonevents. Since many businesses file no insurance claims at all, Fisher's method would produce premiums too low to cover future losses."
  • 9. Bayes's theorem
    Consider some hypothesis H and some evidence E; then
    P_E(H) = P(H|E) = \frac{P(H \cap E)}{P(E)} = \frac{P(H) \cdot P(E|H)}{P(E)}.
    Bayes' rule contrasts the prior probability P(H) with the posterior probability after receiving evidence E, P_E(H) = P(H|E).
    In Bayesian (parametric) statistics, H = {θ ∈ Θ} and E = {X = x}, and Bayes' theorem reads
    \pi(\theta|x) = \frac{\pi(\theta) \cdot f(x|\theta)}{f(x)} = \frac{\pi(\theta) \cdot f(x|\theta)}{\int f(x|\theta)\pi(\theta)\,d\theta} \propto \pi(\theta) \cdot f(x|\theta).
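As a minimal illustration of the prior-to-posterior update (the two hypotheses and all numbers below are hypothetical, not from the slides):

```python
# prior beliefs over two hypothetical hypotheses about a coin
prior = {"fair": 0.5, "biased": 0.5}
# likelihood of the evidence E = "heads" under each hypothesis
likelihood = {"fair": 0.5, "biased": 0.9}

# Bayes' theorem: posterior is prior times likelihood, normalised by P(E)
evidence = sum(prior[h] * likelihood[h] for h in prior)
posterior = {h: prior[h] * likelihood[h] / evidence for h in prior}
```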
  • 10. Small Data and Black Swans
    Consider the sample x = {0, 0, 0, 0, 0}. Here the likelihood is
    f(x_i|\theta) = \theta^{x_i}[1 - \theta]^{1 - x_i}, \qquad f(x|\theta) = \theta^{x^T 1}[1 - \theta]^{n - x^T 1},
    and we need an a priori distribution π(·), e.g. a beta distribution
    \pi(\theta) = \frac{\theta^{\alpha - 1}[1 - \theta]^{\beta - 1}}{B(\alpha, \beta)},
    so that the posterior is again a beta distribution,
    \pi(\theta|x) = \frac{\theta^{\alpha + x^T 1 - 1}[1 - \theta]^{\beta + n - x^T 1 - 1}}{B(\alpha + x^T 1, \beta + n - x^T 1)}.
    [Figure: prior and posterior densities of θ on (0, 1)]
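The conjugate update is just parameter arithmetic; a short Python sketch (assuming a uniform Beta(1, 1) prior, which the slide does not fix) shows the point of the black-swan example, that the posterior mean stays strictly positive even for the all-zero sample:

```python
def beta_binomial_update(alpha, beta, sample):
    """Conjugate update: Beta(alpha, beta) prior + Bernoulli sample -> Beta posterior."""
    s = sum(sample)                          # x'1, the number of ones
    return alpha + s, beta + len(sample) - s

# the all-zero sample from the slide, with a hypothetical uniform Beta(1, 1) prior
a_post, b_post = beta_binomial_update(1, 1, [0, 0, 0, 0, 0])
posterior_mean = a_post / (a_post + b_post)  # nonzero despite zero observed events
```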
  • 11. On Bayesian Philosophy, Confidence vs. Credibility
    For frequentists, a probability is a measure of the frequency of repeated events → parameters are fixed (but unknown), and data are random.
    For Bayesians, a probability is a measure of the degree of certainty about values → parameters are random and data are fixed.
    "Bayesians: Given our observed data, there is a 95% probability that the true value of θ falls within the credible region. vs. Frequentists: There is a 95% probability that when I compute a confidence interval from data of this sort, the true value of θ will fall within it." Vanderplas (2014)
    Example: see Jaynes (1976), e.g. the truncated exponential.
  • 12. On Bayesian Philosophy, Confidence vs. Credibility
    Example: What is a 95% confidence interval of a proportion? Here x = 159 and n = 1047.
    1. draw sets (x̃_1, ..., x̃_n)_k with X_i ∼ B(x/n)
    2. compute a confidence interval for each set of values
    3. determine the fraction of these confidence intervals that contain x
    → the parameter is fixed, and we guarantee that 95% of the confidence intervals will contain it.
    [Figure: simulated confidence intervals for the claim count, plotted against the range 140 to 200]
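The three numbered steps above can be sketched as a small pure-Python simulation (the seed, number of replications, and the CLT-based interval are my own choices):

```python
import math, random

random.seed(42)
n, x = 1047, 159
p0 = x / n                   # treat the observed proportion as the fixed truth

def interval_contains(k_sim, n, target_count):
    """CLT-based 95% CI from one simulated sample; does it cover the target?"""
    p_hat = k_sim / n
    half = 1.96 * math.sqrt(p_hat * (1 - p_hat) / n)
    return (p_hat - half) * n <= target_count <= (p_hat + half) * n

replications = 200
hits = 0
for _ in range(replications):
    # one set of n Bernoulli draws with parameter x/n
    k_sim = sum(random.random() < p0 for _ in range(n))
    hits += interval_contains(k_sim, n, x)
coverage = hits / replications   # should be close to 0.95
```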
  • 13. On Bayesian Philosophy, Confidence vs. Credibility
    Example: What is a 95% credible region of a proportion? Here x = 159 and n = 1047.
    1. draw random parameters p_k from the posterior distribution π(·|x)
    2. sample sets (x̃_1, ..., x̃_n)_k with X_{i,k} ∼ B(p_k)
    3. compute the mean x̄_k for each set of values
    4. look at the proportion of those x̄_k that fall within the credible region [Π^{-1}(0.025|x); Π^{-1}(0.975|x)]
    → the credible region is fixed, and we guarantee that 95% of possible values of x̄ will fall within it.
    [Figure: simulated means against the range 140 to 200]
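The Bayesian side of the comparison can be sketched with the standard library's `random.betavariate`, drawing from the Beta posterior (a uniform prior, the seed, and the number of draws are my own assumptions):

```python
import random

random.seed(123)
n, x = 1047, 159
a_post, b_post = 1 + x, 1 + n - x   # Beta posterior under a uniform Beta(1,1) prior

# step 1: draw parameters p_k from the posterior
ps = sorted(random.betavariate(a_post, b_post) for _ in range(2000))

# empirical 95% credible region from the posterior draws
lower = ps[int(0.025 * len(ps))]
upper = ps[int(0.975 * len(ps)) - 1]
```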
  • 14. Difficult concepts? Difficult computations?
    We have a sample x = {x_1, ..., x_d} i.i.d. from distribution f_θ(·). In predictive modeling, we need
    E(g(X)|\boldsymbol{x}) = \int g(x) f_{\theta|\boldsymbol{x}}(x)\,dx, \text{ where } f_{\theta|\boldsymbol{x}}(x) = f(x|\boldsymbol{x}) = \int f(x|\theta) \cdot \pi(\theta|\boldsymbol{x})\,d\theta.
    How can we derive π(θ|x)? Can we sample from π(θ|x) (and use Monte Carlo techniques to approximate the integral)?
    Computations were not that simple... until the 1990s: MCMC.
  • 15. Markov Chain
    A stochastic process (X_t)_{t∈N} on some discrete space Ω is a Markov chain if
    P(X_{t+1} = y | X_t = x, X_{t-1} = x_{t-1}, ...) = P(X_{t+1} = y | X_t = x) = P(x, y),
    where P is a transition probability, which can be stored in a transition matrix P = [P_{x,y}] = [P(x, y)].
    Observe that P(X_{t+k} = y | X_t = x) = P^k(x, y), where P^k = [P^k(x, y)].
    Under some conditions, lim_{n→∞} P^n = Λ = [λ^T].
    Problem: given a distribution λ, is it possible to generate a Markov chain that converges to this distribution?
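The convergence of P^n can be seen on a toy two-state chain (the numbers below are my own illustration): every row of P^n approaches the stationary distribution λ, which solves λ = λ × P.

```python
def matmul(A, B):
    """Multiply two small square matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# toy transition matrix on a two-state space
P = [[0.9, 0.1],
     [0.5, 0.5]]

Pn = P
for _ in range(50):      # compute P^51, far enough for convergence here
    Pn = matmul(Pn, P)

# both rows of Pn converge to the stationary distribution lambda = (5/6, 1/6)
```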
  • 16. Bonus Malus and Markov Chains
    Example: no-claim bonus, see Lemaire (1995). Assume that the number of claims is N ∼ P(0.10536), so that P(N = 0) = 90%.
    [Figure: transition diagram between bonus-malus classes 1 to 6 (state at t+1 vs. state at t), with transition probabilities 0.8, 0.178, 0.199 and 0.021]
  • 17. Hastings-Metropolis
    Back to our problem: we want to sample from π(θ|x), i.e. generate θ_1, ..., θ_n, ... from π(θ|x). The Hastings-Metropolis sampler generates a Markov chain (θ_t) as follows:
    • generate θ_1
    • at step t, generate a candidate θ* and U ∼ U([0, 1]), and compute
    R = \frac{\pi(\theta^*|x)}{\pi(\theta_t|x)} \cdot \frac{P(\theta_t|\theta^*)}{P(\theta^*|\theta_t)}
    • if U < R, set θ_{t+1} = θ*; if U ≥ R, set θ_{t+1} = θ_t.
    R is the acceptance ratio: we accept the new state θ* with probability min{1, R}.
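The slides implement this sampler in R on the following pages; as a language-neutral sketch of the same algorithm, here is a minimal Python version targeting a Beta(2, 5) density (the target, proposal width, chain length and seed are my own choices; with a symmetric random-walk proposal, the proposal ratio in R cancels):

```python
import random

random.seed(1)

def target(theta):
    """Unnormalised Beta(2, 5) density; zero outside (0, 1)."""
    if not 0 < theta < 1:
        return 0.0
    return theta * (1 - theta) ** 4

theta, chain = 0.5, []
for _ in range(20000):
    proposal = theta + random.uniform(-0.2, 0.2)  # symmetric random walk
    r = target(proposal) / target(theta)          # acceptance ratio R
    if random.random() < min(1.0, r):
        theta = proposal                          # accept the new state
    chain.append(theta)                           # otherwise keep the old one

mean_estimate = sum(chain[5000:]) / len(chain[5000:])  # discard burn-in
# mean_estimate should be close to E[theta] = 2/7 for a Beta(2, 5) target
```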
  • 18. Hastings-Metropolis
    Observe that
    R = \frac{\pi(\theta^*) \cdot f(x|\theta^*)}{\pi(\theta_t) \cdot f(x|\theta_t)} \cdot \frac{P(\theta_t|\theta^*)}{P(\theta^*|\theta_t)}.
    In a more general case, we can have a Markov process, not a Markov chain, e.g. θ*|θ_t ∼ N(θ_t, 1).
  • 19. Using MCMC to generate Gaussian values
    > metrop1 <- function(n=1000,eps=0.5){
    +   vec <- vector("numeric", n)
    +   x <- 0
    +   vec[1] <- x
    +   for (i in 2:n) {
    +     innov <- runif(1,-eps,eps)
    +     mov <- x+innov
    +     aprob <- min(1,dnorm(mov)/dnorm(x))
    +     u <- runif(1)
    +     if (u < aprob)
    +       x <- mov
    +     vec[i] <- x
    +   }
    +   return(vec)}
    [Figure: trace of the simulated chain, roughly within (-2, 2)]
  • 20. Using MCMC to generate Gaussian values
    > plot.mcmc <- function(mcmc.out) {
    +   op <- par(mfrow=c(2,2))
    +   plot(ts(mcmc.out),col="red")
    +   hist(mcmc.out,30,probability=TRUE,
    +        col="light blue")
    +   lines(seq(-4,4,by=.01),dnorm(seq(-4,4,
    +        by=.01)),col="red")
    +   qqnorm(mcmc.out)
    +   abline(a=mean(mcmc.out),b=sd(mcmc.out))
    +   acf(mcmc.out,col="blue",lag.max=100)
    +   par(op) }
    > metrop.out <- metrop1(10000,1)
    > plot.mcmc(metrop.out)
    [Figure: 2x2 diagnostics of metrop.out: trace plot, histogram with the N(0,1) density, normal Q-Q plot, and ACF up to lag 100]
  • 21. Heuristics on Hastings-Metropolis
    In standard Monte Carlo, we generate θ_i's i.i.d.; then
    \frac{1}{n}\sum_{i=1}^{n} g(\theta_i) \to E[g(\theta)] = \int g(\theta)\pi(\theta)\,d\theta
    (strong law of large numbers). Well-behaved Markov chains (P aperiodic, irreducible, positive recurrent) satisfy an ergodic property similar to this LLN. More precisely,
    • P has a unique stationary distribution λ, i.e. λ = λ × P;
    • ergodic theorem:
    \frac{1}{n}\sum_{i=1}^{n} g(\theta_i) \to \int g(\theta)\lambda(\theta)\,d\theta,
    even if the θ_i's are not independent.
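The plain i.i.d. Monte Carlo half of this comparison, as a minimal Python sketch (the distribution, sample size, and seed are illustrative choices of mine):

```python
import random

random.seed(7)

# i.i.d. Monte Carlo: estimate E[g(theta)] for theta ~ Beta(2, 5), with g(t) = t
draws = [random.betavariate(2, 5) for _ in range(50000)]
mc_estimate = sum(draws) / len(draws)
# by the strong law of large numbers, mc_estimate converges to E[theta] = 2/7
```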
  • 22. Heuristics on Hastings-Metropolis
    Remark: the conditions mentioned above are
    • aperiodic: the chain does not return to any given state only at multiples of some period k;
    • irreducible: the chain can go from any state to any other state in some finite number of steps;
    • positive recurrent: the chain returns to any particular state with probability 1, and with finite expected return time.
  • 23. MCMC and Loss Models
    Example. A Tweedie model, E(X) = µ and Var(X) = φ · µ^p. Here assume that φ and p are given, and µ is the unknown parameter → we need a predictive distribution for µ given x.
    Consider the following transition kernel (a Gamma distribution)
    $\mu \mid \mu_t \sim \mathcal{G}\left(\frac{\mu_t}{\alpha}, \alpha\right)$ with $\mathbb{E}(\mu \mid \mu_t) = \mu_t$ and $CV(\mu) = \frac{1}{\sqrt{\alpha}}$.
    Use some a priori distribution, e.g. $\mathcal{G}(\alpha_0, \beta_0)$.
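A quick sanity check of the stated kernel moments (a sketch, not from the slides; note that Gamma parametrization conventions vary — in R's shape/rate terms, a kernel with mean µt and CV 1/√α has shape α and rate α/µt):

```r
# Gamma kernel with E(mu | mu_t) = mu_t and CV(mu) = 1/sqrt(alpha):
# in R's shape/rate parametrization, shape = alpha, rate = alpha / mu_t.
set.seed(1)
mut <- 10; alpha <- 4
draws <- rgamma(1e5, shape = alpha, rate = alpha / mut)
mean(draws)             # close to mu_t = 10
sd(draws) / mean(draws) # close to 1/sqrt(alpha) = 0.5
```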
  • 24. MCMC and Loss Models
    • generate µ1
    • at step t: generate $\mu^\star \sim \mathcal{G}(\alpha^{-1}\mu_t, \alpha)$ and $U \sim \mathcal{U}([0,1])$, compute
    $R = \frac{\pi(\mu^\star)\, f(x \mid \mu^\star)}{\pi(\mu_t)\, f(x \mid \mu_t)} \cdot \frac{P_\alpha(\mu_t \mid \mu^\star)}{P_\alpha(\mu^\star \mid \mu_t)}$
    if U < R set µt+1 = µ⋆; if U ≥ R set µt+1 = µt,
    where $f(x \mid \mu) = L(\mu) = \prod_{i=1}^{n} f(x_i \mid \mu, p, \varphi)$, $f(\,\cdot \mid \mu, p, \varphi)$ being the density of the Tweedie distribution, dtweedie(x, p, mu, phi) from library(tweedie).
  • 25.
    > library(tweedie)
    > p <- 2 ; phi <- 2/5
    > set.seed(1) ; X <- rtweedie(50, p, 10, phi)
    > metrop2 <- function(n=10000, a0=10,
    +                     b0=1, alpha=1){
    +   vec <- vector("numeric", n)
    +   vec[1] <- rgamma(1, a0, b0)
    +   for (i in 2:n) {
    +     mustar <- rgamma(1, vec[i-1]/alpha, alpha)
    +     R <- prod(dtweedie(X, p, mustar, phi) /
    +               dtweedie(X, p, vec[i-1], phi)) *
    +          dgamma(mustar, a0, b0) /
    +          dgamma(vec[i-1], a0, b0) *
    +          dgamma(vec[i-1], mustar/alpha, alpha) /
    +          dgamma(mustar, vec[i-1]/alpha, alpha)
    +     aprob <- min(1, R)
    +     u <- runif(1)
    +     vec[i] <- ifelse(u < aprob, mustar, vec[i-1])
    +   }
    +   return(vec)}
    > mcmc.out <- metrop2(10000, alpha=1)

[Figure: trace plot ts(mcmc.out) and histogram of mcmc.out with density.]
[Figure: Normal Q–Q plot of the sample (Theoretical vs. Sample Quantiles) and ACF of series mcmc.out.]
  • 26. Gibbs Sampler
    For a multivariate problem, it is possible to use the Gibbs sampler.
    Example. Assume that the loss ratio of a company has a lognormal distribution, LN(µ, σ²), e.g.
    > LR <- c(0.958, 0.614, 0.977, 0.921, 0.756)
    Example. Assume that we have a sample x from a N(µ, σ²). We want the posterior distribution of θ = (µ, σ²) given x. Observe here that if the priors are Gaussian $\mathcal{N}(\mu_0, \tau^2)$ and inverse Gamma $IG(a, b)$, then
    $\mu \mid \sigma^2, x \sim \mathcal{N}\left(\frac{\sigma^2}{\sigma^2 + n\tau^2}\,\mu_0 + \frac{n\tau^2}{\sigma^2 + n\tau^2}\,\bar{x},\; \frac{\sigma^2\tau^2}{\sigma^2 + n\tau^2}\right)$
    $\sigma^2 \mid \mu, x \sim IG\left(\frac{n}{2} + a,\; \frac{1}{2}\sum_{i=1}^{n}[x_i - \mu]^2 + b\right)$
    More generally, we need the conditional distribution of θk|θ−k, x, for all k.
    > x <- log(LR)
  • 27. Gibbs Sampler
    > xbar <- mean(x)
    > n <- length(x)
    > mu <- sigma2 <- rep(0,10000)
    > sigma2[1] <- 1/rgamma(1,shape=1,rate=1)
    > Z <- sigma2[1]/(sigma2[1]+n*1)
    > mu[1] <- rnorm(1,mean=Z*0+(1-Z)*xbar,
    +                sd=sqrt(1*Z))
    > for (i in 2:10000){
    +   Z <- sigma2[i-1]/(sigma2[i-1]+n*1)
    +   mu[i] <- rnorm(1,mean=Z*0+(1-Z)*xbar,
    +                  sd=sqrt(1*Z))
    +   sigma2[i] <- 1/rgamma(1,shape=n/2+1,
    +                rate=(1/2)*(sum((x-mu[i])^2))+1)
    + }

[Figure: trace plot ts(mcmc.out) and histogram of mcmc.out with density.]
q q q q q q q q q q q q q q q q qq q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q qq q q q q q q q q q q q q q q q q q qq q q q q q q q q q q q q q −4 −2 0 2 4 −0.188−0.186−0.184−0.182−0.180−0.178 Normal Q−Q Plot Theoretical Quantiles SampleQuantiles 0 20 40 60 80 100 0.00.20.40.60.81.0 Lag ACF Series mcmc.out
  • 28. Arthur CHARPENTIER - Bayesian Techniques & Actuarial Sciences — Gibbs Sampler. Example: consider a vector X = (X_1, · · · , X_d) with independent components, X_i ∼ E(λ_i). We want to sample from X given X^T 1 > s, for some threshold s > 0:
• start with some point x_0 such that x_0^T 1 > s
• pick (randomly) i ∈ {1, · · · , d}; by the memoryless property, X_i given X_i > s − x_{(−i)}^T 1 has a shifted Exponential E(λ_i) distribution: draw y ∼ E(λ_i) and set x_i = y + (s − x_{(−i)}^T 1)_+, which guarantees x_{(−i)}^T 1 + x_i > s
• repeat.
E.g. losses and allocated expenses. 28
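The update step above relies on the memoryless property of the exponential distribution: X − s given X > s is again E(λ). A quick numerical check of that identity (the values of λ, s and the grid t are illustrative, not from the slides):

```r
# Memoryless property behind the Gibbs update: P(X > s + t | X > s) = P(X > t)
# for X ~ E(lambda), which is why the conditional draw is just a shifted
# exponential, y + (s - x_{(-i)}^T 1)_+ with y ~ E(lambda).
lambda <- 2; s <- 1.5; t <- seq(0, 3, by = 0.5)
cond <- pexp(s + t, lambda, lower.tail = FALSE) /
        pexp(s, lambda, lower.tail = FALSE)       # P(X > s + t | X > s)
unshifted <- pexp(t, lambda, lower.tail = FALSE)  # P(X > t)
all.equal(cond, unshifted)                        # TRUE
```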
  • 29. Gibbs Sampler
> sim <- NULL
> lambda <- c(1,2)
> X <- c(3,3)
> s <- 5
> for(k in 1:1000){
+ i <- sample(1:2,1)
+ X[i] <- rexp(1,lambda[i])+
+ max(0,s-sum(X[-i]))
+ while(sum(X)<s){
+ X[i] <- rexp(1,lambda[i])+
+ max(0,s-sum(X[-i])) }
+ sim <- rbind(sim,X) }
29
[Figure: scatter plot of the simulated pairs (sim[,1], sim[,2]) on [0,10] × [0,5]]
  • 30. JAGS and STAN. Martyn Plummer developed JAGS (Just Another Gibbs Sampler) in 2007 (stable since 2013); it can be called from R via library(runjags). It is an open-source, enhanced, cross-platform version of the earlier BUGS engine (Bayesian inference Using Gibbs Sampling). Stan (library(rstan)) is a newer tool that uses the Hamiltonian Monte Carlo (HMC) sampler. HMC uses information about the derivative of the posterior probability density to improve the algorithm; these derivatives are obtained by automatic differentiation of the C++ code. 30
  • 31. JAGS on the N(µ, σ²) distribution
> library(runjags)
> jags.model <- "
+ model {
+ mu ~ dnorm(mu0, 1/(sigma0^2))
+ g ~ dgamma(k0, theta0)
+ sigma <- 1 / g
+ for (i in 1:n) {
+ logLR[i] ~ dnorm(mu, g^2)
+ }
+ }"
> jags.data <- list(n=length(LR),
+ logLR=log(LR), mu0=-.2, sigma0=0.02,
+ k0=1, theta0=1)
> jags.init <- list(list(mu=log(1.2),
+ g=1/0.5^2),
+ list(mu=log(.8),
+ g=1/.2^2))
> model.out <- autorun.jags(jags.model,
+ data=jags.data, inits=jags.init,
+ monitor=c("mu", "sigma"), n.chains=2)
> traceplot(model.out$mcmc)
> summary(model.out)
31
  • 32. STAN on the N(µ, σ²) distribution
> library(rstan)
> stan.model <- "
+ data {
+ int<lower=0> n;
+ vector[n] LR;
+ real mu0;
+ real<lower=0> sigma0;
+ real<lower=0> k0;
+ real<lower=0> theta0;
+ }
+ parameters {
+ real mu;
+ real<lower=0> sigma;
+ }
+ model {
+ mu ~ normal(mu0, sigma0);
+ sigma ~ inv_gamma(k0, theta0);
+ for (i in 1:n)
+ log(LR[i]) ~ normal(mu, sigma);
+ }"
> stan.data <- list(n=length(LR), LR=LR, mu0=mu0,
+ sigma0=sigma0, k0=k0, theta0=theta0)
> stan.out <- stan(model_code=stan.model,
+ data=stan.data, seed=2)
> traceplot(stan.out)
> print(stan.out, digits_summary=2)
32
  • 33. MCMC and Loss Models. Example: consider a simple time series of loss ratios, LR_t ∼ N(µ_t, σ²), where µ_t = φ µ_{t−1} + ε_t. E.g. in JAGS we can define the vector µ = (µ_1, · · · , µ_T) recursively:
+ model {
+ mu[1] ~ dnorm(mu0, 1/(sigma0^2))
+ for (t in 2:T) { mu[t] ~ dnorm(mu[t-1], 1/(sigma0^2)) }
+ }
33
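The generative side of that model can be sketched in a few lines of plain R — here with φ = 1 (a random walk on the mean, as in the JAGS snippet); the values of mu0, sigma0, sigma and T are purely illustrative:

```r
# Illustrative simulation of the loss-ratio model: mu_t ~ N(mu_{t-1}, sigma0^2)
# and LR_t ~ N(mu_t, sigma^2). All numeric values below are hypothetical.
set.seed(1)
T <- 20; mu0 <- 0.8; sigma0 <- 0.02; sigma <- 0.05
mu <- numeric(T)
mu[1] <- rnorm(1, mu0, sigma0)
for (t in 2:T) mu[t] <- rnorm(1, mu[t-1], sigma0)
LR <- rnorm(T, mu, sigma)  # one simulated path of loss ratios
```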
  • 34. MCMC and Claims Reserving. Consider the following (cumulated) triangle, {C_{i,j}}:

         0      1       2       3       4       5
  0   3209   4372    4411    4428    4435    4456
  1   3367   4659    4696    4720    4730    4752.4
  2   3871   5345    5398    5420    5430.1  5455.8
  3   4239   5917    6020    6046.1  6057.4  6086.1
  4   4929   6794    6871.7  6901.5  6914.3  6947.1
  5   5217   7204.3  7286.7  7318.3  7331.9  7366.7

  λ_j   1.3809   1.0114   1.0043   1.0018   1.0047
  σ_j   0.7248   0.3203   0.04587  0.02570  0.02570

(from Markus Gesmann's library(ChainLadder)). 34
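The λ_j row can be recovered directly from the observed (non-projected) part of the triangle, as the volume-weighted development factors — a minimal sketch, re-typing the upper triangle from the slide:

```r
# Observed part of the cumulative triangle (NA below the latest diagonal)
PAID <- matrix(c(3209, 4372, 4411, 4428, 4435, 4456,
                 3367, 4659, 4696, 4720, 4730,   NA,
                 3871, 5345, 5398, 5420,   NA,   NA,
                 4239, 5917, 6020,   NA,   NA,   NA,
                 4929, 6794,   NA,   NA,   NA,   NA,
                 5217,   NA,   NA,   NA,   NA,   NA),
               nrow = 6, byrow = TRUE)
# lambda_j = sum_i C[i,j+1] / sum_i C[i,j], over rows where both are observed
lambda <- sapply(1:5, function(j) {
  obs <- !is.na(PAID[, j + 1])
  sum(PAID[obs, j + 1]) / sum(PAID[obs, j])
})
round(lambda, 4)  # starts 1.3809, 1.0114, 1.0043, ...
```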
  • 35. A Bayesian version of Chain Ladder. Individual development factors λ_{i,j} = C_{i,j+1}/C_{i,j}:

            0          1          2          3          4
  0   1.362418   1.008920   1.003854   1.001581   1.004735
  1   1.383724   1.007942   1.005111   1.002119
  2   1.380780   1.009916   1.004076
  3   1.395848   1.017407
  4   1.378373

  λ_j   1.380900   1.011400   1.004300   1.001800   1.004700
  σ_j   0.724800   0.320300   0.0458700  0.0257000  0.0257000

Assume that λ_{i,j} ∼ N(µ_j, τ_j / C_{i,j}). We can use the Gibbs sampler to get the distribution of the transition factors, as well as a distribution for the reserves. 35
  • 36. A Bayesian version of Chain Ladder
> source("http://freakonometrics.free.fr/triangleCL.R")
> source("http://freakonometrics.free.fr/bayesCL.R")
> mcmcCL <- bayesian.triangle(PAID)
> plot.mcmc(mcmcCL$Lambda[,1])
> plot.mcmc(mcmcCL$Lambda[,2])
> plot.mcmc(mcmcCL$reserves[,6])
> plot.mcmc(mcmcCL$reserves[,7])
> library(ChainLadder)
> MCL <- MackChainLadder(PAID)
> m <- sum(MCL$FullTriangle[,6]-
+ diag(MCL$FullTriangle[,6:1]))
> stdev <- MCL$Total.Mack.S.E
> hist(mcmcCL$reserves[,7],probability=TRUE,
+ breaks=20,col="light blue")
> x <- seq(2000,3000,by=10)
> y <- dnorm(x,m,stdev)
> lines(x,y,col="red")
36
[Figure: trace plot, histogram, Normal Q–Q plot and ACF of the mcmc.out chain for the first transition factor]
  • 37. A Bayesian analysis of the Poisson Regression Model. In a Poisson regression model, we have a sample (x, y) = {(x_i, y_i)}, y_i ∼ P(µ_i) with log µ_i = β_0 + β_1 x_i. In the Bayesian framework, β_0 and β_1 are random variables. Example: use library(arm) (see also library(INLA)). The code is very simple: instead of
> reg <- glm(dist~speed,data=cars,family=poisson)
get used to
> regb <- bayesglm(dist~speed,data=cars,family=poisson)
37
  • 38. A Bayesian analysis of the Poisson Regression Model
> newd <- data.frame(speed=0:30)
> predreg <- predict(reg,newdata=
+ newd,type="response")
> plot(cars)
> lines(newd$speed,predreg,lwd=2)
> library(arm)
> beta01 <- coef(sim(regb))
> for(i in 1:100){
+ lines(newd$speed,exp(beta01[i,1]+
+ beta01[i,2]*newd$speed))}
> plot.mcmc(beta01[,1])
> plot.mcmc(beta01[,2])
38
[Figure: cars data (speed vs. dist) with the fitted curve and 100 posterior curves]
  • 39. Other alternatives to classical statistics. Consider a regression problem, µ(x) = E(Y|X = x), and assume that smoothing splines are used,
µ(x) = Σ_{j=1}^{k} β_j h_j(x).
Let H be the n × k matrix H = [h_j(x_i)]; then β̂ = (H^T H)^{−1} H^T y, and
se(µ̂(x)) = [h(x)^T (H^T H)^{−1} h(x)]^{1/2} σ̂.
With a Gaussian assumption on the residuals, we can derive (approximate) confidence bands for the predictions µ̂(x). 39
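The se() formula above is exactly what predict.lm computes; a quick sketch checking the two agree, using R's built-in cars data as a stand-in for the theft-insurance dataset used on the next slide:

```r
# Check: se(mu(x)) = [h(x)^T (H^T H)^{-1} h(x)]^{1/2} * sigma.hat matches
# predict(..., se.fit = TRUE) at the observed x_i.
library(splines)
reg <- lm(dist ~ bs(speed, df = 4), data = cars)
H <- model.matrix(reg)             # H = [h_j(x_i)], n x k (with intercept)
sigma.hat <- summary(reg)$sigma
# rowSums((H A) * H) computes diag(H A H^T) without forming the full matrix
se.manual <- sqrt(rowSums((H %*% solve(crossprod(H))) * H)) * sigma.hat
se.predict <- predict(reg, se.fit = TRUE)$se.fit
max(abs(se.manual - se.predict))   # essentially zero
```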
  • 40. Smoothed regression with splines
> dtf <- read.table(
+ "http://freakonometrics.free.fr/theftinsurance.txt",
+ sep=";", header=TRUE)
> names(dtf) <- c("x","y")
> new <- data.frame(x=seq(0,15,by=.1))
> library(splines)
> reg <- lm(y~bs(x,df=4),data=dtf)
> yp <- predict(reg,type="response",
+ newdata=new,interval="confidence")
40
[Figure: theft-insurance data with fitted spline curve and confidence band]
  • 41. Bayesian interpretation of the regression problem. Assume that β ∼ N(0, τΣ) is the prior distribution for β. Then, if (x, y) = {(x_i, y_i), i = 1, · · · , n}, the posterior distribution of µ(x) is Gaussian, with
E(µ(x)|x, y) = h(x)^T (H^T H + (σ²/τ) Σ^{−1})^{−1} H^T y
cov(µ(x), µ(x′)|x, y) = h(x)^T (H^T H + (σ²/τ) Σ^{−1})^{−1} h(x′) σ²
Example: Σ = I. 41
  • 42. Bayesian interpretation of the regression problem
> tau <- 100
> sigma <- summary(reg)$sigma
> H <- cbind(rep(1,nrow(dtf)),matrix(bs(dtf$x,
+ df=4),nrow=nrow(dtf)))
> h <- cbind(rep(1,nrow(new)),matrix(bs(new$x,
+ df=4),nrow=nrow(new)))
> E <- h%*%solve(t(H)%*%H + sigma^2/tau*
+ diag(1,ncol(H)))%*%t(H)%*%dtf$y
> V <- h%*%solve(t(H)%*%H + sigma^2/tau*
+ diag(1,ncol(H)))%*%t(h)*sigma^2
> z <- E+t(chol(V))%*%rnorm(length(E))
42
[Figure: posterior draws of the spline regression curve]
  • 43. Bootstrap strategy. Assume that Y = µ(x) + ε and, based on the estimated model, generate pseudo-observations, y*_i = µ̂(x_i) + ε*_i. Based on the sample (x, y*) = {(x_i, y*_i), i = 1, · · · , n}, derive the estimator µ̂* (and repeat). 43
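The residual bootstrap described above (resampling ε*_i from the fitted residuals, rather than resampling pairs as on the next slide) can be sketched as follows, again using R's cars data as a stand-in for the theft-insurance dataset:

```r
# Residual bootstrap for the spline regression: y*_i = mu.hat(x_i) + eps*_i,
# with eps* drawn with replacement from the fitted residuals.
library(splines)
set.seed(1)
reg <- lm(dist ~ bs(speed, df = 4), data = cars)
mu.hat <- fitted(reg)
eps <- residuals(reg)
ypb <- replicate(1000, {
  ystar <- mu.hat + sample(eps, replace = TRUE)  # pseudo-observations
  regb <- lm(ystar ~ bs(speed, df = 4), data = cars)
  fitted(regb)                                   # bootstrap curve mu.hat*
})
dim(ypb)  # one bootstrap curve per column
```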
  • 44. Bootstrap strategy
> ypb <- matrix(NA,nrow(new),1000)
> for(b in 1:1000) {
+ i <- sample(1:nrow(dtf),size=nrow(dtf),
+ replace=TRUE)
+ regb <- lm(y~bs(x,df=4),data=dtf[i,])
+ ypb[,b] <- predict(regb,type="response",
+ newdata=new)
+ }
Observe that the bootstrap is the Bayesian case, when τ → ∞. 44
[Figure: bootstrap curves for the spline regression]
  • 45. Some additional references (before a conclusion) 45
  • 46. Take-Away Conclusion. Kendrick (2006), about computational economics: “our thesis is that computational economics offers a way to improve this situation and to bring new life into the teaching of economics in colleges and universities [...] computational economics provides an opportunity for some students to move away from too much use of the lecture-exam paradigm and more use of a laboratory-paper paradigm in teaching undergraduate economics. This opens the door for more creative activity on the part of the students by giving them models developed by previous generations and challenging them to modify those models.” It is probably the same about computational actuarial science, thanks to R... 46
  • 47. Take-Away Conclusion. Efron (2004) claimed that “Bayes rule is a very attractive way of reasoning, and fun to use, but using Bayes rule doesn’t make one a Bayesian”. Bayesian models offer an interesting alternative to standard statistical techniques, on small datasets as well as on large ones (see applications to hierarchical and longitudinal models). Computational issues are not that complicated... once you get used to the Bayesian way of seeing a statistical model. 47