Automatic Bayesian method for Numerical Integration
Jagadeeswaran Rathinavel, Fred J. Hickernell
Department of Applied Mathematics, Illinois Institute of Technology
jrathin1@iit.edu
Thanks to the CASSC 2017 organizers
Multivariate Integration

Approximate the d-dimensional integral over $[0,1)^d$,
\[
\mu := \mathbb{E}[f(X)] := \int_{[0,1)^d} f(x)\,\nu(dx),
\]
by a simple cubature rule
\[
\mu \approx \hat{\mu} := \sum_{j=0}^{n-1} f(x_j)\,w_j = \int_{[0,1)^d} f(x)\,\hat{\nu}(dx)
\]
using points $(x_j)_{j=0}^{n-1}$ and associated weights $w_j$. The approximation error is then
\[
\mu - \hat{\mu} = \int_{[0,1)^d} f(x)\,(\nu - \hat{\nu})(dx).
\]
Use an extensible point set and an algorithm that allows more points to be added if needed.
Motivating Example

How do we measure the water volume of a creek in a given area (e.g., 10 sq ft)?

Figure: (Nuyens, 2007)
What is multivariate integration?

Figure: (Nuyens, 2007)

A d-dimensional integral:
\[
\int_{[0,1)^d} f(x)\,dx = \int_0^1 \int_0^1 \cdots \int_0^1 f(x_1, x_2, \ldots, x_d)\,dx_1\,dx_2 \cdots dx_d
\]
Grid points: curse of dimensionality.
IID Monte Carlo: converges at $O(n^{-1/2})$, with $\hat{\mu} = \frac{1}{n}\sum_{j=0}^{n-1} f(x_j)$.
Typical quasi-Monte Carlo: converges at $O(n^{-1+\epsilon})$, with the $x_j$ chosen carefully (low-discrepancy points).
Can we do better?
Algorithm

procedure AutoCubature(f, errtol)          ▷ integrate to within the error tolerance
    n ← 2^8
    do
        generate (x_i)_{i=0}^{n-1}
        sample (f(x_i))_{i=0}^{n-1}
        compute err_n
        if err_n > errtol then n ← 2n      ▷ iterate until the error tolerance is met
    while err_n > errtol
    compute weights (w_i)_{i=0}^{n-1}
    compute the integral estimate µ̂_n
    return µ̂_n                             ▷ integral estimate
end procedure

Problem:
How do we choose (x_j)_{j=0}^{n-1} and (w_j)_{j=0}^{n-1} to make |µ − µ̂| small?
Given an error tolerance errtol, how big must n be to guarantee |µ − µ̂| ≤ errtol?
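For concreteness, here is a minimal Python sketch of this doubling loop. The helpers gen_points, err_bound, and cubature are hypothetical stand-ins for the lattice-point generator, Bayesian error bound, and weighted cubature developed in the rest of the talk, and the n_max cap is an added safeguard not in the slide.

```python
import numpy as np

def auto_cubature(f, err_tol, gen_points, err_bound, cubature,
                  n0=2**8, n_max=2**22):
    """Double n until the data-driven error bound err_n meets err_tol.

    gen_points(n)   -> n points in [0, 1)^d            (placeholder)
    err_bound(x, y) -> err_n, the Bayesian error bound (placeholder)
    cubature(x, y)  -> the weighted estimate mu_hat    (placeholder)
    """
    n = n0
    while True:
        x = gen_points(n)                    # generate (x_i)_{i=0}^{n-1}
        y = np.array([f(xi) for xi in x])    # sample (f(x_i))_{i=0}^{n-1}
        if err_bound(x, y) <= err_tol or n >= n_max:
            return cubature(x, y)            # weights + integral estimate
        n *= 2                               # tolerance not met: double n
```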
Bayesian Trio Identity

Random f: $f \sim \mathcal{GP}(0, s^2 C_\theta)$, a Gaussian process from the sample space $\mathcal{F}$ with zero mean and covariance kernel $s^2 C_\theta$, where $C_\theta : \mathcal{X} \times \mathcal{X} \to \mathbb{R}$. Define
\[
c_0 = \int_{\mathcal{X}\times\mathcal{X}} C(x,t)\,\nu(dx)\,\nu(dt), \quad
c = \Bigl(\int_{\mathcal{X}} C(x_i,t)\,\nu(dt)\Bigr)_{i=0}^{n-1}, \quad
C = \bigl(C(x_i,x_j)\bigr)_{i,j=0}^{n-1}, \quad
w = (w_i)_{i=0}^{n-1}.
\]
Then
\[
\mu - \hat{\mu} =
\underbrace{\frac{\int_{\mathcal{X}} f(x)\,(\nu-\hat{\nu})(dx)}{s\,\sqrt{c_0 - 2c^T w + w^T C w}}}_{\mathrm{ALN}_B(f,\,\nu-\hat{\nu})}
\cdot
\underbrace{\sqrt{c_0 - 2c^T w + w^T C w}}_{\mathrm{DSC}_B(\nu-\hat{\nu})}
\cdot
\underbrace{s}_{\mathrm{VAR}_B(f)}.
\]
The scale parameter $s$ and shape parameter $\theta$ must be estimated.
The choice $w = C^{-1}c$ minimizes $\mathrm{DSC}_B(\nu-\hat{\nu})$ and makes
$\mathrm{ALN}_B(f,\,\nu-\hat{\nu}) \,\big|\, (f(x_i)=y_i)_{i=0}^{n-1} \sim N(0,1)$.
Ref: Diaconis (1988), O'Hagan (1991), Ritter (2000), Rasmussen (2003), and others.
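To make the pieces of the identity concrete, here is a small Python sketch of the discrepancy factor $\mathrm{DSC}_B$, assuming c0, c, C, and w have been precomputed as defined above:

```python
import numpy as np

def bayesian_discrepancy(c0, c, C, w):
    """DSC_B(nu - nu_hat) = sqrt(c0 - 2 c^T w + w^T C w), given the
    kernel integrals c0 (scalar), c (n-vector), the kernel matrix C
    (n x n), and cubature weights w (n-vector)."""
    return np.sqrt(c0 - 2.0 * (c @ w) + w @ C @ w)

# the optimal weights minimize this discrepancy:
# w_opt = np.linalg.solve(C, c)
```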
Covariance kernel

Shift-invariant kernel:
\[
C_\theta(x,t) := \sum_{m \in \mathbb{Z}^d} \lambda_{m,\theta}\, e^{\sqrt{-1}\,2\pi m^T (x-t)}, \qquad 0 \le |x - t| \le 1,
\]
where
\[
\lambda_{m,\theta} := \prod_{l=1}^{d} \frac{1}{\max(|m_l|/\theta_l,\,1)^r}, \quad \text{with } \lambda_{0,\theta} = 1,\ \theta \in (0,1]^d,\ r \in 2\mathbb{N},
\]
which gives
\[
C_\theta(x,t) = \prod_{l=1}^{d} \Bigl(1 - \theta_l^r\, \frac{(2\pi\sqrt{-1})^r}{r!}\, B_r(|x_l - t_l|)\Bigr), \quad \theta \in (0,1]^d,\ r \in 2\mathbb{N},
\]
where $B_r$ is the Bernoulli polynomial of order $r$ (Olver et al., 2013):
\[
B_r(x) = \frac{-r!}{(2\pi\sqrt{-1})^r} \sum_{k=-\infty,\,k\neq 0}^{\infty} \frac{e^{2\pi\sqrt{-1}\,kx}}{k^r}
\qquad
\begin{cases}
\text{for } r = 1, & 0 < x < 1, \\
\text{for } r = 2, 3, \ldots, & 0 \le x \le 1.
\end{cases}
\]
Given $(x_i : i = 0, \ldots, n-1)$, the symmetric kernel matrix is formed:
\[
C_\theta = \bigl(C_\theta(x_i, x_j)\bigr)_{i,j=0}^{n-1}.
\]
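As an illustration, a Python sketch of this kernel matrix for the low-order case $r = 2$, where $B_2(x) = x^2 - x + 1/6$ and $(2\pi\sqrt{-1})^2/2! = -2\pi^2$, so each coordinate factor becomes $1 + 2\pi^2\theta_l^2 B_2(|x_l - t_l|)$:

```python
import numpy as np

def bernoulli2(x):
    """Bernoulli polynomial B_2(x) = x^2 - x + 1/6."""
    return x * x - x + 1.0 / 6.0

def kernel_matrix(X, theta):
    """Kernel matrix for r = 2: each coordinate contributes a factor
    1 + 2 pi^2 theta_l^2 B_2(|x_l - t_l|).  X is an n x d array of
    points; theta is a length-d vector of shape parameters in (0, 1]."""
    n, d = X.shape
    C = np.ones((n, n))
    for l in range(d):
        diff = np.abs(X[:, l][:, None] - X[None, :, l])
        C *= 1.0 + 2.0 * np.pi**2 * theta[l]**2 * bernoulli2(diff)
    return C
```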
Quasi-Monte Carlo: Sample “more uniformly”
Rank-1 Lattice rules: Low-discrepancy point set

Given the “generating vector” $g$, the $n$ rank-1 lattice points are
\[
\Bigl\{\frac{k g}{n}\Bigr\}_{k=0}^{n-1},
\]
and the lattice-rule approximation is
\[
\frac{1}{n} \sum_{k=0}^{n-1} f\Bigl(\Bigl\{\frac{k g}{n}\Bigr\}\Bigr),
\]
with $\{\cdot\}$ the fractional part, i.e., the ‘modulo 1’ operator.

Shift-invariant kernel + lattice points = ‘symmetric circulant kernel’ matrix.
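A Python sketch of the point construction follows; the generating vector g is assumed given (finding good generating vectors is the subject of Nuyens, 2007):

```python
import numpy as np

def rank1_lattice(n, g):
    """Rank-1 lattice points {k g / n} for k = 0, ..., n-1, where {.}
    is the componentwise fractional part ('modulo 1')."""
    k = np.arange(n)[:, None]                     # indices 0..n-1 as a column
    return np.mod(k * np.asarray(g)[None, :], n) / n

# example with a toy generating vector in d = 2:
# pts = rank1_lattice(8, [1, 5])
```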
Bayesian Cubature

\[
\mu = \int_{\mathcal{X}} f(x)\,\nu(dx) \approx \hat{\mu}_n = \sum_{i=0}^{n-1} w_i f(x_i)
\]
Assume $f \sim \mathcal{GP}(0, s^2 C_\theta)$. Then
\[
\mu - \hat{\mu}_n \,\big|\, (f(x_i) = y_i)_{i=0}^{n-1} \sim N\bigl(y^T(C^{-1}c - w),\ s^2(c_0 - c^T C^{-1} c)\bigr).
\]
Choosing $w = C_{\hat{\theta}}^{-1} c_{\hat{\theta}}$ is optimal:
\[
\mathbb{P}\bigl[|\mu - \hat{\mu}_n| \le \mathrm{err}_n\bigr] \ge 99\% \quad \text{for} \quad
\mathrm{err}_n = 2.58 \sqrt{\bigl(c_{\hat{\theta},0} - c_{\hat{\theta}}^T C_{\hat{\theta}}^{-1} c_{\hat{\theta}}\bigr)\,\frac{y^T C_{\hat{\theta}}^{-1} y}{n}},
\]
with the MLE
\[
\hat{\theta} = \operatorname*{argmin}_{\theta} \frac{y^T C_\theta^{-1} y}{[\det(C_\theta^{-1})]^{1/n}}, \quad \text{where } y = \bigl(f(x_i)\bigr)_{i=0}^{n-1}.
\]
Computing $C^{-1}$ typically requires $O(n^3)$ operations. But we use a covariance kernel and points for which the matrix $C$ is symmetric circulant, so operations with $C$ require only $O(n \log(n))$ operations.
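As a reference point, here is a direct Python sketch of this credible-interval half-width using dense linear algebra; this is the $O(n^3)$ cost that the circulant structure on the next slides removes. C, c, and c0 are the kernel quantities evaluated at $\hat{\theta}$:

```python
import numpy as np

def credible_bound_dense(C, c, c0, y):
    """err_n = 2.58 * sqrt((c0 - c^T C^{-1} c) * (y^T C^{-1} y) / n),
    computed with dense solves: O(n^3) work."""
    n = len(y)
    Cinv_c = np.linalg.solve(C, c)
    Cinv_y = np.linalg.solve(C, y)
    return 2.58 * np.sqrt((c0 - c @ Cinv_c) * (y @ Cinv_y) / n)
```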
Optimal Shape parameter θ

The maximum likelihood estimate of $\theta$,
\[
\operatorname*{argmin}_{\theta} \frac{y^T C_\theta^{-1} y}{[\det(C_\theta^{-1})]^{1/n}},
\]
simplifies to (Bernstein, 2009)
\[
\operatorname*{argmin}_{\theta} \Biggl[ \log\Biggl(\sum_{i=0}^{n-1} \frac{|z_i|^2}{\gamma_i}\Biggr) + \frac{1}{n} \sum_{i=0}^{n-1} \log(\gamma_i) \Biggr],
\]
where the $(\gamma_i)$ are the eigenvalues of $C_\theta$ and
\[
z := (z_i)_{i=0}^{n-1} = \mathrm{DFT}(y), \qquad \gamma := (\gamma_i)_{i=0}^{n-1} = \mathrm{DFT}\bigl((C(x_i, x_0))_{i=0}^{n-1}\bigr).
\]
Only $O(n \log(n))$ operations are needed to estimate $\hat{\theta}$.
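A Python sketch of this objective for a single common shape parameter; kernel_row is an assumed helper returning the first row $(C_\theta(x_i, x_0))_{i=0}^{n-1}$ of the circulant kernel matrix, whose DFT gives the eigenvalues $\gamma$:

```python
import numpy as np

def mle_objective(theta, X, y, kernel_row):
    """log(sum_i |z_i|^2 / gamma_i) + (1/n) sum_i log(gamma_i), with
    z = DFT(y) and gamma = DFT of the first kernel row; both FFTs cost
    O(n log n)."""
    z = np.fft.fft(y)
    gamma = np.real(np.fft.fft(kernel_row(theta, X)))  # eigenvalues of C_theta
    return np.log(np.sum(np.abs(z) ** 2 / gamma)) + np.mean(np.log(gamma))

# theta_hat then comes from a 1-D search over (0, 1], e.g. with
# scipy.optimize.minimize_scalar(..., bounds=(1e-3, 1.0), method='bounded')
```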
Cubature rule: Computing µ̂ efficiently

Define
\[
C_\theta = \bigl(C_\theta(x_i, x_j)\bigr)_{i,j=0}^{n-1}, \qquad
c_\theta = \Bigl(\int_{[0,1)^d} C_\theta(x_i, x)\,dx\Bigr)_{i=0}^{n-1}.
\]
The approximate mean $\hat{\mu}$ is
\[
\hat{\mu} = w^T y = \underbrace{(C_\theta^{-1} c_\theta)^T}_{w^T}\, y,
\]
which simplifies further using shift-invariance and the circulant matrix property (Bernstein, 2009) to
\[
\hat{\mu} = \int_{[0,1]^d} C(x, x_0)\,dx \times \Biggl(\sum_{i=0}^{n-1} C(x_0, x_i)\Biggr)^{-1} \times \sum_{i=0}^{n-1} y_i.
\]
Only $O(n)$ operations are needed to compute $\hat{\mu}$.
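A Python sketch of this $O(n)$ estimate. For the shift-invariant kernel above, $\int_{[0,1]^d} C(x, x_0)\,dx = 1$ (only the constant Fourier mode survives integration over the unit cube), which is the default assumed below:

```python
import numpy as np

def mu_hat_lattice(y, kernel_row, kernel_integral=1.0):
    """mu_hat = int C(x, x_0) dx * (sum_i C(x_0, x_i))^{-1} * sum_i y_i.
    y and kernel_row = (C(x_0, x_i))_{i=0}^{n-1} are length-n arrays,
    so this is O(n)."""
    return kernel_integral * np.sum(y) / np.sum(kernel_row)
```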
Computing the error bound efficiently

Let’s simplify
\[
\mathrm{err}_n = 2.58 \sqrt{\bigl(c_{\hat{\theta},0} - c_{\hat{\theta}}^T C_{\hat{\theta}}^{-1} c_{\hat{\theta}}\bigr)\,\frac{y^T C_{\hat{\theta}}^{-1} y}{n}}
\]
using the properties of the shift-invariant kernel and rank-1 lattice points (Bernstein, 2009), where
\[
z := (z_i)_{i=0}^{n-1} = \mathrm{DFT}(y), \qquad \gamma := (\gamma_i)_{i=0}^{n-1} = \mathrm{DFT}\bigl((C(x_i, x_0))_{i=0}^{n-1}\bigr).
\]
Finally,
\[
\mathrm{err}_n = 2.58 \sqrt{\Biggl(1 - \frac{1}{\sum_{i=0}^{n-1} C(x_0, x_i)}\Biggr)\,\frac{1}{n} \sum_{i=0}^{n-1} \frac{|z_i|^2}{\gamma_i}}.
\]
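A Python sketch of the final formula; the DFT scaling convention follows the slide, with an unnormalized FFT assumed here:

```python
import numpy as np

def error_bound_fft(y, kernel_row):
    """err_n via two FFTs: z = DFT(y), gamma = DFT(kernel_row), where
    kernel_row = (C(x_0, x_i))_{i=0}^{n-1}; O(n log n) in total."""
    z = np.fft.fft(y)
    gamma = np.real(np.fft.fft(kernel_row))     # circulant eigenvalues
    factor = 1.0 - 1.0 / np.sum(kernel_row)     # 1 - 1 / sum_i C(x_0, x_i)
    return 2.58 * np.sqrt(factor * np.mean(np.abs(z) ** 2 / gamma))
```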
Periodization Transforms

Our algorithm works best with periodic functions. The polynomial maps below are applied to each component $t_j$:
\[
\text{Baker}: \ \tilde{f}(t) = f\bigl(1 - 2\,|t - \tfrac{1}{2}|\bigr)
\]
\[
C^0: \ \tilde{f}(t) = f\bigl(3t^2 - 2t^3\bigr) \prod_{j=1}^{d} 6t_j(1 - t_j)
\]
\[
C^1: \ \tilde{f}(t) = f\bigl(t^3(10 - 15t + 6t^2)\bigr) \prod_{j=1}^{d} 30t_j^2(1 - t_j)^2
\]
\[
\text{Sidi's } C^1: \ \tilde{f}(t) = f\Biggl(\Bigl(t_j - \frac{\sin(2\pi t_j)}{2\pi}\Bigr)_{j=1}^{d}\Biggr) \prod_{j=1}^{d} \bigl(1 - \cos(2\pi t_j)\bigr)
\]
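Two of these transforms as a Python sketch; both return a new integrand over $[0,1]^d$ with the same integral, assuming f accepts a length-d array:

```python
import numpy as np

def baker_transform(f):
    """Baker (tent) transform: f_tilde(t) = f(1 - 2|t - 1/2|),
    applied componentwise; no Jacobian factor is needed."""
    return lambda t: f(1.0 - 2.0 * np.abs(np.asarray(t) - 0.5))

def sidi_c1_transform(f):
    """Sidi's C^1 transform: x_j = t_j - sin(2 pi t_j) / (2 pi), with
    Jacobian factor prod_j (1 - cos(2 pi t_j))."""
    def f_tilde(t):
        t = np.asarray(t)
        x = t - np.sin(2.0 * np.pi * t) / (2.0 * np.pi)
        return f(x) * np.prod(1.0 - np.cos(2.0 * np.pi * t))
    return f_tilde
```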
Test functions for Numerical integration:

Multivariate Normal (MVN):
\[
\mu = \int_{[a,b]} \frac{\exp\bigl(-\tfrac{1}{2}\, t^T \Sigma^{-1} t\bigr)}{\sqrt{(2\pi)^d \det(\Sigma)}}\,dt
\ \overset{\text{Genz (1993)}}{=}\ \int_{[0,1]^{d-1}} f(x)\,dx
\]
Keister:
\[
\mu = \int_{\mathbb{R}^d} \cos(\lVert x \rVert)\, \exp(-\lVert x \rVert^2)\,dx, \quad d = 1, 2, \ldots
\]
Exp(Cos):
\[
\mu = \int_{(0,1]^d} \exp(\cos(x))\,dx, \quad d = 1, 2, \ldots
\]
Integrating Multivariate Normal Probability

[Figure: simulation results for the MVN example]

Exponential of Cosine

[Figure: simulation results for the Exp(Cos) example]

Keister Integral Example

[Figure: simulation results for the Keister example]
Summary

Automatic Bayesian cubature with O(n) cubature complexity and O(n log(n)) MLE complexity.
Combines the advantages of a kernel method with the low computational cost of quasi-Monte Carlo.
Scalable based on the complexity of the integrand, i.e., the kernel order and lattice points can be chosen to suit the smoothness of the integrand.
More about Guaranteed Automatic Algorithms (GAIL):
http://gailgithub.github.io/GAIL_Dev/
These slides will also be available there.
Future work

Dimension independence
Tighter error bound
Roundoff error
Lattice-point optimization specific to the kernel
Choosing the kernel order automatically as part of the MLE
Can we compute n directly instead of in a loop?
Choosing the periodization transform automatically
Thank you
References

Bernstein, Dennis S. 2009. Matrix Mathematics: Theory, Facts, and Formulas, Princeton University Press.
Diaconis, P. 1988. Bayesian numerical analysis, Statistical Decision Theory and Related Topics IV, Papers from the 4th Purdue Symposium, West Lafayette, Indiana, 1986, pp. 163–175.
Genz, A. 1993. Comparison of methods for the computation of multivariate normal probabilities, Computing Science and Statistics 25, 400–405.
Nuyens, Dirk. 2007. Fast Construction of Good Lattice Rules, Ph.D. thesis.
O’Hagan, A. 1991. Bayes–Hermite quadrature, J. Statist. Plann. Inference 29, 245–260.
Olver, F. W. J., D. W. Lozier, R. F. Boisvert, C. W. Clark, and A. B. O. Dalhuis. 2013. Digital Library of Mathematical Functions.
Rasmussen, C. E. 2003. Bayesian Monte Carlo, Advances in Neural Information Processing Systems, pp. 489–496.
Ritter, K. 2000. Average-Case Analysis of Numerical Problems, Lecture Notes in Mathematics, vol. 1733, Springer-Verlag, Berlin.
