Simulating the Mean Efficiently
and to a Given Tolerance
Fred J. Hickernell
Department of Applied Mathematics, Illinois Institute of Technology
hickernell@iit.edu mypages.iit.edu/~hickernell
Thanks to Lan Jiang, Tony Jiménez Rugama, Jagadees Rathinavel,
and the rest of the Guaranteed Automatic Integration Library (GAIL) team
Supported by NSF-DMS-1522687
Thanks for your kind invitation
Introduction IID Monte Carlo Low Discrepancy Sampling Bayesian Cubature Examples References
Estimating/Simulating/Computing the Mean

Gaussian probability $= \int_{[a,b]} \frac{e^{-x^T \Sigma^{-1} x/2}}{(2\pi)^{d/2} |\Sigma|^{1/2}} \, dx$

option price $= \int_{\mathbb{R}^d} \operatorname{payoff}(x) \underbrace{\frac{e^{-x^T \Sigma^{-1} x/2}}{(2\pi)^{d/2} |\Sigma|^{1/2}}}_{\text{PDF of Brownian motion at } d \text{ times}} \, dx$

Bayesian $\hat{\beta}_j = \int_{\mathbb{R}^d} \beta_j \operatorname{prob}(\beta \mid \text{data}) \, d\beta = \frac{\int_{\mathbb{R}^d} \beta_j \operatorname{prob}(\text{data} \mid \beta) \operatorname{prob}_{\text{prior}}(\beta) \, d\beta}{\int_{\mathbb{R}^d} \operatorname{prob}(\text{data} \mid \beta) \operatorname{prob}_{\text{prior}}(\beta) \, d\beta}$

Sobol' $\text{index}_j = \frac{\int_{[0,1]^{2d}} [\operatorname{output}(x) - \operatorname{output}(x_j, x'_{-j})] \operatorname{output}(x') \, dx \, dx'}{\int_{[0,1]^d} \operatorname{output}(x)^2 \, dx - \big( \int_{[0,1]^d} \operatorname{output}(x) \, dx \big)^2}$

$\mu = \int_{\mathbb{R}^d} g(x) \, dx = \mathbb{E}[f(X)] = \int_{\mathbb{R}^d} f(x) \, \nu(dx) = \,?, \qquad \hat{\mu}_n = \sum_{i=1}^n w_i f(x_i)$

How to choose $\nu$, $\{x_i\}_{i=1}^n$, and $\{w_i\}_{i=1}^n$ to make $|\mu - \hat{\mu}_n|$ small? (trio identity)
Given $\varepsilon_a$, how big must $n$ be to guarantee $|\mu - \hat{\mu}_n| \le \varepsilon_a$? (adaptive cubature)
Product Rules Using Rectangular Grids

$\mu = \int_{\mathbb{R}^d} f(x) \, \nu(dx) \approx \hat{\mu}_n = \sum_{i=1}^n w_i f(x_i)$

If $\int_0^1 f(x) \, dx - \sum_{i=1}^m w_i f(t_i) = O(m^{-r})$, then

$\Big| \int_{[0,1]^d} f(x) \, dx - \sum_{i_1=1}^m \cdots \sum_{i_d=1}^m w_{i_1} \cdots w_{i_d} f(t_{i_1}, \ldots, t_{i_d}) \Big| = O(m^{-r}) = O(n^{-r/d})$

assuming $r$th derivatives in each direction exist. But the computational cost becomes prohibitive for large dimensions, $d$:

d        1    2    5       10      100
n = 8^d  8    64   3.3E4   1.0E9   2.0E90

Product rules are typically a bad idea unless d is small.
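The cost explosion in the table can be reproduced in a few lines. This is a sketch; `product_rule_size` is an illustrative helper, not part of GAIL:

```python
# Cost of a product rule built from an m-point one-dimensional rule:
# the d-fold tensor product needs n = m^d function evaluations.
def product_rule_size(m: int, d: int) -> int:
    """Number of nodes for the d-fold product of an m-point rule."""
    return m ** d

sizes = {d: product_rule_size(8, d) for d in (1, 2, 5, 10, 100)}
# d = 5 already needs 3.3E4 nodes, and d = 10 needs about 1.0E9.
```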
Monte Carlo Simulation in the News

Sampling with a computer can be fast. How big is our error?
IID Monte Carlo

$\mu = \int_{\mathbb{R}^d} f(x) \, \nu(dx) \approx \hat{\mu}_n = \frac{1}{n} \sum_{i=1}^n f(x_i), \qquad x_i \overset{\text{IID}}{\sim} \nu$

$\mu - \hat{\mu}_n = \underbrace{\frac{\mu - \hat{\mu}_n}{\operatorname{std}(f(X))/\sqrt{n}}}_{\text{CNF} \sim (0,1)} \times \underbrace{\frac{1}{\sqrt{n}}}_{\text{DSC}(\{x_i\})} \times \underbrace{\operatorname{std}(f(X))}_{\text{VAR}(f)}$   trio identity (Meng, 2017+)

$\mu - \hat{\mu}_n = \int_{\mathbb{R}^d} f(x) \, (\nu - \hat{\nu}_n)(dx), \qquad \hat{\nu}_n = \frac{1}{n} \sum_{i=1}^n \delta_{x_i}$
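A minimal IID Monte Carlo sketch of $\hat{\mu}_n$; the integrand below is a hypothetical example chosen so the exact mean and standard deviation are known:

```python
import random
import statistics

# IID Monte Carlo: mu_hat_n = (1/n) * sum f(x_i), x_i IID uniform on [0,1]^d.
def iid_mc(f, d, n, rng):
    values = [f([rng.random() for _ in range(d)]) for _ in range(n)]
    return statistics.fmean(values), statistics.stdev(values)

rng = random.Random(7)
# f(x) = x_1 + x_2 + x_3 has exact mean 3/2 over [0,1]^3 and std(f(X)) = 1/2.
mu_hat, sigma_hat = iid_mc(sum, 3, 10_000, rng)
```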
Central Limit Theorem Stopping Rule for IID Monte Carlo

$\mathbb{P}[|\mu - \hat{\mu}_n| \le \text{err}_n] \approx 99\%$ for $\text{err}_n = \frac{2.58 \times 1.2 \hat{\sigma}}{\sqrt{n}}$

by the Central Limit Theorem (CLT), where $\hat{\sigma}^2$ is the sample variance. But the CLT is only an asymptotic result, and $1.2\hat{\sigma}$ may be an overly optimistic upper bound on $\sigma$.
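The two-stage CLT rule above can be sketched as follows: estimate $\sigma$ from a pilot sample, inflate it by 1.2, then pick $n$ so that $\text{err}_n \le \varepsilon_a$. The integrand and sample sizes are illustrative choices, not GAIL defaults:

```python
import math
import random
import statistics

# Stage 1: pilot sample gives sigma_hat. Stage 2: choose n so that
# err_n = 2.58 * 1.2 * sigma_hat / sqrt(n) <= eps_a, then simulate the mean.
def clt_sample_size(pilot_values, eps_a):
    sigma_hat = statistics.stdev(pilot_values)
    return math.ceil((2.58 * 1.2 * sigma_hat / eps_a) ** 2)

rng = random.Random(11)
f = lambda x: x * x                      # hypothetical integrand; mu = 1/3
pilot = [f(rng.random()) for _ in range(1000)]
n = clt_sample_size(pilot, eps_a=0.01)
mu_hat = statistics.fmean(f(rng.random()) for _ in range(n))
```

As the slide warns, the resulting guarantee is only approximate: the CLT is asymptotic, and the 1.2 inflation factor may still underestimate $\sigma$.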
Berry-Esseen Stopping Rule for IID Monte Carlo

$\mathbb{P}[|\mu - \hat{\mu}_n| \le \text{err}_n] \ge 99\%$

for $\Phi\big( -\sqrt{n} \, \text{err}_n / (1.2 \hat{\sigma}_{n_\sigma}) \big) + \Delta_n\big( -\sqrt{n} \, \text{err}_n / (1.2 \hat{\sigma}_{n_\sigma}), \kappa_{\max} \big) = 0.0025$

by the Berry-Esseen inequality, where $\hat{\sigma}^2_{n_\sigma}$ is the sample variance computed from a sample independent of the one used to simulate the mean, and provided that $\operatorname{kurt}(f(X)) \le \kappa_{\max}(n_\sigma)$ (H. et al., 2013; Jiang, 2016).
Adaptive Low Discrepancy Sampling Cubature

$\mu = \int_{[0,1]^d} f(x) \, dx \approx \hat{\mu}_n = \frac{1}{n} \sum_{i=1}^n f(x_i)$, with $x_i$ Sobol' or lattice points; normally $n$ should be a power of 2.

Let $\{\hat{f}(k)\}_k$ denote the coefficients of the Fourier-Walsh or complex exponential expansion of $f$, and let $\{\omega(k)\}_k$ be some weights. Then

$\mu - \hat{\mu}_n = \underbrace{\frac{-\sum_{0 \ne k \in \text{dual}} \hat{f}(k)}{\big\| \{\hat{f}(k)/\omega(k)\}_k \big\|_2 \big\| \{\omega(k)\}_{0 \ne k \in \text{dual}} \big\|_2}}_{\text{CNF} \in [-1,1]} \times \underbrace{\big\| \{\omega(k)\}_{0 \ne k \in \text{dual}} \big\|_2}_{\text{DSC}(\{x_i\}_{i=1}^n) = O(n^{-1+\epsilon})} \times \underbrace{\big\| \{\hat{f}(k)/\omega(k)\}_k \big\|_2}_{\text{VAR}(f)}$
Assuming that the $\hat{f}(k)$ do not decay erratically as $k \to \infty$, the discrete transform $\{\tilde{f}_n(k)\}_k$ may be used to bound the error reliably (H. and Jiménez Rugama, 2016; Jiménez Rugama and H., 2016; H. et al., 2017+):

$|\mu - \hat{\mu}_n| \le \text{err}_n := C(n) \sum_{\text{certain } k} |\tilde{f}_n(k)|$
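A rank-1 shifted lattice rule is easy to sketch in pure Python (a Sobol' generator takes more code). The generating vector `z` and the random shift here are illustrative choices, not the ones GAIL uses:

```python
import math
import random

# Rank-1 shifted lattice: node i is frac(i * z / n + shift), n a power of 2.
def shifted_lattice(n, z, shift):
    return [[(i * zj / n + sj) % 1.0 for zj, sj in zip(z, shift)]
            for i in range(n)]

rng = random.Random(3)
d, n = 4, 2 ** 12
z = [1, 433461, 315689, 441789]          # hypothetical generating vector
shift = [rng.random() for _ in range(d)]
pts = shifted_lattice(n, z, shift)

# Smooth test integrand with exact integral 1 over [0,1]^4.
f = lambda x: math.prod(1 + 0.5 * (xj - 0.5) for xj in x)
mu_hat = sum(f(x) for x in pts) / n
```

For smooth integrands like this one, the lattice error is far smaller than the $O(n^{-1/2})$ IID Monte Carlo error at the same $n$.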
Bayesian Cubature: f Is Random

$\mu = \int_{\mathbb{R}^d} f(x) \, \nu(dx) \approx \hat{\mu}_n = \sum_{i=1}^n w_i f(x_i)$

Assume $f \sim \mathcal{GP}(0, s^2 C_\theta)$ (Diaconis, 1988; O'Hagan, 1991; Ritter, 2000; Rasmussen and Ghahramani, 2003). Let

$c_0 = \int_{\mathbb{R}^d \times \mathbb{R}^d} C_\theta(x,t) \, \nu(dx)\nu(dt), \qquad c = \Big( \int_{\mathbb{R}^d} C_\theta(x_i,t) \, \nu(dt) \Big)_{i=1}^n, \qquad \mathsf{C} = \big( C_\theta(x_i,x_j) \big)_{i,j=1}^n$

Choosing $w = (w_i)_{i=1}^n = \mathsf{C}^{-1} c$ is optimal:

$\mu - \hat{\mu}_n = \underbrace{\frac{\mu - \hat{\mu}_n}{\sqrt{(c_0 - c^T \mathsf{C}^{-1} c) \, y^T \mathsf{C}^{-1} y / n}}}_{\text{CNF} \sim N(0,1)} \times \underbrace{\sqrt{c_0 - c^T \mathsf{C}^{-1} c}}_{\text{DSC}} \times \underbrace{\sqrt{\frac{y^T \mathsf{C}^{-1} y}{n}}}_{\text{VAR}(f)}, \qquad y = \big( f(x_i) \big)_{i=1}^n$

Hence $\mathbb{P}[|\mu - \hat{\mu}_n| \le \text{err}_n] = 99\%$ for $\text{err}_n = 2.58 \sqrt{(c_0 - c^T \mathsf{C}^{-1} c) \, \dfrac{y^T \mathsf{C}^{-1} y}{n}}$.

But $\theta$ needs to be inferred (by MLE), and $\mathsf{C}^{-1}$ typically requires $O(n^3)$ operations.
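A one-dimensional sketch of these formulas, using the illustrative Brownian-motion-like kernel $C(x,t) = 1 + \min(x,t)$ on $[0,1]$ with uniform $\nu$ (NOT the circulant kernel of the talk's Bayesian lattice cubature). For this kernel, $c_0 = 4/3$ and $c_i = 1 + x_i - x_i^2/2$ in closed form:

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

nodes = [(i + 0.5) / 8 for i in range(8)]
y = list(nodes)                          # integrand f(x) = x, exact mu = 1/2
C = [[1 + min(s, t) for t in nodes] for s in nodes]
c = [1 + x - x * x / 2 for x in nodes]
c0 = 4 / 3

w = solve(C, c)                          # optimal weights w = C^{-1} c
mu_hat = sum(wi * yi for wi, yi in zip(w, y))
Cinv_y = solve(C, y)
dsc2 = c0 - sum(ci * wi for ci, wi in zip(c, w))        # DSC^2 term
err_n = 2.58 * math.sqrt(max(dsc2, 0.0)
                         * sum(yi * zi for yi, zi in zip(y, Cinv_y)) / len(y))
```

The $O(n^3)$ cost of the dense solves here is exactly what the circulant-kernel construction on the next slide avoids.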
Gaussian Probability

$\mu = \int_{[a,b]} \frac{\exp\big( -\frac{1}{2} t^T \Sigma^{-1} t \big)}{\sqrt{(2\pi)^d \det(\Sigma)}} \, dt \overset{\text{Genz (1993)}}{=} \int_{[0,1]^{d-1}} f(x) \, dx$

For some typical choice of $a$, $b$, $\Sigma$, with $d = 3$ and $\varepsilon_a = 0$: $\mu \approx 0.6763$.

eps_r   Method             % Accuracy   Worst 10% n   Worst 10% Time (s)
1E-2    IID Monte Carlo    100%         8.1E4         1.8E-2
        Sobol' Sampling    100%         1.0E3         5.1E-3
        Bayesian Lattice   100%         1.0E3         2.8E-3
1E-3    IID Monte Carlo    100%         2.0E6         3.8E-1
        Sobol' Sampling    100%         2.0E3         7.7E-3
        Bayesian Lattice   100%         1.0E3         2.8E-3
1E-4    Sobol' Sampling    100%         1.6E4         1.8E-2
        Bayesian Lattice   100%         8.2E3         1.4E-2

Bayesian lattice cubature uses a covariance kernel for which $\mathsf{C}$ is circulant, so operations on $\mathsf{C}$ require only $O(n \log(n))$ operations.
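A Monte Carlo check of this probability in the simplest case $\Sigma = I$, where the exact answer factorizes into one-dimensional normal CDFs. The box $[a,b]^d$, the dimension, and the sample size are illustrative; the talk's $\mu \approx 0.6763$ comes from a different, correlated $\Sigma$:

```python
import random
from statistics import NormalDist

# Exact Gaussian probability of the box [-1, 1]^3 when Sigma = I:
# the coordinates are independent, so the answer is a product of 1-d CDFs.
d, a, b = 3, -1.0, 1.0
exact = (NormalDist().cdf(b) - NormalDist().cdf(a)) ** d

# IID Monte Carlo estimate of the same probability.
rng = random.Random(5)
n = 20_000
hits = sum(all(a <= rng.gauss(0.0, 1.0) <= b for _ in range(d))
           for _ in range(n))
p_hat = hits / n
```

For correlated $\Sigma$, the Genz (1993) transformation rewrites the integral over $[0,1]^{d-1}$ so that low discrepancy points can be applied, as in the table.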
Asian Option Pricing

$\text{fair price} = \int_{\mathbb{R}^d} e^{-rT} \max\Big( \frac{1}{d} \sum_{j=1}^d S_j - K, \, 0 \Big) \frac{e^{-x^T \Sigma^{-1} x/2}}{(2\pi)^{d/2} |\Sigma|^{1/2}} \, dx \approx \$13.12$

$S_j = S_0 e^{(r - \sigma^2/2) jT/d + \sigma x_j} = $ stock price at time $jT/d$, $\qquad \Sigma = \big( \min(i,j) \, T/d \big)_{i,j=1}^d$

eps_a = 1E-4   Method                                % Accuracy   Worst 10% n   Worst 10% Time (s)
               Sobol' Sampling                       100%         2.1E6         4.3
               Sobol' Sampling w/ control variates   97%          1.0E6         2.1

The coefficient of the control variate for low discrepancy sampling is different than for IID Monte Carlo (H. et al., 2005; H. et al., 2017+).
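An IID Monte Carlo sketch of the arithmetic-mean Asian call above. The parameters $(S_0, K, T, r, \sigma, d)$ below are illustrative; the slide's $\approx\$13.12$ corresponds to its own (unstated) parameter choices:

```python
import math
import random
import statistics

# Price an arithmetic-mean Asian call by simulating the stock path
# S_j = S_0 * exp((r - sigma^2/2) j dt + sigma * B_{j dt}) at d times.
def asian_call_mc(S0, K, T, r, sigma, d, n, rng):
    dt = T / d
    disc = math.exp(-r * T)
    payoffs = []
    for _ in range(n):
        logS, total = math.log(S0), 0.0
        for _ in range(d):
            logS += (r - 0.5 * sigma ** 2) * dt \
                    + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
            total += math.exp(logS)
        payoffs.append(disc * max(total / d - K, 0.0))
    return statistics.fmean(payoffs), statistics.stdev(payoffs) / math.sqrt(n)

rng = random.Random(17)
price, se = asian_call_mc(100, 100, 1.0, 0.05, 0.2, 12, 20_000, rng)
```

Replacing the IID normals by transformed Sobol' points, with or without a control variate, gives the accelerations reported in the table.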
Sobol' Indices

$Y = \operatorname{output}(X)$, where $X \sim U[0,1]^d$; Sobol' $\text{Index}_j(\mu)$ describes how much coordinate $j$ of the input $X$ influences the output $Y$ (Sobol', 1990; 2001):

$\text{Sobol' Index}_j(\mu) := \frac{\mu_1}{\mu_2 - \mu_3^2}, \qquad j = 1, \ldots, d$

$\mu_1 := \int_{[0,1)^{2d}} [\operatorname{output}(x) - \operatorname{output}(x_j, x'_{-j})] \operatorname{output}(x') \, dx \, dx'$

$\mu_2 := \int_{[0,1)^d} \operatorname{output}(x)^2 \, dx, \qquad \mu_3 := \int_{[0,1)^d} \operatorname{output}(x) \, dx$

$\operatorname{output}(x) = -x_1 + x_1 x_2 - x_1 x_2 x_3 + \cdots + x_1 x_2 x_3 x_4 x_5 x_6$ (Bratley et al., 1992)

eps_a = 1E-3, eps_r = 0          j = 1     2        3        4        5       6
n                                65 536    32 768   16 384   16 384   2 048   2 048
Sobol' Index_j                   0.6529    0.1791   0.0370   0.0133   0.0015  0.0015
estimated Sobol' Index_j         0.6528    0.1792   0.0363   0.0126   0.0010  0.0012
Sobol' Index_j(mu_hat_n)         0.6492    0.1758   0.0308   0.0083   0.0018  0.0039
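A pick-freeze Monte Carlo sketch of the first-order Sobol' index for the Bratley et al. test function above. The estimator below is a standard Saltelli-style one, not necessarily the exact estimator used in the talk (which uses adaptive Sobol' sampling):

```python
import random
import statistics

# Bratley et al. (1992) test function: -x1 + x1*x2 - ... + x1*x2*x3*x4*x5*x6.
def bratley(x):
    out, prod = 0.0, 1.0
    for k, xk in enumerate(x, start=1):
        prod *= xk
        out += (-1) ** k * prod
    return out

# First-order index via pick-freeze: E[f(A) (f(AB_j) - f(B))] estimates the
# variance contribution of coordinate j, since A and AB_j share only x_j.
def first_order_index(f, d, j, n, rng):
    num = 0.0
    samples = []
    for _ in range(n):
        a = [rng.random() for _ in range(d)]
        b = [rng.random() for _ in range(d)]
        ab = b[:]
        ab[j] = a[j]                     # b with coordinate j taken from a
        fa, fb, fab = f(a), f(b), f(ab)
        samples += [fa, fb]
        num += fa * (fab - fb)
    return (num / n) / statistics.pvariance(samples)

rng = random.Random(23)
s1 = first_order_index(bratley, 6, 0, 8192, rng)   # true value about 0.6529
```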
Summary

The error in simulating the mean can be decomposed as a trio identity (Meng, 2017+; H., 2017+).
Knowing when to stop a simulation of the mean is not trivial (H. et al., 2017+):
  The Berry-Esseen inequality can tell us when to stop an IID simulation.
  Fourier analysis can tell us when to stop a low discrepancy simulation.
  Bayesian cubature can tell us when to stop a simulation if you can afford the computational cost.
  All methods can be fooled by nasty functions, f.
Relative error tolerances and problems involving functions of integrals can also be handled (H. et al., 2017+).
Our algorithms are implemented in the Guaranteed Automatic Integration Library (GAIL) (Choi et al., 2013-2015), which is under continuous development.
Upcoming SAMSI Quasi-Monte Carlo Program

Thank you

Slides available at www.slideshare.net/fjhickernell/tulane-march-2017-talk
References

Bratley, P., B. L. Fox, and H. Niederreiter. 1992. Implementation and tests of low-discrepancy sequences, ACM Trans. Model. Comput. Simul. 2, 195–213.
Choi, S.-C. T., Y. Ding, F. J. H., L. Jiang, Ll. A. Jiménez Rugama, X. Tong, Y. Zhang, and X. Zhou. 2013–2015. GAIL: Guaranteed Automatic Integration Library (versions 1.0–2.1).
Cools, R. and D. Nuyens (eds.) 2016. Monte Carlo and quasi-Monte Carlo methods: MCQMC, Leuven, Belgium, April 2014, Springer Proceedings in Mathematics and Statistics, vol. 163, Springer-Verlag, Berlin.
Diaconis, P. 1988. Bayesian numerical analysis, Statistical decision theory and related topics IV, Papers from the 4th Purdue symp., West Lafayette, Indiana 1986, pp. 163–175.
Genz, A. 1993. Comparison of methods for the computation of multivariate normal probabilities, Computing Science and Statistics 25, 400–405.
H., F. J. 2017+. Error analysis of quasi-Monte Carlo methods. Submitted for publication, arXiv:1702.01487.
H., F. J., L. Jiang, Y. Liu, and A. B. Owen. 2013. Guaranteed conservative fixed width confidence intervals via Monte Carlo sampling, Monte Carlo and quasi-Monte Carlo methods 2012, pp. 105–128.
H., F. J. and Ll. A. Jiménez Rugama. 2016. Reliable adaptive cubature using digital sequences, Monte Carlo and quasi-Monte Carlo methods: MCQMC, Leuven, Belgium, April 2014, pp. 367–383. arXiv:1410.8615 [math.NA].
H., F. J., Ll. A. Jiménez Rugama, and D. Li. 2017+. Adaptive quasi-Monte Carlo methods. Submitted for publication, arXiv:1702.01491 [math.NA].
H., F. J., C. Lemieux, and A. B. Owen. 2005. Control variates for quasi-Monte Carlo, Statist. Sci. 20, 1–31.
Jiang, L. 2016. Guaranteed adaptive Monte Carlo methods for estimating means of random variables, Ph.D. thesis.
Jiménez Rugama, Ll. A. and F. J. H. 2016. Adaptive multidimensional integration based on rank-1 lattices, Monte Carlo and quasi-Monte Carlo methods: MCQMC, Leuven, Belgium, April 2014, pp. 407–422. arXiv:1411.1966.
Meng, X. 2017+. Statistical paradises and paradoxes in big data. In preparation.
O'Hagan, A. 1991. Bayes-Hermite quadrature, J. Statist. Plann. Inference 29, 245–260.
Rasmussen, C. E. and Z. Ghahramani. 2003. Bayesian Monte Carlo, Advances in Neural Information Processing Systems, pp. 489–496.
Ritter, K. 2000. Average-case analysis of numerical problems, Lecture Notes in Mathematics, vol. 1733, Springer-Verlag, Berlin.
Sobol', I. M. 1990. On sensitivity estimation for nonlinear mathematical models, Matem. Mod. 2, no. 1, 112–118.
Sobol', I. M. 2001. Global sensitivity indices for nonlinear mathematical models and their Monte Carlo estimates, Math. Comput. Simul. 55, no. 1-3, 271–280.
Maximum Likelihood Estimation of the Covariance Kernel

$f \sim \mathcal{GP}(0, s^2 C_\theta), \qquad \mathsf{C}_\theta = \big( C_\theta(x_i, x_j) \big)_{i,j=1}^n, \qquad y = \big( f(x_i) \big)_{i=1}^n, \qquad \hat{\mu}_n = c_{\hat{\theta}}^T \mathsf{C}_{\hat{\theta}}^{-1} y$

$\hat{\theta} = \operatorname*{argmin}_\theta \frac{y^T \mathsf{C}_\theta^{-1} y}{[\det(\mathsf{C}_\theta^{-1})]^{1/n}}$

$\mathbb{P}[|\mu - \hat{\mu}_n| \le \text{err}_n] = 99\%$ for $\text{err}_n = \frac{2.58}{\sqrt{n}} \sqrt{\big( c_{0,\hat{\theta}} - c_{\hat{\theta}}^T \mathsf{C}_{\hat{\theta}}^{-1} c_{\hat{\theta}} \big) \, y^T \mathsf{C}_{\hat{\theta}}^{-1} y}$

There is a de-randomized interpretation of Bayesian cubature (H., 2017+): $f$ lies in a Hilbert space with reproducing kernel $C_\theta$ and best interpolant $\tilde{f}_y$. Then

$\hat{\theta} = \operatorname*{argmin}_\theta \frac{y^T \mathsf{C}_\theta^{-1} y}{[\det(\mathsf{C}_\theta^{-1})]^{1/n}} = \operatorname*{argmin}_\theta \operatorname{vol}\big\{ z \in \mathbb{R}^n : \|\tilde{f}_z\|_\theta \le \|\tilde{f}_y\|_\theta \big\}$

$|\mu - \hat{\mu}_n| \le \underbrace{\frac{2.58}{\sqrt{n}} \sqrt{c_{0,\hat{\theta}} - c_{\hat{\theta}}^T \mathsf{C}_{\hat{\theta}}^{-1} c_{\hat{\theta}}}}_{\text{error representer at } \hat{\theta}} \times \underbrace{\sqrt{y^T \mathsf{C}_{\hat{\theta}}^{-1} y}}_{\|\tilde{f}_y\|_{\hat{\theta}}} \quad \text{if } \|f - \tilde{f}_y\|_{\hat{\theta}} \le \frac{2.58 \, \|\tilde{f}_y\|_{\hat{\theta}}}{\sqrt{n}}$

Natural Language Processing in R (rNLP)
 
Monte Carlo Statistical Methods
Monte Carlo Statistical MethodsMonte Carlo Statistical Methods
Monte Carlo Statistical Methods
 
Monte carlo
Monte carloMonte carlo
Monte carlo
 
Applying Monte Carlo Simulation to Microsoft Project Schedules
Applying Monte Carlo Simulation to Microsoft Project SchedulesApplying Monte Carlo Simulation to Microsoft Project Schedules
Applying Monte Carlo Simulation to Microsoft Project Schedules
 
Dynamic Analysis with Examples – Seismic Analysis
Dynamic Analysis with Examples – Seismic AnalysisDynamic Analysis with Examples – Seismic Analysis
Dynamic Analysis with Examples – Seismic Analysis
 
Monte carlo simulation
Monte carlo simulationMonte carlo simulation
Monte carlo simulation
 
High Dimensional Quasi Monte Carlo Method in Finance
High Dimensional Quasi Monte Carlo Method in FinanceHigh Dimensional Quasi Monte Carlo Method in Finance
High Dimensional Quasi Monte Carlo Method in Finance
 

Similar to Tulane March 2017 Talk

QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
The Statistical and Applied Mathematical Sciences Institute
 
QMC Error SAMSI Tutorial Aug 2017
QMC Error SAMSI Tutorial Aug 2017QMC Error SAMSI Tutorial Aug 2017
QMC Error SAMSI Tutorial Aug 2017
Fred J. Hickernell
 
Monte Carlo Methods 2017 July Talk in Montreal
Monte Carlo Methods 2017 July Talk in MontrealMonte Carlo Methods 2017 July Talk in Montreal
Monte Carlo Methods 2017 July Talk in Montreal
Fred J. Hickernell
 
Automatic bayesian cubature
Automatic bayesian cubatureAutomatic bayesian cubature
Automatic bayesian cubature
Jagadeeswaran Rathinavel
 
Bayesian Deep Learning
Bayesian Deep LearningBayesian Deep Learning
Bayesian Deep Learning
RayKim51
 
Bayesian inference on mixtures
Bayesian inference on mixturesBayesian inference on mixtures
Bayesian inference on mixtures
Christian Robert
 
H2O World - Consensus Optimization and Machine Learning - Stephen Boyd
H2O World - Consensus Optimization and Machine Learning - Stephen BoydH2O World - Consensus Optimization and Machine Learning - Stephen Boyd
H2O World - Consensus Optimization and Machine Learning - Stephen Boyd
Sri Ambati
 
Input analysis
Input analysisInput analysis
Input analysis
Bhavik A Shah
 
Data sparse approximation of Karhunen-Loeve Expansion
Data sparse approximation of Karhunen-Loeve ExpansionData sparse approximation of Karhunen-Loeve Expansion
Data sparse approximation of Karhunen-Loeve Expansion
Alexander Litvinenko
 
Slides
SlidesSlides
Data sparse approximation of the Karhunen-Loeve expansion
Data sparse approximation of the Karhunen-Loeve expansionData sparse approximation of the Karhunen-Loeve expansion
Data sparse approximation of the Karhunen-Loeve expansion
Alexander Litvinenko
 
QMC: Operator Splitting Workshop, Using Sequences of Iterates in Inertial Met...
QMC: Operator Splitting Workshop, Using Sequences of Iterates in Inertial Met...QMC: Operator Splitting Workshop, Using Sequences of Iterates in Inertial Met...
QMC: Operator Splitting Workshop, Using Sequences of Iterates in Inertial Met...
The Statistical and Applied Mathematical Sciences Institute
 
Minimax optimal alternating minimization \\ for kernel nonparametric tensor l...
Minimax optimal alternating minimization \\ for kernel nonparametric tensor l...Minimax optimal alternating minimization \\ for kernel nonparametric tensor l...
Minimax optimal alternating minimization \\ for kernel nonparametric tensor l...
Taiji Suzuki
 
NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)
Christian Robert
 
A new implementation of k-MLE for mixture modelling of Wishart distributions
A new implementation of k-MLE for mixture modelling of Wishart distributionsA new implementation of k-MLE for mixture modelling of Wishart distributions
A new implementation of k-MLE for mixture modelling of Wishart distributions
Frank Nielsen
 
QMC: Operator Splitting Workshop, A New (More Intuitive?) Interpretation of I...
QMC: Operator Splitting Workshop, A New (More Intuitive?) Interpretation of I...QMC: Operator Splitting Workshop, A New (More Intuitive?) Interpretation of I...
QMC: Operator Splitting Workshop, A New (More Intuitive?) Interpretation of I...
The Statistical and Applied Mathematical Sciences Institute
 
CDT 22 slides.pdf
CDT 22 slides.pdfCDT 22 slides.pdf
CDT 22 slides.pdf
Christian Robert
 
Maximizing Submodular Function over the Integer Lattice
Maximizing Submodular Function over the Integer LatticeMaximizing Submodular Function over the Integer Lattice
Maximizing Submodular Function over the Integer Lattice
Tasuku Soma
 
Can we estimate a constant?
Can we estimate a constant?Can we estimate a constant?
Can we estimate a constant?
Christian Robert
 
Hierarchical matrices for approximating large covariance matries and computin...
Hierarchical matrices for approximating large covariance matries and computin...Hierarchical matrices for approximating large covariance matries and computin...
Hierarchical matrices for approximating large covariance matries and computin...
Alexander Litvinenko
 

Similar to Tulane March 2017 Talk (20)

QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
 
QMC Error SAMSI Tutorial Aug 2017
QMC Error SAMSI Tutorial Aug 2017QMC Error SAMSI Tutorial Aug 2017
QMC Error SAMSI Tutorial Aug 2017
 
Monte Carlo Methods 2017 July Talk in Montreal
Monte Carlo Methods 2017 July Talk in MontrealMonte Carlo Methods 2017 July Talk in Montreal
Monte Carlo Methods 2017 July Talk in Montreal
 
Automatic bayesian cubature
Automatic bayesian cubatureAutomatic bayesian cubature
Automatic bayesian cubature
 
Bayesian Deep Learning
Bayesian Deep LearningBayesian Deep Learning
Bayesian Deep Learning
 
Bayesian inference on mixtures
Bayesian inference on mixturesBayesian inference on mixtures
Bayesian inference on mixtures
 
H2O World - Consensus Optimization and Machine Learning - Stephen Boyd
H2O World - Consensus Optimization and Machine Learning - Stephen BoydH2O World - Consensus Optimization and Machine Learning - Stephen Boyd
H2O World - Consensus Optimization and Machine Learning - Stephen Boyd
 
Input analysis
Input analysisInput analysis
Input analysis
 
Data sparse approximation of Karhunen-Loeve Expansion
Data sparse approximation of Karhunen-Loeve ExpansionData sparse approximation of Karhunen-Loeve Expansion
Data sparse approximation of Karhunen-Loeve Expansion
 
Slides
SlidesSlides
Slides
 
Data sparse approximation of the Karhunen-Loeve expansion
Data sparse approximation of the Karhunen-Loeve expansionData sparse approximation of the Karhunen-Loeve expansion
Data sparse approximation of the Karhunen-Loeve expansion
 
QMC: Operator Splitting Workshop, Using Sequences of Iterates in Inertial Met...
QMC: Operator Splitting Workshop, Using Sequences of Iterates in Inertial Met...QMC: Operator Splitting Workshop, Using Sequences of Iterates in Inertial Met...
QMC: Operator Splitting Workshop, Using Sequences of Iterates in Inertial Met...
 
Minimax optimal alternating minimization \\ for kernel nonparametric tensor l...
Minimax optimal alternating minimization \\ for kernel nonparametric tensor l...Minimax optimal alternating minimization \\ for kernel nonparametric tensor l...
Minimax optimal alternating minimization \\ for kernel nonparametric tensor l...
 
NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)
 
A new implementation of k-MLE for mixture modelling of Wishart distributions
A new implementation of k-MLE for mixture modelling of Wishart distributionsA new implementation of k-MLE for mixture modelling of Wishart distributions
A new implementation of k-MLE for mixture modelling of Wishart distributions
 
QMC: Operator Splitting Workshop, A New (More Intuitive?) Interpretation of I...
QMC: Operator Splitting Workshop, A New (More Intuitive?) Interpretation of I...QMC: Operator Splitting Workshop, A New (More Intuitive?) Interpretation of I...
QMC: Operator Splitting Workshop, A New (More Intuitive?) Interpretation of I...
 
CDT 22 slides.pdf
CDT 22 slides.pdfCDT 22 slides.pdf
CDT 22 slides.pdf
 
Maximizing Submodular Function over the Integer Lattice
Maximizing Submodular Function over the Integer LatticeMaximizing Submodular Function over the Integer Lattice
Maximizing Submodular Function over the Integer Lattice
 
Can we estimate a constant?
Can we estimate a constant?Can we estimate a constant?
Can we estimate a constant?
 
Hierarchical matrices for approximating large covariance matries and computin...
Hierarchical matrices for approximating large covariance matries and computin...Hierarchical matrices for approximating large covariance matries and computin...
Hierarchical matrices for approximating large covariance matries and computin...
 

Recently uploaded

3D Hybrid PIC simulation of the plasma expansion (ISSS-14)
3D Hybrid PIC simulation of the plasma expansion (ISSS-14)3D Hybrid PIC simulation of the plasma expansion (ISSS-14)
3D Hybrid PIC simulation of the plasma expansion (ISSS-14)
David Osipyan
 
Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep...
Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep...Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep...
Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep...
University of Maribor
 
Nucleic Acid-its structural and functional complexity.
Nucleic Acid-its structural and functional complexity.Nucleic Acid-its structural and functional complexity.
Nucleic Acid-its structural and functional complexity.
Nistarini College, Purulia (W.B) India
 
ANAMOLOUS SECONDARY GROWTH IN DICOT ROOTS.pptx
ANAMOLOUS SECONDARY GROWTH IN DICOT ROOTS.pptxANAMOLOUS SECONDARY GROWTH IN DICOT ROOTS.pptx
ANAMOLOUS SECONDARY GROWTH IN DICOT ROOTS.pptx
RASHMI M G
 
Unveiling the Energy Potential of Marshmallow Deposits.pdf
Unveiling the Energy Potential of Marshmallow Deposits.pdfUnveiling the Energy Potential of Marshmallow Deposits.pdf
Unveiling the Energy Potential of Marshmallow Deposits.pdf
Erdal Coalmaker
 
Eukaryotic Transcription Presentation.pptx
Eukaryotic Transcription Presentation.pptxEukaryotic Transcription Presentation.pptx
Eukaryotic Transcription Presentation.pptx
RitabrataSarkar3
 
Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a...
Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a...Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a...
Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a...
Ana Luísa Pinho
 
mô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốt
mô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốtmô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốt
mô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốt
HongcNguyn6
 
NuGOweek 2024 Ghent programme overview flyer
NuGOweek 2024 Ghent programme overview flyerNuGOweek 2024 Ghent programme overview flyer
NuGOweek 2024 Ghent programme overview flyer
pablovgd
 
Shallowest Oil Discovery of Turkiye.pptx
Shallowest Oil Discovery of Turkiye.pptxShallowest Oil Discovery of Turkiye.pptx
Shallowest Oil Discovery of Turkiye.pptx
Gokturk Mehmet Dilci
 
THEMATIC APPERCEPTION TEST(TAT) cognitive abilities, creativity, and critic...
THEMATIC  APPERCEPTION  TEST(TAT) cognitive abilities, creativity, and critic...THEMATIC  APPERCEPTION  TEST(TAT) cognitive abilities, creativity, and critic...
THEMATIC APPERCEPTION TEST(TAT) cognitive abilities, creativity, and critic...
Abdul Wali Khan University Mardan,kP,Pakistan
 
SAR of Medicinal Chemistry 1st by dk.pdf
SAR of Medicinal Chemistry 1st by dk.pdfSAR of Medicinal Chemistry 1st by dk.pdf
SAR of Medicinal Chemistry 1st by dk.pdf
KrushnaDarade1
 
Chapter 12 - climate change and the energy crisis
Chapter 12 - climate change and the energy crisisChapter 12 - climate change and the energy crisis
Chapter 12 - climate change and the energy crisis
tonzsalvador2222
 
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptxThe use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx
MAGOTI ERNEST
 
DMARDs Pharmacolgy Pharm D 5th Semester.pdf
DMARDs Pharmacolgy Pharm D 5th Semester.pdfDMARDs Pharmacolgy Pharm D 5th Semester.pdf
DMARDs Pharmacolgy Pharm D 5th Semester.pdf
fafyfskhan251kmf
 
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdfTopic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
TinyAnderson
 
The debris of the ‘last major merger’ is dynamically young
The debris of the ‘last major merger’ is dynamically youngThe debris of the ‘last major merger’ is dynamically young
The debris of the ‘last major merger’ is dynamically young
Sérgio Sacani
 
aziz sancar nobel prize winner: from mardin to nobel
aziz sancar nobel prize winner: from mardin to nobelaziz sancar nobel prize winner: from mardin to nobel
aziz sancar nobel prize winner: from mardin to nobel
İsa Badur
 
Medical Orthopedic PowerPoint Templates.pptx
Medical Orthopedic PowerPoint Templates.pptxMedical Orthopedic PowerPoint Templates.pptx
Medical Orthopedic PowerPoint Templates.pptx
terusbelajar5
 
原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样
原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样
原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样
yqqaatn0
 

Recently uploaded (20)

3D Hybrid PIC simulation of the plasma expansion (ISSS-14)
3D Hybrid PIC simulation of the plasma expansion (ISSS-14)3D Hybrid PIC simulation of the plasma expansion (ISSS-14)
3D Hybrid PIC simulation of the plasma expansion (ISSS-14)
 
Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep...
Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep...Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep...
Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep...
 
Nucleic Acid-its structural and functional complexity.
Nucleic Acid-its structural and functional complexity.Nucleic Acid-its structural and functional complexity.
Nucleic Acid-its structural and functional complexity.
 
ANAMOLOUS SECONDARY GROWTH IN DICOT ROOTS.pptx
ANAMOLOUS SECONDARY GROWTH IN DICOT ROOTS.pptxANAMOLOUS SECONDARY GROWTH IN DICOT ROOTS.pptx
ANAMOLOUS SECONDARY GROWTH IN DICOT ROOTS.pptx
 
Unveiling the Energy Potential of Marshmallow Deposits.pdf
Unveiling the Energy Potential of Marshmallow Deposits.pdfUnveiling the Energy Potential of Marshmallow Deposits.pdf
Unveiling the Energy Potential of Marshmallow Deposits.pdf
 
Eukaryotic Transcription Presentation.pptx
Eukaryotic Transcription Presentation.pptxEukaryotic Transcription Presentation.pptx
Eukaryotic Transcription Presentation.pptx
 
Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a...
Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a...Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a...
Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a...
 
mô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốt
mô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốtmô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốt
mô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốt
 
NuGOweek 2024 Ghent programme overview flyer
NuGOweek 2024 Ghent programme overview flyerNuGOweek 2024 Ghent programme overview flyer
NuGOweek 2024 Ghent programme overview flyer
 
Shallowest Oil Discovery of Turkiye.pptx
Shallowest Oil Discovery of Turkiye.pptxShallowest Oil Discovery of Turkiye.pptx
Shallowest Oil Discovery of Turkiye.pptx
 
THEMATIC APPERCEPTION TEST(TAT) cognitive abilities, creativity, and critic...
THEMATIC  APPERCEPTION  TEST(TAT) cognitive abilities, creativity, and critic...THEMATIC  APPERCEPTION  TEST(TAT) cognitive abilities, creativity, and critic...
THEMATIC APPERCEPTION TEST(TAT) cognitive abilities, creativity, and critic...
 
SAR of Medicinal Chemistry 1st by dk.pdf
SAR of Medicinal Chemistry 1st by dk.pdfSAR of Medicinal Chemistry 1st by dk.pdf
SAR of Medicinal Chemistry 1st by dk.pdf
 
Chapter 12 - climate change and the energy crisis
Chapter 12 - climate change and the energy crisisChapter 12 - climate change and the energy crisis
Chapter 12 - climate change and the energy crisis
 
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptxThe use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx
 
DMARDs Pharmacolgy Pharm D 5th Semester.pdf
DMARDs Pharmacolgy Pharm D 5th Semester.pdfDMARDs Pharmacolgy Pharm D 5th Semester.pdf
DMARDs Pharmacolgy Pharm D 5th Semester.pdf
 
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdfTopic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
 
The debris of the ‘last major merger’ is dynamically young
The debris of the ‘last major merger’ is dynamically youngThe debris of the ‘last major merger’ is dynamically young
The debris of the ‘last major merger’ is dynamically young
 
aziz sancar nobel prize winner: from mardin to nobel
aziz sancar nobel prize winner: from mardin to nobelaziz sancar nobel prize winner: from mardin to nobel
aziz sancar nobel prize winner: from mardin to nobel
 
Medical Orthopedic PowerPoint Templates.pptx
Medical Orthopedic PowerPoint Templates.pptxMedical Orthopedic PowerPoint Templates.pptx
Medical Orthopedic PowerPoint Templates.pptx
 
原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样
原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样
原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样
 

Tulane March 2017 Talk

  • 1. Simulating the Mean Efficiently and to a Given Tolerance. Fred J. Hickernell, Department of Applied Mathematics, Illinois Institute of Technology. hickernell@iit.edu, mypages.iit.edu/~hickernell. Thanks to Lan Jiang, Tony Jiménez Rugama, Jagadees Rathinavel, and the rest of the Guaranteed Automatic Integration Library (GAIL) team. Supported by NSF-DMS-1522687. Thanks for your kind invitation.
  • 2–5. Estimating/Simulating/Computing an Integral (one slide built up over four frames). Examples of integrals of interest:

Gaussian probability $= \int_{[a,b]} \frac{e^{-x^T \Sigma^{-1} x/2}}{(2\pi)^{d/2} |\Sigma|^{1/2}} \, dx$

option price $= \int_{\mathbb{R}^d} \mathrm{payoff}(x) \, \underbrace{\frac{e^{-x^T \Sigma^{-1} x/2}}{(2\pi)^{d/2} |\Sigma|^{1/2}}}_{\text{PDF of Brownian motion at } d \text{ times}} \, dx$

Bayesian $\hat\beta_j = \int_{\mathbb{R}^d} \beta_j \, \mathrm{prob}(\beta \mid \text{data}) \, d\beta = \frac{\int_{\mathbb{R}^d} \beta_j \, \mathrm{prob}(\text{data} \mid \beta)\, \mathrm{prob}_{\text{prior}}(\beta)\, d\beta}{\int_{\mathbb{R}^d} \mathrm{prob}(\text{data} \mid \beta)\, \mathrm{prob}_{\text{prior}}(\beta)\, d\beta}$

Sobol' index$_j = \frac{\int_{[0,1]^{2d}} \left[\mathrm{output}(x) - \mathrm{output}(x_j, x'_{-j})\right] \mathrm{output}(x')\, dx\, dx'}{\int_{[0,1]^d} \mathrm{output}(x)^2\, dx - \left(\int_{[0,1]^d} \mathrm{output}(x)\, dx\right)^2}$
  • 6. Estimating/Simulating/Computing the Mean. The four integrals above are all instances of a mean: $\mu = \int_{\mathbb{R}^d} g(x)\, dx = \mathbb{E}[f(X)] = \int_{\mathbb{R}^d} f(x)\, \nu(dx) = {?}$, estimated by $\hat\mu_n = \sum_{i=1}^n w_i f(x_i)$. Two questions: How to choose $\nu$, $\{x_i\}_{i=1}^n$, and $\{w_i\}_{i=1}^n$ to make $|\mu - \hat\mu_n|$ small? (trio identity) Given $\varepsilon_a$, how big must $n$ be to guarantee $|\mu - \hat\mu_n| \le \varepsilon_a$? (adaptive cubature)
  • 7–9. Product Rules Using Rectangular Grids (one slide built up over three frames). $\mu = \int_{\mathbb{R}^d} f(x)\, \nu(dx) \approx \hat\mu_n = \sum_{i=1}^n w_i f(x_i)$. If $\int_0^1 f(x)\, dx - \sum_{i=1}^m w_i f(t_i) = O(m^{-r})$, then $\int_{[0,1]^d} f(x)\, dx - \sum_{i_1=1}^m \cdots \sum_{i_d=1}^m w_{i_1} \cdots w_{i_d}\, f(t_{i_1}, \ldots, t_{i_d}) = O(m^{-r}) = O(n^{-r/d})$, assuming $r$th derivatives in each direction exist. But the computational cost becomes prohibitive for large dimensions $d$:

d        1    2    5       10      100
n = 8^d  8    64   3.3E4   1.0E9   2.0E90

Product rules are typically a bad idea unless $d$ is small.
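The cost explosion in the table can be checked directly; a minimal sketch using the slide's 8-points-per-axis example:

```python
# Tensor-product ("rectangular grid") rules need m points per axis, so a
# d-dimensional rule uses n = m^d nodes; this reproduces the slide's cost table.
def product_rule_cost(m: int, d: int) -> int:
    """Total nodes in a d-fold tensor product of an m-point 1-D rule."""
    return m ** d

for d in (1, 2, 5, 10, 100):
    print(f"d = {d:3d}: n = {product_rule_cost(8, d):.1e}")
```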
  • 10. Monte Carlo Simulation in the News. Sampling with a computer can be fast. How big is our error?
  • 11. IID Monte Carlo. $\mu = \int_{\mathbb{R}^d} f(x)\, \nu(dx) \approx \hat\mu_n = \frac{1}{n} \sum_{i=1}^n f(x_i)$, $x_i \overset{\text{IID}}{\sim} \nu$. The error factors as
$$\mu - \hat\mu_n = \underbrace{\frac{\mu - \hat\mu_n}{\operatorname{std}(f(X))/\sqrt{n}}}_{\text{CNF}\,\sim\,(0,1)} \times \underbrace{\frac{1}{\sqrt{n}}}_{\text{DSC}(\{x_i\})} \times \underbrace{\operatorname{std}(f(X))}_{\text{VAR}(f)},$$
the trio identity (Meng, 2017+): $\mu - \hat\mu_n = \int_{\mathbb{R}^d} f(x)\, (\nu - \hat\nu_n)(dx)$, where $\hat\nu_n = \frac{1}{n} \sum_{i=1}^n \delta_{x_i}$.
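A minimal sketch of plain IID Monte Carlo for a mean; the integrand and sampler below are toy choices for illustration, not examples from the talk:

```python
import numpy as np

rng = np.random.default_rng(7)

def iid_monte_carlo(f, sampler, n: int):
    """mu_hat_n = (1/n) sum f(x_i) with x_i IID from nu; also return sigma_hat."""
    y = f(sampler(n))
    return y.mean(), y.std(ddof=1)

# Toy check: for X ~ N(0, I_2) and f(x) = x_1^2 + x_2^2, E[f(X)] = 2.
f = lambda x: (x ** 2).sum(axis=1)
sampler = lambda n: rng.standard_normal((n, 2))
mu_hat, sigma_hat = iid_monte_carlo(f, sampler, 10**5)
```

The $O(n^{-1/2})$ DSC factor means each extra digit of accuracy costs 100 times more samples.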
  • 12–13. Central Limit Theorem Stopping Rule for IID Monte Carlo. With $\hat\mu_n$ and the trio identity as above, $\mathbb{P}[|\mu - \hat\mu_n| \le \text{err}_n] \approx 99\%$ for $\text{err}_n = 2.58 \times \frac{1.2\,\hat\sigma}{\sqrt{n}}$ by the Central Limit Theorem (CLT), where $\hat\sigma^2$ is the sample variance. But the CLT is only an asymptotic result, and $1.2\,\hat\sigma$ may be an overly optimistic upper bound on $\sigma$.
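The CLT stopping rule can be sketched as a two-stage procedure: a pilot sample estimates $\hat\sigma$, then $n$ is chosen so that $\text{err}_n = 2.58 \times 1.2\,\hat\sigma/\sqrt{n} \le \varepsilon_a$. The pilot data below is a stand-in, not the talk's example:

```python
import math

import numpy as np

rng = np.random.default_rng(11)

def clt_sample_size(sigma_hat: float, eps_a: float,
                    z: float = 2.58, inflate: float = 1.2) -> int:
    """Smallest n with err_n = z * inflate * sigma_hat / sqrt(n) <= eps_a."""
    return math.ceil((z * inflate * sigma_hat / eps_a) ** 2)

# Stage 1: a pilot sample stands in for f(X_i) values and estimates sigma.
pilot = rng.standard_normal(1000)
sigma_hat = pilot.std(ddof=1)

# Stage 2: the CLT rule sets n so the 99% half-width falls below eps_a.
eps_a = 0.01
n = clt_sample_size(sigma_hat, eps_a)
err_n = 2.58 * 1.2 * sigma_hat / math.sqrt(n)
```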
  • 14. Berry–Esseen Stopping Rule for IID Monte Carlo. $\mathbb{P}[|\mu - \hat\mu_n| \le \text{err}_n] \ge 99\%$ when $\text{err}_n$ solves $\Phi\left(-\sqrt{n}\,\text{err}_n/(1.2\,\hat\sigma_{n_\sigma})\right) + \Delta_n\left(-\sqrt{n}\,\text{err}_n/(1.2\,\hat\sigma_{n_\sigma}), \kappa_{\max}\right) = 0.0025$ by the Berry–Esseen inequality, where $\hat\sigma^2_{n_\sigma}$ is the sample variance using an independent sample from that used to simulate the mean, and provided that $\operatorname{kurt}(f(X)) \le \kappa_{\max}(n_\sigma)$ (H. et al., 2013; Jiang, 2016).
  • 15–31. Adaptive Low Discrepancy Sampling Cubature (one slide built up incrementally over seventeen frames). $\mu = \int_{[0,1]^d} f(x)\, dx$, $\hat\mu_n = \frac{1}{n} \sum_{i=1}^n f(x_i)$, with the $x_i$ Sobol' or lattice points. Normally $n$ should be a power of 2.
  • 32. Adaptive Low Discrepancy Sampling Cubature (continued). Let $\{\hat f(k)\}_k$ denote the coefficients of the Fourier–Walsh or complex exponential expansion of $f$, and let $\{\omega(k)\}_k$ be some weights. Then
$$\mu - \hat\mu_n = -\sum_{0 \ne k \in \text{dual}} \hat f(k) = \underbrace{\frac{-\sum_{0 \ne k \in \text{dual}} \hat f(k)}{\left\lVert \left\{ \hat f(k)/\omega(k) \right\}_k \right\rVert_2 \left\lVert \{\omega(k)\}_{0 \ne k \in \text{dual}} \right\rVert_2}}_{\text{CNF} \in [-1,1]} \times \underbrace{\left\lVert \{\omega(k)\}_{0 \ne k \in \text{dual}} \right\rVert_2}_{\text{DSC}(\{x_i\}_{i=1}^n) = O(n^{-1+\epsilon})} \times \underbrace{\left\lVert \left\{ \hat f(k)/\omega(k) \right\}_k \right\rVert_2}_{\text{VAR}(f)}$$
  • 33. Adaptive Low Discrepancy Sampling Cubature (continued). Assuming that the $\hat f(k)$ do not decay erratically as $k \to \infty$, the discrete transform $\{\tilde f_n(k)\}_k$ may be used to bound the error reliably (H. and Jiménez Rugama, 2016; Jiménez Rugama and H., 2016; H. et al., 2017+): $|\mu - \hat\mu_n| \le \text{err}_n := C(n) \sum_{\text{certain } k} \tilde f_n(k)$.
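To illustrate low discrepancy sampling with $n$ a power of 2 without reproducing a Sobol' or lattice generator, the sketch below uses a Halton sequence, a different low discrepancy construction, as a stand-in; the integrand is a toy choice, not one from the talk:

```python
def van_der_corput(i: int, base: int) -> float:
    """Radical inverse of i in the given base: one coordinate of a Halton point."""
    x, denom = 0.0, 1.0
    while i > 0:
        i, digit = divmod(i, base)
        denom *= base
        x += digit / denom
    return x

def halton(n: int, bases=(2, 3)):
    """First n points of the Halton low discrepancy sequence in [0,1)^len(bases)."""
    return [[van_der_corput(i, b) for b in bases] for i in range(1, n + 1)]

# Equal-weight cubature with n = 2^12 points for f(x) = x1 * x2 on [0,1]^2,
# whose true integral is 1/4; the error decays much faster than IID sampling's.
pts = halton(2**12)
mu_hat = sum(x1 * x2 for x1, x2 in pts) / len(pts)
```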
  • 34. Bayesian Cubature—$f$ Is Random: $\mu = \int_{\mathbb{R}^d} f(\boldsymbol{x})\,\nu(d\boldsymbol{x}) \approx \hat\mu_n = \sum_{i=1}^n w_i f(\boldsymbol{x}_i)$. Assume $f \sim \mathcal{GP}(0, s^2 C_\theta)$ (Diaconis, 1988; O'Hagan, 1991; Ritter, 2000; Rasmussen and Ghahramani, 2003). Define
$$c_0 = \int_{\mathbb{R}^d \times \mathbb{R}^d} C_\theta(\boldsymbol{x},\boldsymbol{t})\,\nu(d\boldsymbol{x})\nu(d\boldsymbol{t}), \qquad \boldsymbol{c} = \Bigl(\int_{\mathbb{R}^d} C_\theta(\boldsymbol{x}_i,\boldsymbol{t})\,\nu(d\boldsymbol{t})\Bigr)_{i=1}^n, \qquad \mathsf{C} = \bigl(C_\theta(\boldsymbol{x}_i,\boldsymbol{x}_j)\bigr)_{i,j=1}^n.$$
Choosing $\boldsymbol{w} = (w_i)_{i=1}^n = \mathsf{C}^{-1}\boldsymbol{c}$ is optimal, and
$$\mu - \hat\mu_n = \underbrace{\frac{\mu - \hat\mu_n}{\sqrt{(c_0 - \boldsymbol{c}^T\mathsf{C}^{-1}\boldsymbol{c})\,\boldsymbol{y}^T\mathsf{C}^{-1}\boldsymbol{y}/n}}}_{\mathrm{CNF} \sim \mathcal{N}(0,1)} \times \underbrace{\sqrt{c_0 - \boldsymbol{c}^T\mathsf{C}^{-1}\boldsymbol{c}}}_{\mathrm{DSC}} \times \underbrace{\sqrt{\frac{\boldsymbol{y}^T\mathsf{C}^{-1}\boldsymbol{y}}{n}}}_{\mathrm{VAR}(f)}, \quad \text{where } \boldsymbol{y} = \bigl(f(\boldsymbol{x}_i)\bigr)_{i=1}^n.$$
  • 35. Bayesian Cubature—$f$ Is Random (continued): with $c_0$, $\boldsymbol{c}$, $\mathsf{C}$, $\boldsymbol{y}$ as above and the optimal weights $\boldsymbol{w} = \mathsf{C}^{-1}\boldsymbol{c}$,
$$\mathbb{P}\bigl[|\mu - \hat\mu_n| \le \mathrm{err}_n\bigr] = 99\% \quad \text{for} \quad \mathrm{err}_n = 2.58\sqrt{(c_0 - \boldsymbol{c}^T\mathsf{C}^{-1}\boldsymbol{c})\,\frac{\boldsymbol{y}^T\mathsf{C}^{-1}\boldsymbol{y}}{n}}.$$
But $\theta$ needs to be inferred (by MLE), and computing $\mathsf{C}^{-1}$ typically requires $O(n^3)$ operations.
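One step of this recipe can be sketched concretely in one dimension. The kernel $C(x,t) = 1 + \min(x,t)$ on $[0,1]$ with $\nu$ = Lebesgue measure is an assumption chosen because $c_0 = 4/3$ and $c_i = 1 + x_i - x_i^2/2$ have closed forms; the slide's parametrized $C_\theta$ and its MLE step are omitted here.

```python
# Minimal Bayesian cubature sketch on [0,1]: optimal weights w = C^{-1} c
# and the 99% error bound err_n from the slide, for the assumed kernel
# C(x,t) = 1 + min(x,t).
import numpy as np

def bayesian_cubature_1d(f, n):
    x = (np.arange(n) + 0.5) / n                  # evaluation nodes
    y = f(x)
    C = 1.0 + np.minimum.outer(x, x)              # C[i,j] = 1 + min(x_i, x_j)
    c = 1.0 + x - x**2 / 2                        # int_0^1 C(x_i, t) dt
    c0 = 4.0 / 3.0                                # int int C(x, t) dx dt
    w = np.linalg.solve(C, c)                     # optimal weights C^{-1} c
    mu_hat = w @ y
    yCy = y @ np.linalg.solve(C, y)
    err_n = 2.58 * np.sqrt(max(c0 - c @ w, 0.0) * yCy / n)
    return mu_hat, err_n

# Example (an assumption): f(x) = x, whose true integral is 1/2.
mu_hat, err_n = bayesian_cubature_1d(lambda x: x, n=8)
```

Even with $n = 8$ nodes the estimate lands well inside the credible interval, because the posterior mean interpolates the smooth integrand closely.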
  • 36. Gaussian Probability: $\mu = \int_{[\boldsymbol{a},\boldsymbol{b}]} \frac{\exp\bigl(-\frac12 \boldsymbol{t}^T\Sigma^{-1}\boldsymbol{t}\bigr)}{\sqrt{(2\pi)^d \det(\Sigma)}}\,d\boldsymbol{t} \overset{\text{Genz (1993)}}{=} \int_{[0,1]^{d-1}} f(\boldsymbol{x})\,d\boldsymbol{x}$. For some typical choice of $\boldsymbol{a}$, $\boldsymbol{b}$, $\Sigma$, with $d = 3$ and $\varepsilon_a = 0$: $\mu \approx 0.6763$.

    ε_r    Method             Worst 10% Accuracy   Worst 10% n   Time (s)
    1E−2   IID Monte Carlo    100%                 8.1E4         1.8E−2
           Sobol' Sampling    100%                 1.0E3         5.1E−3
           Bayesian Lattice   100%                 1.0E3         2.8E−3
    1E−3   IID Monte Carlo    100%                 2.0E6         3.8E−1
           Sobol' Sampling    100%                 2.0E3         7.7E−3
           Bayesian Lattice   100%                 1.0E3         2.8E−3
    1E−4   Sobol' Sampling    100%                 1.6E4         1.8E−2
           Bayesian Lattice   100%                 8.2E3         1.4E−2

Bayesian lattice cubature uses a covariance kernel for which $\mathsf{C}$ is circulant, so operations on $\mathsf{C}$ require only $O(n \log(n))$ operations.
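A bare-bones version of this computation can be sketched with Sobol' sampling. Everything below is an illustrative assumption: $\Sigma = I$ is used (so the truth factorizes into one-dimensional normal CDFs and the estimate can be checked), and the points are mapped through the inverse normal CDF rather than through Genz's more efficient transformation to $[0,1]^{d-1}$ used on the slide.

```python
# Sobol' estimate of a Gaussian box probability P[a <= X <= b], X ~ N(0, I).
import numpy as np
from scipy.stats import norm, qmc

def gauss_box_prob(a, b, m, seed=7):
    d = len(a)
    u = qmc.Sobol(d=d, scramble=True, seed=seed).random_base2(m)
    x = norm.ppf(u)                               # map [0,1]^d points to N(0, I)
    inside = np.all((x >= a) & (x <= b), axis=1)  # indicator of the box
    return inside.mean()

a = np.array([-1.0, -1.0, -1.0])
b = np.array([2.0, 2.0, 2.0])
p_hat = gauss_box_prob(a, b, m=14)
p_true = np.prod(norm.cdf(b) - norm.cdf(a))       # exact only because Sigma = I
```

For a general $\Sigma$, one would replace the identity map by a Cholesky factor or, better, Genz's sequential conditional transformation, which smooths the integrand and restores the fast low discrepancy convergence seen in the table.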
  • 37. Asian Option Pricing:
$$\text{fair price} = \int_{\mathbb{R}^d} e^{-rT} \max\Bigl(\frac{1}{d}\sum_{j=1}^d S_j - K,\, 0\Bigr)\, \frac{e^{-\boldsymbol{x}^T\Sigma^{-1}\boldsymbol{x}/2}}{(2\pi)^{d/2}|\Sigma|^{1/2}}\,d\boldsymbol{x} \approx \$13.12,$$
where $S_j = S_0 e^{(r - \sigma^2/2)jT/d + \sigma x_j}$ is the stock price at time $jT/d$ and $\Sigma = \bigl(\min(i,j)T/d\bigr)_{i,j=1}^d$.

    ε_a = 1E−4   Method                                Worst 10% Accuracy   Worst 10% n   Time (s)
                 Sobol' Sampling                       100%                 2.1E6         4.3
                 Sobol' Sampling w/ control variates   97%                  1.0E6         2.1

The coefficient of the control variate for low discrepancy sampling is different than for IID Monte Carlo (H. et al., 2005; H. et al., 2017+).
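The integral above can be sketched directly: sample Brownian paths at the $d$ monitoring times from Sobol' points and average the discounted payoff. The parameter values below ($S_0$, $K$, $r$, $\sigma$, $T$, $d$) are illustrative assumptions, not the ones behind the slide's $13.12 figure, and no control variate is used.

```python
# Arithmetic-mean Asian call priced by plain Sobol' sampling.
import numpy as np
from scipy.stats import norm, qmc

def asian_call_sobol(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0, d=12, m=14):
    dt = T / d
    u = qmc.Sobol(d=d, scramble=True, seed=7).random_base2(m)
    z = norm.ppf(u)                                # iid N(0,1) increments
    bm = np.cumsum(np.sqrt(dt) * z, axis=1)        # Brownian motion at j*dt
    t = dt * np.arange(1, d + 1)
    S = S0 * np.exp((r - sigma**2 / 2) * t + sigma * bm)   # stock prices
    payoff = np.maximum(S.mean(axis=1) - K, 0.0)   # arithmetic-average payoff
    return np.exp(-r * T) * payoff.mean()

price = asian_call_sobol()
```

The payoff's kink at the strike is what makes the plain estimator slow; the control variate row in the table above shows one way to recover speed, provided its coefficient is chosen for low discrepancy rather than IID sampling.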
  • 38. Sobol' Indices: $Y = \text{output}(\boldsymbol{X})$, where $\boldsymbol{X} \sim \mathcal{U}[0,1]^d$; $\text{Sobol' Index}_j(\boldsymbol{\mu})$ describes how much coordinate $j$ of the input $\boldsymbol{X}$ influences the output $Y$ (Sobol', 1990; 2001):
$$\text{Sobol' Index}_j(\boldsymbol{\mu}) := \frac{\mu_1}{\mu_2 - \mu_3^2}, \quad j = 1, \dots, d,$$
$$\mu_1 := \int_{[0,1)^{2d}} \bigl[\text{output}(\boldsymbol{x}) - \text{output}(x_j, \boldsymbol{x}'_{-j})\bigr]\,\text{output}(\boldsymbol{x}')\,d\boldsymbol{x}\,d\boldsymbol{x}', \quad \mu_2 := \int_{[0,1)^d} \text{output}(\boldsymbol{x})^2\,d\boldsymbol{x}, \quad \mu_3 := \int_{[0,1)^d} \text{output}(\boldsymbol{x})\,d\boldsymbol{x}.$$
For $\text{output}(\boldsymbol{x}) = -x_1 + x_1 x_2 - x_1 x_2 x_3 + \cdots + x_1 x_2 x_3 x_4 x_5 x_6$ (Bratley et al., 1992), with $\varepsilon_a = $ 1E−3, $\varepsilon_r = 0$:

    j                          1        2        3        4        5       6
    n                          65 536   32 768   16 384   16 384   2 048   2 048
    Sobol' Index_j             0.6529   0.1791   0.0370   0.0133   0.0015  0.0015
    estimated Sobol' Index_j   0.6528   0.1792   0.0363   0.0126   0.0010  0.0012
    Sobol' Index_j(μ̂_n)        0.6492   0.1758   0.0308   0.0083   0.0018  0.0039
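The pick-freeze idea behind these ratios of integrals can be sketched with a standard Sobol'/Saltelli-style first-order estimator. This is an illustrative stand-in for the slide's adaptive estimator, not the exact variant it uses, and the test function $g(\boldsymbol{x}) = x_1 + 2x_2$ is an assumption with known indices $S_1 = 0.2$, $S_2 = 0.8$.

```python
# First-order Sobol' indices by pick-freeze with Sobol' points: two
# independent point sets A and B, plus "frozen" sets that take coordinate j
# from B and the rest from A.
import numpy as np
from scipy.stats import qmc

def first_order_indices(g, d, m, seed=7):
    u = qmc.Sobol(d=2 * d, scramble=True, seed=seed).random_base2(m)
    A, B = u[:, :d], u[:, d:]
    yA, yB = g(A), g(B)
    var = yA.var()
    S = np.empty(d)
    for j in range(d):
        ABj = A.copy()
        ABj[:, j] = B[:, j]              # share only coordinate j with B
        # E[g(B) g(ABj)] - mu^2 = V_j, the first-order variance contribution
        S[j] = (np.mean(yB * g(ABj)) - yA.mean() * yB.mean()) / var
    return S

S = first_order_indices(lambda x: x[:, 0] + 2 * x[:, 1], d=2, m=13)
```

Because each index is a nonlinear function of several integrals, a fixed error tolerance on the index itself requires the kind of coordinated error control over the component integrals that the adaptive algorithms on this slide provide.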
  • 39. Summary:
    - The error in simulating the mean can be decomposed as a trio identity (Meng, 2017+; H., 2017+).
    - Knowing when to stop a simulation of the mean is not trivial (H. et al., 2017+).
    - The Berry–Esseen inequality can tell us when to stop an IID simulation.
    - Fourier analysis can tell us when to stop a low discrepancy simulation.
    - Bayesian cubature can tell us when to stop a simulation, if one can afford the computational cost.
    - All methods can be fooled by nasty functions $f$.
    - Relative error tolerances and problems involving functions of integrals can be handled (H. et al., 2017+).
    - Our algorithms are implemented in the Guaranteed Automatic Integration Library (GAIL) (Choi et al., 2013–2015), which is under continuous development.
  • 40. Upcoming SAMSI Quasi-Monte Carlo Program
  • 41. Thank you. Slides available at www.slideshare.net/fjhickernell/tulane-march-2017-talk
  • 42. References I
    - Bratley, P., B. L. Fox, and H. Niederreiter. 1992. Implementation and tests of low-discrepancy sequences, ACM Trans. Model. Comput. Simul. 2, 195–213.
    - Choi, S.-C. T., Y. Ding, F. J. H., L. Jiang, Ll. A. Jiménez Rugama, X. Tong, Y. Zhang, and X. Zhou. 2013–2015. GAIL: Guaranteed Automatic Integration Library (versions 1.0–2.1).
    - Cools, R. and D. Nuyens (eds.) 2016. Monte Carlo and quasi-Monte Carlo methods: MCQMC, Leuven, Belgium, April 2014, Springer Proceedings in Mathematics and Statistics, vol. 163, Springer-Verlag, Berlin.
    - Diaconis, P. 1988. Bayesian numerical analysis, Statistical decision theory and related topics IV, Papers from the 4th Purdue symp., West Lafayette, Indiana 1986, pp. 163–175.
    - Genz, A. 1993. Comparison of methods for the computation of multivariate normal probabilities, Computing Science and Statistics 25, 400–405.
    - H., F. J. 2017+. Error analysis of quasi-Monte Carlo methods. Submitted for publication, arXiv:1702.01487.
    - H., F. J., L. Jiang, Y. Liu, and A. B. Owen. 2013. Guaranteed conservative fixed width confidence intervals via Monte Carlo sampling, Monte Carlo and quasi-Monte Carlo methods 2012, pp. 105–128.
    - H., F. J. and Ll. A. Jiménez Rugama. 2016. Reliable adaptive cubature using digital sequences, Monte Carlo and quasi-Monte Carlo methods: MCQMC, Leuven, Belgium, April 2014, pp. 367–383. arXiv:1410.8615 [math.NA].
  • 43. References II
    - H., F. J., Ll. A. Jiménez Rugama, and D. Li. 2017+. Adaptive quasi-Monte Carlo methods. Submitted for publication, arXiv:1702.01491 [math.NA].
    - H., F. J., C. Lemieux, and A. B. Owen. 2005. Control variates for quasi-Monte Carlo, Statist. Sci. 20, 1–31.
    - Jiang, L. 2016. Guaranteed adaptive Monte Carlo methods for estimating means of random variables, Ph.D. Thesis.
    - Jiménez Rugama, Ll. A. and F. J. H. 2016. Adaptive multidimensional integration based on rank-1 lattices, Monte Carlo and quasi-Monte Carlo methods: MCQMC, Leuven, Belgium, April 2014, pp. 407–422. arXiv:1411.1966.
    - Meng, X. 2017+. Statistical paradises and paradoxes in big data. In preparation.
    - O'Hagan, A. 1991. Bayes–Hermite quadrature, J. Statist. Plann. Inference 29, 245–260.
    - Rasmussen, C. E. and Z. Ghahramani. 2003. Bayesian Monte Carlo, Advances in Neural Information Processing Systems, pp. 489–496.
    - Ritter, K. 2000. Average-case analysis of numerical problems, Lecture Notes in Mathematics, vol. 1733, Springer-Verlag, Berlin.
    - Sobol', I. M. 1990. On sensitivity estimation for nonlinear mathematical models, Matem. Mod. 2, no. 1, 112–118.
    - Sobol', I. M. 2001. Global sensitivity indices for nonlinear mathematical models and their Monte Carlo estimates, Math. Comput. Simul. 55, no. 1–3, 271–280.
  • 47. Maximum Likelihood Estimation of the Covariance Kernel: $f \sim \mathcal{GP}(0, s^2 C_\theta)$, $\mathsf{C}_\theta = \bigl(C_\theta(\boldsymbol{x}_i,\boldsymbol{x}_j)\bigr)_{i,j=1}^n$, $\boldsymbol{y} = \bigl(f(\boldsymbol{x}_i)\bigr)_{i=1}^n$, $\hat\mu_n = \boldsymbol{c}_{\hat\theta}^T \mathsf{C}_{\hat\theta}^{-1} \boldsymbol{y}$, and
$$\mathbb{P}\bigl[|\mu - \hat\mu_n| \le \mathrm{err}_n\bigr] = 99\% \quad \text{for} \quad \mathrm{err}_n = \frac{2.58}{\sqrt{n}}\sqrt{c_{0,\hat\theta} - \boldsymbol{c}_{\hat\theta}^T \mathsf{C}_{\hat\theta}^{-1} \boldsymbol{c}_{\hat\theta}}\,\sqrt{\boldsymbol{y}^T \mathsf{C}_{\hat\theta}^{-1} \boldsymbol{y}}.$$
There is a de-randomized interpretation of Bayesian cubature (H., 2017+): $f$ lies in a Hilbert space with reproducing kernel $C_\theta$ and with best interpolant $\tilde{f}_{\boldsymbol{y}}$, and
$$\hat\theta = \operatorname*{argmin}_\theta \frac{\boldsymbol{y}^T \mathsf{C}_\theta^{-1} \boldsymbol{y}}{\bigl[\det(\mathsf{C}_\theta^{-1})\bigr]^{1/n}} = \operatorname*{argmin}_\theta \operatorname{vol}\bigl\{ \boldsymbol{z} \in \mathbb{R}^n : \|\tilde{f}_{\boldsymbol{z}}^\theta\| \le \|\tilde{f}_{\boldsymbol{y}}^\theta\| \bigr\},$$
$$|\mu - \hat\mu_n| \le \frac{2.58}{\sqrt{n}} \underbrace{\sqrt{c_{0,\hat\theta} - \boldsymbol{c}_{\hat\theta}^T \mathsf{C}_{\hat\theta}^{-1} \boldsymbol{c}_{\hat\theta}}}_{\text{error representer for } \hat\theta} \times \underbrace{\sqrt{\boldsymbol{y}^T \mathsf{C}_{\hat\theta}^{-1} \boldsymbol{y}}}_{\|\tilde{f}_{\boldsymbol{y}}^{\hat\theta}\|} \quad \text{if } \|f - \tilde{f}_{\boldsymbol{y}}^{\hat\theta}\| \le \frac{2.58\,\|\tilde{f}^{\hat\theta}\|}{\sqrt{n}}.$$
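The MLE criterion above, with the scale $s^2$ profiled out, can be sketched by a crude grid search. The Gaussian kernel, the grid of candidate lengthscales, and the small jitter for numerical stability are all assumptions for illustration; the criterion is minimized in log form, $\log(\boldsymbol{y}^T \mathsf{C}_\theta^{-1}\boldsymbol{y}) + \frac{1}{n}\log\det \mathsf{C}_\theta$, which has the same minimizer.

```python
# Grid-search MLE of a kernel lengthscale theta, minimizing the (log of the)
# profiled criterion  y^T C_theta^{-1} y * det(C_theta)^{1/n}.
import numpy as np

def mle_lengthscale(x, y, thetas, jitter=1e-9):
    n = len(x)
    best_obj, best_theta = np.inf, None
    for theta in thetas:
        # Assumed Gaussian kernel; jitter keeps the Gram matrix well-posed.
        C = np.exp(-((x[:, None] - x[None, :]) / theta) ** 2) + jitter * np.eye(n)
        _, logdet = np.linalg.slogdet(C)
        obj = np.log(y @ np.linalg.solve(C, y)) + logdet / n
        if obj < best_obj:
            best_obj, best_theta = obj, theta
    return best_theta

x = (np.arange(8) + 0.5) / 8
y = np.sin(2 * np.pi * x)
theta_hat = mle_lengthscale(x, y, thetas=np.linspace(0.1, 1.0, 10))
```

In practice one would optimize $\theta$ continuously and, as on the Gaussian probability slide, pick node sets and kernels that make $\mathsf{C}_\theta$ structured (e.g. circulant) so that the repeated solves cost $O(n \log n)$ rather than $O(n^3)$.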