Error Analysis for Quasi-Monte Carlo Methods
Fred J. Hickernell
Department of Applied Mathematics, Illinois Institute of Technology
hickernell@iit.edu mypages.iit.edu/~hickernell
Thanks to the Guaranteed Automatic Integration Library (GAIL) team and friends
Supported by NSF-DMS-1522687
Thanks to SAMSI and the QMC Program organizers
SAMSI-QMC Opening Workshop, August 28, 2017
Integration Examples

$$\mu = \int_{\mathbb{R}^d} f(\boldsymbol{x}) \, \nu(\mathrm{d}\boldsymbol{x}) = \text{option price}, \qquad f(\boldsymbol{x}) = \text{discounted payoff determined by market forces } \boldsymbol{x}$$

$$\mu = \int_{\mathcal{X}} g(\boldsymbol{x}) \, \mathrm{d}\boldsymbol{x} = P(\boldsymbol{X} \in \mathcal{X}) = \text{probability}, \qquad g = \text{probability density function}$$

$$\frac{\mu_j}{\mu_0} = \frac{\int_{\mathcal{X}} b_j \, L(\boldsymbol{b} \mid \text{data}) \, \varrho(\boldsymbol{b}) \, \mathrm{d}\boldsymbol{b}}{\int_{\mathcal{X}} L(\boldsymbol{b} \mid \text{data}) \, \varrho(\boldsymbol{b}) \, \mathrm{d}\boldsymbol{b}} = \text{Bayesian posterior expectation of } \beta_j$$

$$\mu = \mathbb{E}[f(\boldsymbol{X})] = \int_{\mathcal{X}} f(\boldsymbol{x}) \, \nu(\mathrm{d}\boldsymbol{x}) \approx \hat{\mu} = \sum_{i=1}^{n} f(\boldsymbol{x}_i) w_i = \int_{\mathcal{X}} f(\boldsymbol{x}) \, \hat{\nu}(\mathrm{d}\boldsymbol{x}), \qquad \mu - \hat{\mu} = \,?$$
Error Analysis for Quasi-Monte Carlo Methods

$$\mu := \mathbb{E}[f(\boldsymbol{X})] := \int_{\mathcal{X}} f(\boldsymbol{x}) \, \nu(\mathrm{d}\boldsymbol{x}) \approx \hat{\mu} := \sum_{i=1}^{n} f(\boldsymbol{x}_i) w_i = \int_{\mathcal{X}} f(\boldsymbol{x}) \, \hat{\nu}(\mathrm{d}\boldsymbol{x})$$

$$\mu - \hat{\mu} = \int_{\mathcal{X}} f(\boldsymbol{x}) \, (\nu - \hat{\nu})(\mathrm{d}\boldsymbol{x}) = \,?$$

Reality: what is.
Error analysis (theory): what we think it should be, by rigorous argument; we are not certain that our assumptions hold.

Quasi-Monte Carlo methods: $\boldsymbol{x}_1, \boldsymbol{x}_2, \ldots$ chosen more carefully than IID, dependent.
Error Decomposition in Reproducing Kernel Hilbert Spaces

Let the integrand lie in a Hilbert space, $\mathcal{H}$, with reproducing kernel $K : \mathcal{X} \times \mathcal{X} \to \mathbb{R}$:
$$K(\cdot, \boldsymbol{t}) \in \mathcal{H}, \qquad f(\boldsymbol{t}) = \langle K(\cdot, \boldsymbol{t}), f \rangle \quad \forall \boldsymbol{t} \in \mathcal{X}, \; f \in \mathcal{H}.$$

The cubature error is given by the inner product with the representer, $\eta_{\mathrm{err}}$:
$$\mu - \hat{\mu} = \int_{\mathcal{X}} f(\boldsymbol{x}) \, (\nu - \hat{\nu})(\mathrm{d}\boldsymbol{x}) = \langle \eta_{\mathrm{err}}, f \rangle, \quad \text{where } \eta_{\mathrm{err}}(\boldsymbol{t}) = \langle K(\cdot, \boldsymbol{t}), \eta_{\mathrm{err}} \rangle = \int_{\mathcal{X}} K(\boldsymbol{x}, \boldsymbol{t}) \, (\nu - \hat{\nu})(\mathrm{d}\boldsymbol{x}).$$

The Cauchy–Schwarz inequality says that (H., 2000)
$$\mu - \hat{\mu} = \langle \eta_{\mathrm{err}}, f \rangle = \cos(f, \eta_{\mathrm{err}}) \, \mathrm{DSC}(\nu - \hat{\nu}) \, \|f\|_{\mathcal{H}}, \qquad \text{discrepancy } \mathrm{DSC}(\nu - \hat{\nu}) := \|\eta_{\mathrm{err}}\|_{\mathcal{H}},$$
$$\mathrm{DSC}^2 = \langle \eta_{\mathrm{err}}, \eta_{\mathrm{err}} \rangle = \int_{\mathcal{X} \times \mathcal{X}} K(\boldsymbol{x}, \boldsymbol{t}) \, (\nu - \hat{\nu})(\mathrm{d}\boldsymbol{x}) (\nu - \hat{\nu})(\mathrm{d}\boldsymbol{t})$$
$$= \int_{\mathcal{X}^2} K(\boldsymbol{x}, \boldsymbol{t}) \, \nu(\mathrm{d}\boldsymbol{x}) \nu(\mathrm{d}\boldsymbol{t}) - 2 \sum_{i=1}^{n} w_i \int_{\mathcal{X}} K(\boldsymbol{x}_i, \boldsymbol{t}) \, \nu(\mathrm{d}\boldsymbol{t}) + \sum_{i,j=1}^{n} w_i w_j K(\boldsymbol{x}_i, \boldsymbol{x}_j)$$
$$= k_0 - 2 \boldsymbol{k}^T \boldsymbol{w} + \boldsymbol{w}^T \mathsf{K} \boldsymbol{w}; \qquad \boldsymbol{w} = \mathsf{K}^{-1} \boldsymbol{k} \text{ is optimal but expensive to compute.}$$
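The quadratic form $k_0 - 2\boldsymbol{k}^T\boldsymbol{w} + \boldsymbol{w}^T\mathsf{K}\boldsymbol{w}$ and the optimal weights $\boldsymbol{w} = \mathsf{K}^{-1}\boldsymbol{k}$ can be computed directly once $k_0$, $\boldsymbol{k}$, and $\mathsf{K}$ are available. Here is a minimal Python sketch (mine, not from the talk), using the kernel $K(\boldsymbol{x},\boldsymbol{t}) = \prod_k [2 - \max(x_k, t_k)]$ of the next slide, for which $k_0 = (4/3)^d$ and $k_i = \prod_k (3 - x_{ik}^2)/2$ in closed form:

```python
import numpy as np

def kernel(x, t):
    """K(x,t) = prod_k [2 - max(x_k, t_k)] for x, t in [0,1]^d."""
    return np.prod(2.0 - np.maximum(x, t), axis=-1)

def squared_discrepancy(x, w=None):
    """DSC^2 = k0 - 2 k^T w + w^T K w for uniform nu on [0,1]^d.

    x : (n, d) array of nodes; w : (n,) weights (default 1/n each).
    """
    n, d = x.shape
    if w is None:
        w = np.full(n, 1.0 / n)
    k0 = (4.0 / 3.0) ** d                          # double nu-integral of K
    k = np.prod((3.0 - x**2) / 2.0, axis=1)        # k_i = int K(x_i, t) nu(dt)
    K = kernel(x[:, None, :], x[None, :, :])       # n x n Gram matrix
    return k0 - 2 * k @ w + w @ K @ w

def optimal_weights(x):
    """w = K^{-1} k minimizes DSC^2; O(n^3) and possibly ill-conditioned."""
    k = np.prod((3.0 - x**2) / 2.0, axis=1)
    K = kernel(x[:, None, :], x[None, :, :])
    return np.linalg.solve(K, k)

rng = np.random.default_rng(7)
x = rng.random((128, 3))                           # IID sample, d = 3
print(squared_discrepancy(x))                      # equal weights
print(squared_discrepancy(x, optimal_weights(x)))  # smaller DSC^2
```

Note the $O(n^3)$ solve for the optimal weights; equal weights $w_i = 1/n$ avoid it at the cost of a larger discrepancy.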
$L_2$-Discrepancy

$\nu =$ uniform on $\mathcal{X} = [0,1]^d$, $\hat{\nu} =$ uniform on $\{\boldsymbol{x}_i\}_{i=1}^{n}$, $K(\boldsymbol{x}, \boldsymbol{t}) = \prod_{k=1}^{d} [2 - \max(x_k, t_k)]$ (H., 1998)

$$\|f\|_{\mathcal{H}} = \left[ \sum_{\mathfrak{u} \subseteq 1{:}d} \|\partial^{\mathfrak{u}} f\|_{L_2}^2 \right]^{1/2}, \qquad \partial^{\mathfrak{u}} f := \frac{\partial^{|\mathfrak{u}|} f}{\partial \boldsymbol{x}_{\mathfrak{u}}} \bigg|_{\boldsymbol{x}_{s_{\mathfrak{u}}} = \boldsymbol{1}}, \qquad 1{:}d := \{1, \ldots, d\}, \quad s_{\mathfrak{u}} := 1{:}d \setminus \mathfrak{u}$$

$$\mathrm{DSC}^2 = \left(\tfrac{4}{3}\right)^{d} - \frac{2}{n} \sum_{i=1}^{n} \prod_{k=1}^{d} \frac{3 - x_{ik}^2}{2} + \frac{1}{n^2} \sum_{i,j=1}^{n} \prod_{k=1}^{d} [2 - \max(x_{ik}, x_{jk})] = \sum_{\emptyset \ne \mathfrak{u} \subseteq 1{:}d} \big\| \nu([\boldsymbol{0}, \cdot_{\mathfrak{u}}]) - \hat{\nu}([\boldsymbol{0}, \cdot_{\mathfrak{u}}]) \big\|_{L_2}^2 \quad \text{(geometric interpretation)}$$

E.g., for $\boldsymbol{x} = (\ldots, 0.6, \ldots, 0.4, \ldots)$ with coordinates 5 and 8 shown, $\nu([\boldsymbol{0}, \boldsymbol{x}_{\{5,8\}}]) - \hat{\nu}([\boldsymbol{0}, \boldsymbol{x}_{\{5,8\}}]) = 0.24 - 7/32 = 0.02125$.

$$\mu - \hat{\mu} = \cos(f, \eta_{\mathrm{err}}) \, \mathrm{DSC}(\nu - \hat{\nu}) \, \|f - f(\boldsymbol{1})\|_{\mathcal{H}}$$

$\mathrm{DSC}(\nu - \hat{\nu})$ requires $O(dn^2)$ operations to evaluate, and is
$O(n^{-1/2})$ on average for IID Monte Carlo,
$O(n^{-1+\epsilon})$ for digital nets, integration lattices, ... (Niederreiter, 1992; Dick and Pillichshammer, 2010),
but cannot decay faster than $O(n^{-1})$ for any $\hat{\nu}$ because this $K$ has limited smoothness.
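To make the closed form above concrete, here is a minimal Python sketch (mine, not from the talk) that evaluates the $O(dn^2)$ formula and contrasts the decay for IID points versus scrambled Sobol' points; it assumes SciPy's qmc module (SciPy 1.7 or later).

```python
import numpy as np
from scipy.stats import qmc  # SciPy >= 1.7

def l2_discrepancy(x):
    """Closed-form L2-discrepancy for uniform nu on [0,1]^d, weights 1/n:
    DSC^2 = (4/3)^d - (2/n) sum_i prod_k (3 - x_ik^2)/2
            + (1/n^2) sum_{i,j} prod_k [2 - max(x_ik, x_jk)]."""
    n, d = x.shape
    gram = np.prod(2.0 - np.maximum(x[:, None, :], x[None, :, :]), axis=2)
    return np.sqrt((4.0 / 3.0) ** d
                   - 2.0 / n * np.prod((3.0 - x**2) / 2.0, axis=1).sum()
                   + gram.sum() / n**2)

d, rng = 3, np.random.default_rng(0)
for m in range(6, 12):                          # n = 64, ..., 2048
    n = 2**m
    x_iid = rng.random((n, d))                  # DSC ~ n^{-1/2} on average
    x_sob = qmc.Sobol(d, scramble=True, seed=0).random_base2(m)  # ~ n^{-1+eps}
    print(n, l2_discrepancy(x_iid), l2_discrepancy(x_sob))
```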
Multivariate Normal Probability

$$\mu = \int_{[\boldsymbol{a}, \boldsymbol{b}]} \frac{\exp\left(-\tfrac{1}{2} \boldsymbol{t}^T \Sigma^{-1} \boldsymbol{t}\right)}{\sqrt{(2\pi)^d \det(\Sigma)}} \, \mathrm{d}\boldsymbol{t} \overset{\text{Genz (1993)}}{=} \int_{[0,1]^{d-1}} f(\boldsymbol{x}) \, \mathrm{d}\boldsymbol{x}, \qquad \hat{\nu} = \text{uniform on } \{\boldsymbol{x}_i\}_{i=1}^{n}$$

$$\mu - \hat{\mu} = \cos(f, \eta_{\mathrm{err}}) \, \mathrm{DSC}(\nu - \hat{\nu}) \, \|f\|_{\mathcal{H}}$$

$$\mathrm{DSC}(\nu - \hat{\nu}) = \begin{cases} O(n^{-1/2}) & \text{for IID MC} \\ O(n^{-1+\epsilon}) & \text{for Sobol'} \\ O(n^{-3/2+\epsilon}) & \text{for scrambled Sobol' w/ smoother kernel (Owen, 1997)} \end{cases}$$
(H. and Yue, 2000; Heinrich et al., 2004)

For some typical choice of $\boldsymbol{a}$, $\boldsymbol{b}$, $\Sigma$, and $d = 3$: $\mu \approx 0.6763$.

Low discrepancy sampling reduces cubature error; randomization can help more.
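The Genz (1993) transformation behind the equality above can be sketched in a few lines: condition the box probability sequentially on the Cholesky factor of $\Sigma$, so that each QMC point in $[0,1]^{d-1}$ yields one evaluation of $f$. The following Python sketch is my reconstruction of that transformation; the particular $\boldsymbol{a}$, $\boldsymbol{b}$, and $\Sigma$ are illustrative, not the talk's "typical choice".

```python
import numpy as np
from scipy.stats import norm, qmc

def genz_integrand(x, a, b, L):
    """Genz (1993): map MVN prob over [a,b] to an integral over [0,1]^{d-1}.

    x : (n, d-1) points in [0,1]^{d-1};  L : lower Cholesky factor of Sigma.
    Returns f(x), where mu = int_{[0,1]^{d-1}} f(x) dx.
    """
    n, d = x.shape[0], L.shape[0]
    y = np.zeros((n, d - 1))
    lo = norm.cdf(a[0] / L[0, 0]) * np.ones(n)
    hi = norm.cdf(b[0] / L[0, 0]) * np.ones(n)
    f = hi - lo
    for i in range(1, d):
        # invert the conditional CDF at a point placed inside (lo, hi)
        y[:, i - 1] = norm.ppf(lo + x[:, i - 1] * (hi - lo))
        shift = y[:, :i] @ L[i, :i]
        lo = norm.cdf((a[i] - shift) / L[i, i])
        hi = norm.cdf((b[i] - shift) / L[i, i])
        f *= hi - lo
    return f

d = 3
Sigma = np.array([[1.0, 0.5, 0.5], [0.5, 1.0, 0.5], [0.5, 0.5, 1.0]])  # illustrative
a, b = np.full(d, -2.0), np.full(d, 2.0)                                # illustrative
L = np.linalg.cholesky(Sigma)
x = qmc.Sobol(d - 1, scramble=True, seed=11).random_base2(12)           # n = 4096
print(genz_integrand(x, a, b, L).mean())   # QMC estimate of mu
```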
Mean Square Discrepancy for Randomized Low Discrepancy Sets

Popular low discrepancy sets on $[0,1)^d$ (digital nets and lattice node sets) can be randomized while preserving their low discrepancy properties. Let $\psi : [0,1)^d \to [0,1)^d$ be a random mapping with
$$\psi(\boldsymbol{x}) \sim \mathcal{U}[0,1)^d, \qquad K_f(\boldsymbol{x}, \boldsymbol{t}) := \mathbb{E}[K(\psi(\boldsymbol{x}), \psi(\boldsymbol{t}))], \qquad \mathsf{K}_f \boldsymbol{1} = (\boldsymbol{1}^T \boldsymbol{K}_{f,1}) \boldsymbol{1}, \qquad k_0 = 1,$$
where the vector $\boldsymbol{K}_{f,1}$ is the first column of the matrix $\mathsf{K}_f$, the analog of $\mathsf{K}$ with the original sequence but for the filtered kernel, $K_f$. Taking equal weights $\boldsymbol{w} = \boldsymbol{1}/n$,
$$\mathbb{E}\{[\mathrm{DSC}(\nu - \hat{\nu}; K)]^2\} = \mathbb{E}[k_0 - 2 \boldsymbol{k}^T \boldsymbol{w} + \boldsymbol{w}^T \mathsf{K} \boldsymbol{w}] = -1 + \frac{\boldsymbol{1}^T \boldsymbol{K}_{f,1}}{n} = [\mathrm{DSC}(\nu - \hat{\nu}; K_f)]^2.$$

$\mathrm{DSC}(\nu - \hat{\nu}; K_f)$ only requires $O(n)$ operations to compute.

This suggests using Hilbert spaces with reproducing kernels of the form
$$K_f(\boldsymbol{x}, \boldsymbol{t}) = \sum_{\boldsymbol{l}} \lambda_{\boldsymbol{l}} \, \phi_{\boldsymbol{l}}(\boldsymbol{x}) \overline{\phi_{\boldsymbol{l}}(\boldsymbol{t})}, \qquad \phi_{\boldsymbol{l}} = \text{Walsh functions for digital nets}, \qquad \phi_{\boldsymbol{l}}(\boldsymbol{x}) = e^{2\pi \sqrt{-1}\, \boldsymbol{l}^T \boldsymbol{x}} \text{ for lattice node sets},$$
with $\lambda_{\boldsymbol{l}}$ chosen carefully so that the series sums in closed form and $K_f$ has the desired smoothness properties.
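For a rank-1 lattice the constant-row-sum property above turns the mean square discrepancy into an $O(dn)$ computation. Here is a minimal sketch, assuming the common closed-form shift-invariant kernel built from the Bernoulli polynomial $B_2(u) = u^2 - u + 1/6$ (for which $k_0 = 1$); the generating vector below is illustrative, not an optimized one.

```python
import numpy as np

def shift_invariant_kernel(delta, gamma2):
    """K_f(x,t) = prod_k [1 + gamma_k^2 * 2*pi^2 * B2({x_k - t_k})],
    with B2(u) = u^2 - u + 1/6; a common closed-form filtered kernel."""
    b2 = delta**2 - delta + 1.0 / 6.0
    return np.prod(1.0 + gamma2 * 2.0 * np.pi**2 * b2, axis=-1)

def lattice_msq_discrepancy(z, n, gamma2):
    """DSC^2 = -1 + (1/n) sum_i K_f(x_i, x_1) for a rank-1 lattice,
    using the constant-row-sum property; O(dn) work."""
    i = np.arange(n)[:, None]
    x = (i * z[None, :] / n) % 1.0            # lattice nodes x_i = {(i-1)z/n}
    return -1.0 + shift_invariant_kernel(x, gamma2).mean()   # x_1 = 0

d, n = 4, 2**10
z = np.array([1, 433, 229, 365])              # illustrative generating vector
gamma2 = 1.0 / np.arange(1, d + 1)**3         # decaying weights gamma_k^2 = k^{-3}
print(np.sqrt(lattice_msq_discrepancy(z, n, gamma2)))
```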
The Effect of Dimension on the Decay of the Discrepancy

What happens when the dimension, $d$, is large but the sample size, $n$, is modest?

$L_2$-discrepancy and variation:
$$\mathrm{DSC}^2 = \left(\tfrac{4}{3}\right)^{d} - \frac{2}{n} \sum_{i=1}^{n} \prod_{k=1}^{d} \frac{3 - x_{ik}^2}{2} + \frac{1}{n^2} \sum_{i,j=1}^{n} \prod_{k=1}^{d} [2 - \max(x_{ik}, x_{jk})], \qquad \|f\|_{\mathcal{H}} = \left[ \sum_{\mathfrak{u} \subseteq 1{:}d} \left\| \frac{\partial^{|\mathfrak{u}|} f}{\partial \boldsymbol{x}_{\mathfrak{u}}} \bigg|_{\boldsymbol{x}_{s_{\mathfrak{u}}} = \boldsymbol{1}} \right\|_{L_2}^2 \right]^{1/2}$$

For scrambled Sobol' points, the onset of $O(n^{-1+\epsilon})$ convergence of DSC is $d$ dependent. [plot omitted]
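To see the $d$ dependence numerically, one can tabulate the closed-form discrepancy of scrambled Sobol' points for a few dimensions; a small self-contained Python sketch (dimensions and sample sizes are my illustrative choices):

```python
import numpy as np
from scipy.stats import qmc

def l2_disc(x):
    """Closed-form L2-discrepancy, uniform nu on [0,1]^d, equal weights 1/n."""
    n, d = x.shape
    gram = np.prod(2.0 - np.maximum(x[:, None, :], x[None, :, :]), axis=2)
    return np.sqrt((4.0 / 3.0) ** d
                   - 2.0 / n * np.prod((3.0 - x**2) / 2.0, axis=1).sum()
                   + gram.sum() / n**2)

for d in (2, 8, 32):
    discs = [l2_disc(qmc.Sobol(d, scramble=True, seed=1).random_base2(m))
             for m in (6, 8, 10)]             # n = 64, 256, 1024
    print(d, np.round(discs, 4))              # decay sets in later for larger d
```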
Pricing an Asian Option

$$\mu = \int_{\mathbb{R}^d} \text{payoff}(\boldsymbol{t}) \, \frac{\exp\left(-\tfrac{1}{2} \boldsymbol{t}^T \Sigma^{-1} \boldsymbol{t}\right)}{\sqrt{(2\pi)^d \det(\Sigma)}} \, \mathrm{d}\boldsymbol{t} = \int_{[0,1]^d} f(\boldsymbol{x}) \, \mathrm{d}\boldsymbol{x}, \qquad \mu - \hat{\mu} = \cos(f, \eta_{\mathrm{err}}) \, \mathrm{DSC}(\nu - \hat{\nu}) \, \|f\|_{\mathcal{H}}$$

Asian arithmetic mean option, $d = 12$, $\mu \approx \$13.1220$.

The error converges like $O(n^{-1+\epsilon})$ for (scrambled) Sobol' even though the discrepancy does not. This phenomenon was first seen by Paskov and Traub (1995) for a Collateralized Mortgage Obligation with $d = 360$. Scrambled Sobol' does not achieve $O(n^{-3/2+\epsilon})$ convergence because $f$ is not smooth enough; indeed, $f$ is not even smooth enough for $O(n^{-1+\epsilon})$ convergence, except by a delicate argument (Griebel et al., 2010; 2016+). A sketch of the pricing computation follows.
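Here is a minimal Python sketch of pricing the arithmetic-mean Asian call with scrambled Sobol' points, with the Brownian path generated from a PCA factorization of $\Sigma = (\min(i,j)\,T/d)$; the market parameters ($S_0$, strike, $r$, $\sigma$, $T$) are my illustrative choices, not the talk's.

```python
import numpy as np
from scipy.stats import norm, qmc

d, T = 12, 1.0                                   # monitoring dates; illustrative T
S0, strike, r, sigma = 100.0, 100.0, 0.05, 0.5   # illustrative parameters

# Covariance of Brownian motion at times jT/d and a PCA factor A with A A^T = Sigma
tgrid = np.arange(1, d + 1) * T / d
Sigma = np.minimum.outer(tgrid, tgrid)
lam, V = np.linalg.eigh(Sigma)
A = V[:, ::-1] * np.sqrt(lam[::-1])              # principal components first

def discounted_payoff(u):
    """Map u in [0,1]^d to the discounted Asian arithmetic-mean call payoff."""
    z = norm.ppf(u)                              # standard normals
    x = z @ A.T                                  # Brownian path, Cov = Sigma
    S = S0 * np.exp((r - sigma**2 / 2) * tgrid + sigma * x)
    return np.exp(-r * T) * np.maximum(S.mean(axis=1) - strike, 0.0)

u = qmc.Sobol(d, scramble=True, seed=3).random_base2(14)   # n = 16384
print(discounted_payoff(u).mean())               # QMC price estimate
```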
Overcoming the Curse of Dimensionality with Weights

Sloan and Woźniakowski (1998) show how inserting decaying coordinate weights yields $O(n^{-1+\epsilon})$ convergence independent of dimension. Caflisch et al. (1997) explain how quasi-Monte Carlo methods work well for low effective dimension. Novak and Woźniakowski (2010) cover tractability comprehensively.

$$K(\boldsymbol{x}, \boldsymbol{t}) = \prod_{k=1}^{d} [1 + \gamma_k^2 (1 - \max(x_k, t_k))]$$

$$\mathrm{DSC}^2 = \prod_{k=1}^{d} \left(1 + \frac{\gamma_k^2}{3}\right) - \frac{2}{n} \sum_{i=1}^{n} \prod_{k=1}^{d} \left[1 + \frac{\gamma_k^2 (1 - x_{ik}^2)}{2}\right] + \frac{1}{n^2} \sum_{i,j=1}^{n} \prod_{k=1}^{d} [1 + \gamma_k^2 (1 - \max(x_{ik}, x_{jk}))]$$

$$\|f\|_{\mathcal{H}} = \left[ \sum_{\mathfrak{u} \subseteq 1{:}d} \left( \frac{1}{\gamma_{\mathfrak{u}}} \left\| \frac{\partial^{|\mathfrak{u}|} f}{\partial \boldsymbol{x}_{\mathfrak{u}}} \bigg|_{\boldsymbol{x}_{s_{\mathfrak{u}}} = \boldsymbol{1}} \right\|_{L_2} \right)^2 \right]^{1/2}, \qquad \gamma_{\mathfrak{u}} = \prod_{k \in \mathfrak{u}} \gamma_k$$

For scrambled Sobol' points, $\gamma_k^2 = k^{-3}$. [plot omitted]
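A minimal Python sketch evaluating the weighted discrepancy above for scrambled Sobol' points, comparing the unweighted case ($\gamma_k^2 = 1$, which recovers the earlier kernel $\prod_k [2 - \max(x_k, t_k)]$) with the decaying weights $\gamma_k^2 = k^{-3}$; the dimension and sample size are illustrative.

```python
import numpy as np
from scipy.stats import qmc

def weighted_discrepancy(x, gamma2):
    """DSC for K(x,t) = prod_k [1 + gamma_k^2 (1 - max(x_k, t_k))],
    uniform nu on [0,1]^d and equal weights 1/n."""
    n, d = x.shape
    term1 = np.prod(1.0 + gamma2 / 3.0)
    term2 = 2.0 / n * np.prod(1.0 + gamma2 * (1.0 - x**2) / 2.0, axis=1).sum()
    gram = np.prod(1.0 + gamma2 * (1.0 - np.maximum(x[:, None, :], x[None, :, :])),
                   axis=2)
    return np.sqrt(term1 - term2 + gram.sum() / n**2)

d = 32
x = qmc.Sobol(d, scramble=True, seed=5).random_base2(10)        # n = 1024
print(weighted_discrepancy(x, np.ones(d)))                      # gamma_k^2 = 1
print(weighted_discrepancy(x, 1.0 / np.arange(1, d + 1)**3.0))  # gamma_k^2 = k^{-3}
```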
Comments So Far

$$\mu := \mathbb{E}[f(\boldsymbol{X})] := \int_{\mathcal{X}} f(\boldsymbol{x}) \, \nu(\mathrm{d}\boldsymbol{x}) \approx \hat{\mu} := \sum_{i=1}^{n} f(\boldsymbol{x}_i) w_i = \int_{\mathcal{X}} f(\boldsymbol{x}) \, \hat{\nu}(\mathrm{d}\boldsymbol{x}), \qquad \mu - \hat{\mu} = \cos(f, \eta_{\mathrm{err}}) \, \mathrm{DSC}(\nu - \hat{\nu}) \, \|f\|_{\mathcal{H}}$$

The cubature error is represented as a trio identity (Meng, 2017+; H., 2017+); without the $\cos(\cdot)$ term it is a Koksma-Hlawka inequality. There are other versions than the deterministic one shown here. RKHS is an oft-used tool, not only for cubature error (Fasshauer and McCourt, 2015), and the analysis may be extended to Banach spaces (H., 1998; 2000). Error analysis can also be attacked directly through Fourier Walsh or complex exponential series. If the kernel has special form, then $\mathrm{DSC}(\nu - \hat{\nu})$ and a lower bound on $\|f\|_{\mathcal{H}}$ can be computed in $O(n \log n)$ operations; otherwise they are expensive. For this error decomposition the adaptive choice of the position of the next data site, $\boldsymbol{x}_{n+1}$, does not depend on $f$.

The discrepancy measures the sample quality and is independent of the integrand. Its decay rate may be improved by samples chosen more carefully than IID, and clever randomization of low discrepancy points can reduce the discrepancy even further. Its definition depends on the reproducing kernel (and the corresponding Hilbert space of integrands). Working with filtered kernels may expedite computation of the discrepancy and of the optimal weights; what is the appropriate filtered kernel for higher order nets? Non-uniform low discrepancy points are often transformations of uniform low discrepancy points. There are other quality measures for points than the discrepancy, e.g., the covering radius, the minimum distance between two points, and the power function in RKHS approximation. The strategy for choosing points that optimize the quality measure depends on whether you want to be asymptotically optimal or pre-asymptotically optimal; see WG IV, Representative Points for Small-data and Big-data Problems, and WG V, Sampling and Analysis in High Dimensions When Samples Are Expensive. The size of the discrepancy may be affected by the dimension unless the space of integrands is defined to de-emphasize higher dimensions; for how to choose the weights and sampling optimally for your problem, see WG AO, Tuning QMC to the Specific Integrand of Interest.

The term $\|f\|_{\mathcal{H}}$ is often called the variation. It can sometimes be made smaller by using a change of measure (variable transformation) to re-write the integral with a different integrand.

The term $\cos(f, \eta_{\mathrm{err}})$ is called the confounding. It cannot exceed one in magnitude, and it could be quite small if your integrand is atypically nice.
Bayesian Cubature

A random $f$ was postulated by Diaconis (1988), O'Hagan (1991), Ritter (2000), Rasmussen and Ghahramani (2003), and others: $f \sim \mathcal{GP}(0, s^2 C_{\boldsymbol{\theta}})$, a Gaussian process from the sample space $\mathcal{F}$ with zero mean and covariance kernel $s^2 C_{\boldsymbol{\theta}}$, $C_{\boldsymbol{\theta}} : \mathcal{X} \times \mathcal{X} \to \mathbb{R}$. Then
$$c_0 = \int_{\mathcal{X}^2} C_{\boldsymbol{\theta}}(\boldsymbol{x}, \boldsymbol{t}) \, \nu(\mathrm{d}\boldsymbol{x}) \nu(\mathrm{d}\boldsymbol{t}), \qquad \boldsymbol{c} = \left( \int_{\mathcal{X}} C_{\boldsymbol{\theta}}(\boldsymbol{x}_i, \boldsymbol{t}) \, \nu(\mathrm{d}\boldsymbol{t}) \right)_{i=1}^{n}, \qquad \mathsf{C} = \left( C_{\boldsymbol{\theta}}(\boldsymbol{x}_i, \boldsymbol{x}_j) \right)_{i,j=1}^{n},$$
$$\boldsymbol{w} = (w_i)_{i=1}^{n} = \mathsf{C}^{-1} \boldsymbol{c} \text{ is optimal}, \qquad \mu - \hat{\mu} = \mathcal{N}(0,1) \times \sqrt{c_0 - \boldsymbol{c}^T \mathsf{C}^{-1} \boldsymbol{c}} \times s.$$

The scale parameter, $s$, and shape parameter, $\boldsymbol{\theta}$, should be estimated, e.g., by maximum likelihood estimation.
$\Pr\left[ |\mu - \hat{\mu}| \le 2.58 \sqrt{c_0 - \boldsymbol{c}^T \mathsf{C}^{-1} \boldsymbol{c}} \times s \right] = 99\%$ allows probabilistic error bounds.
Requires $O(n^3)$ operations to compute $\mathsf{C}_{\boldsymbol{\theta}}^{-1}$, but see Anitescu et al. (2016).
Ill-conditioning for smoother kernels (faster convergence).
$\mathrm{DSC}(\nu - \hat{\nu}; K) = \sqrt{c_0 - \boldsymbol{c}^T \mathsf{C}^{-1} \boldsymbol{c}}$ for optimal weights and $K = C_{\boldsymbol{\theta}}$.
See WG II, Probabilistic Numerics. A sketch follows below.
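Here is a minimal Python sketch of Bayesian cubature on $[0,1]^d$ with the product Matérn kernel used on the next slide. The shape parameter $\theta$ is fixed rather than estimated by maximum likelihood, the one-dimensional integrals for $\boldsymbol{c}$ and $c_0$ are done by midpoint quadrature for brevity (closed forms exist), and the test integrand is my illustrative choice.

```python
import numpy as np
from scipy.stats import qmc

def matern1d(r, theta):
    """One-dimensional kernel (1 + theta*r) * exp(-theta*r), r = |x - t|."""
    return (1.0 + theta * r) * np.exp(-theta * r)

def bayesian_cubature(f, x, theta=3.0):
    """Posterior mean and 99% half-width for mu = int f over [0,1]^d,
    modeling f ~ GP(0, s^2 C_theta) with a product Matern kernel."""
    n, d = x.shape
    C = np.prod(matern1d(np.abs(x[:, None, :] - x[None, :, :]), theta), axis=2)
    t = (np.arange(1000) + 0.5) / 1000                # midpoint quadrature nodes
    c = np.prod(matern1d(np.abs(x[:, :, None] - t), theta).mean(axis=2), axis=1)
    c0 = matern1d(np.abs(t[:, None] - t[None, :]), theta).mean() ** d
    y = f(x)
    w = np.linalg.solve(C, c)                         # optimal weights w = C^{-1} c
    mu_hat = w @ y
    s2 = y @ np.linalg.solve(C, y) / n                # MLE-style estimate of s^2
    half_width = 2.58 * np.sqrt(max(c0 - c @ w, 0.0) * s2)
    return mu_hat, half_width

f = lambda x: np.exp(x.sum(axis=1))                   # illustrative integrand
x = qmc.Sobol(2, scramble=True, seed=9).random_base2(7)   # n = 128, d = 2
print(bayesian_cubature(f, x))   # estimate of (e-1)^2 ~ 2.9525 with error bar
```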
Multivariate Normal Probability

$$\mu = \int_{[\boldsymbol{a}, \boldsymbol{b}]} \frac{\exp\left(-\tfrac{1}{2} \boldsymbol{t}^T \Sigma^{-1} \boldsymbol{t}\right)}{\sqrt{(2\pi)^d \det(\Sigma)}} \, \mathrm{d}\boldsymbol{t} \overset{\text{Genz (1993)}}{=} \int_{[0,1]^{d-1}} f(\boldsymbol{x}) \, \mathrm{d}\boldsymbol{x}, \qquad \mu - \hat{\mu} = \mathcal{N}(0,1) \times \sqrt{c_0 - \boldsymbol{c}^T \mathsf{C}^{-1} \boldsymbol{c}} \times s$$

Use a product Matérn kernel with modest smoothness:
$$C_{\boldsymbol{\theta}}(\boldsymbol{x}, \boldsymbol{t}) = \prod_{k=1}^{d} (1 + \theta \, |x_k - t_k|) e^{-\theta |x_k - t_k|}$$

Smaller error using Bayesian cubature with scrambled Sobol' data sites; the confidence intervals succeed about 83% of the time.
How Many Samples Are Needed?

We know that
$$\mu - \hat{\mu} = \cos(f, \eta_{\mathrm{err}}) \, \mathrm{DSC}(\nu - \hat{\nu}) \, \|f\|_{\mathcal{H}}.$$
But, given an error tolerance, $\varepsilon$, how do we decide how many samples, $n$, are needed to make $|\mu - \hat{\mu}| \le \varepsilon$?

Bayesian cubature provides data-based confidence intervals for $\hat{\mu}$. For digital net (e.g., Sobol') and integration lattice sampling, independent replications present a trade-off between the number of replications and the number of samples, but rigorous stopping criteria for a single sequence have been constructed in terms of the discrete Fourier Walsh/complex exponential coefficients of $f$, assuming that the true Fourier coefficients decay steadily (H. and Jiménez Rugama, 2016; Jiménez Rugama and H., 2016; Li, 2016). Functions of integrals and relative error criteria may also be handled (H. and Jiménez Rugama, 2016). These automatic cubature rules are part of the Guaranteed Automatic Integration Library (GAIL) (Choi et al., 2013–2017). A simplified illustration follows.
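For flavor only, here is a naive doubling heuristic in Python: it stops when successive scrambled Sobol' estimates agree to within the tolerance. This is not GAIL's guaranteed criterion, which instead bounds the error through the decay of the discrete Walsh coefficients; it merely illustrates the automatic-cubature interface.

```python
import numpy as np
from scipy.stats import qmc

def naive_cubature(f, d, eps, m0=8, m_max=20, seed=0):
    """Double n until successive QMC estimates agree within eps.

    A heuristic stand-in for a rigorous stopping rule: unlike GAIL's
    criteria, agreement of two estimates does not guarantee |mu - mu_hat| <= eps.
    """
    sobol = qmc.Sobol(d, scramble=True, seed=seed)
    y = f(sobol.random_base2(m0))                 # first 2^m0 points
    mu_old = y.mean()
    for m in range(m0, m_max):
        y = np.concatenate([y, f(sobol.random_base2(m))])  # next 2^m points
        mu_new = y.mean()
        if abs(mu_new - mu_old) <= eps:
            return mu_new, len(y)
        mu_old = mu_new
    return mu_old, len(y)

f = lambda x: np.exp(x.sum(axis=1))               # illustrative integrand, d = 3
print(naive_cubature(f, d=3, eps=1e-3))           # compare with (e-1)^3 ~ 5.073
```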
Option Pricing

$$\mu = \text{fair price} = \int_{\mathbb{R}^d} e^{-rT} \max\left( \frac{1}{d} \sum_{j=1}^{d} S_j - K, \; 0 \right) \frac{e^{-\boldsymbol{z}^T \boldsymbol{z}/2}}{(2\pi)^{d/2}} \, \mathrm{d}\boldsymbol{z} \approx \$13.12$$

$$S_j = S_0 e^{(r - \sigma^2/2) jT/d + \sigma x_j} = \text{stock price at time } jT/d, \qquad \boldsymbol{x} = \mathsf{A}\boldsymbol{z}, \quad \mathsf{A}\mathsf{A}^T = \Sigma = \left( \min(i,j) \, T/d \right)_{i,j=1}^{d}, \qquad T = 1/4, \; d = 13 \text{ here}$$

Abs. Error Tolerance | Method               | A    | Median Error | Worst 10% Accuracy | Worst 10% n | Worst 10% Time (s)
1E−2                 | IID                  | diff | 2E−3         | 100%               | 6.1E7       | 33.000
1E−2                 | Scr. Sobol'          | PCA  | 1E−3         | 100%               | 1.6E4       | 0.040
1E−2                 | Scr. Sob. cont. var. | PCA  | 2E−3         | 100%               | 4.1E3       | 0.019
1E−2                 | Bayes. Latt.         | PCA  | 2E−3         | 99%                | 1.6E4       | 0.051
Gaussian Probability

$$\mu = \int_{[\boldsymbol{a}, \boldsymbol{b}]} \frac{\exp\left(-\tfrac{1}{2} \boldsymbol{t}^T \Sigma^{-1} \boldsymbol{t}\right)}{\sqrt{(2\pi)^d \det(\Sigma)}} \, \mathrm{d}\boldsymbol{t} \overset{\text{Genz (1993)}}{=} \int_{[0,1]^{d-1}} f(\boldsymbol{x}) \, \mathrm{d}\boldsymbol{x}$$

For some typical choice of $\boldsymbol{a}$, $\boldsymbol{b}$, $\Sigma$, and $d = 3$: $\mu \approx 0.6763$.

Rel. Error Tolerance | Method                  | Median Error | Worst 10% Accuracy | Worst 10% n | Worst 10% Time (s)
1E−2                 | IID Genz                | 5E−4         | 100%               | 8.1E4       | 0.020
1E−2                 | Scr. Sobol' Genz        | 4E−5         | 100%               | 1.0E3       | 0.005
1E−2                 | Bayes. Latt. Genz       | 5E−5         | 100%               | 4.1E3       | 0.023
1E−3                 | IID Genz                | 9E−5         | 100%               | 2.0E6       | 0.400
1E−3                 | Scr. Sobol' Genz        | 2E−5         | 100%               | 2.0E3       | 0.006
1E−3                 | Bayes. Latt. Genz       | 3E−7         | 100%               | 6.6E4       | 0.076
1E−4                 | Scr. Sobol' Genz        | 4E−7         | 100%               | 1.6E4       | 0.018
1E−4                 | Bayes. Latt. Genz       | 6E−9         | 100%               | 5.2E5       | 0.580
1E−4                 | Bayes. Latt. Smth. Genz | 1E−7         | 100%               | 3.3E4       | 0.047
Bayesian Inference for Logistic Regression

$$\frac{\mu_j}{\mu_0} = \frac{\int_{\mathcal{X}} b_j \, L(\boldsymbol{b} \mid \text{data}) \, \varrho(\boldsymbol{b}) \, \mathrm{d}\boldsymbol{b}}{\int_{\mathcal{X}} L(\boldsymbol{b} \mid \text{data}) \, \varrho(\boldsymbol{b}) \, \mathrm{d}\boldsymbol{b}} = \text{Bayesian posterior expectation of } \beta_j$$

$$y_i \sim \mathrm{Ber}\left( \frac{\exp(\beta_1 + \beta_2 x_i)}{1 + \exp(\beta_1 + \beta_2 x_i)} \right), \qquad \varrho(\boldsymbol{b}) = \frac{\exp\left(-(b_1^2 + b_2^2)/2\right)}{2\pi}$$

$\varepsilon = 0.001$, $n = 9\,000$–$17\,000$. A sketch of the ratio estimator follows.
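A minimal Python sketch of the ratio estimator with a fixed $n$ and synthetic data; the dataset, sample size, and Sobol' seed are illustrative, and the talk's automatic rules instead choose $n$ adaptively to meet $\varepsilon = 0.001$.

```python
import numpy as np
from scipy.stats import norm, qmc

rng = np.random.default_rng(42)
x_data = rng.uniform(-2, 2, size=50)                 # synthetic covariates
p_true = 1 / (1 + np.exp(-(0.5 + 1.0 * x_data)))     # true beta = (0.5, 1.0)
y_data = rng.random(50) < p_true                     # synthetic Bernoulli responses

def log_likelihood(b):
    """log L(b | data) for logistic regression; b has shape (n, 2)."""
    eta = b[:, [0]] + b[:, [1]] * x_data             # (n, 50) linear predictors
    return (y_data * eta - np.log1p(np.exp(eta))).sum(axis=1)

# Sample b from the N(0, I_2) prior via scrambled Sobol' points
b = norm.ppf(qmc.Sobol(2, scramble=True, seed=1).random_base2(14))
logL = log_likelihood(b)
L = np.exp(logL - logL.max())                        # rescale for stability
for j in (0, 1):
    print((b[:, j] * L).mean() / L.mean())           # posterior mean of beta_{j+1}
```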
Summary

Reproducing kernel Hilbert spaces are a powerful tool for characterizing the error of quasi-Monte Carlo methods.
Digital nets and lattice node sets have faster decaying discrepancies, which suggests faster decaying cubature errors.
How well these bounds fit what is observed depends on whether the integrand is typical within the space of integrands.
Practical stopping criteria are an area of active research.
Thank you

Slides available at www.slideshare.net/fjhickernell/hickernell-fred-samsi-tutorial-aug-2017
References

Anitescu, M., J. Chen, and M. Stein. 2016. An inversion-free estimating equation approach for Gaussian process models, J. Comput. Graph. Statist.
Caflisch, R. E., W. Morokoff, and A. Owen. 1997. Valuation of mortgage backed securities using Brownian bridges to reduce effective dimension, J. Comput. Finance 1, 27–46.
Choi, S.-C. T., Y. Ding, F. J. H., L. Jiang, Ll. A. Jiménez Rugama, X. Tong, Y. Zhang, and X. Zhou. 2013–2017. GAIL: Guaranteed Automatic Integration Library (versions 1.0–2.2).
Diaconis, P. 1988. Bayesian numerical analysis, Statistical decision theory and related topics IV, Papers from the 4th Purdue symp., West Lafayette, Indiana 1986, pp. 163–175.
Dick, J. and F. Pillichshammer. 2010. Digital nets and sequences: Discrepancy theory and quasi-Monte Carlo integration, Cambridge University Press, Cambridge.
Fasshauer, G. E. and M. McCourt. 2015. Kernel-based approximation methods using MATLAB, Interdisciplinary Mathematical Sciences, vol. 19, World Scientific Publishing Co., Singapore.
Genz, A. 1993. Comparison of methods for the computation of multivariate normal probabilities, Computing Science and Statistics 25, 400–405.
Griebel, M., F. Y. Kuo, and I. H. Sloan. 2010. The smoothing effect of the ANOVA decomposition, J. Complexity 26, 523–551.
Griebel, M., F. Y. Kuo, and I. H. Sloan. 2016+. The ANOVA decomposition of a non-smooth function of infinitely many variables can have every term smooth, Math. Comp., in press.
Heinrich, S., F. J. H., and R. X. Yue. 2004. Optimal quadrature for Haar wavelet spaces, Math. Comp. 73, 259–277.
H., F. J. 1998. A generalized discrepancy and quadrature error bound, Math. Comp. 67, 299–322.
H., F. J. 2000. What affects the accuracy of quasi-Monte Carlo quadrature?, Monte Carlo and quasi-Monte Carlo methods 1998, pp. 16–55.
H., F. J. 2017+. The trio identity for quasi-Monte Carlo error analysis, Monte Carlo and quasi-Monte Carlo methods: MCQMC, Stanford, USA, August 2016, submitted for publication, arXiv:1702.01487.
H., F. J. and Ll. A. Jiménez Rugama. 2016. Reliable adaptive cubature using digital sequences, Monte Carlo and quasi-Monte Carlo methods: MCQMC, Leuven, Belgium, April 2014, pp. 367–383. arXiv:1410.8615 [math.NA].
H., F. J. and R. X. Yue. 2000. The mean square discrepancy of scrambled (t, s)-sequences, SIAM J. Numer. Anal. 38, 1089–1112.
Jiménez Rugama, Ll. A. and F. J. H. 2016. Adaptive multidimensional integration based on rank-1 lattices, Monte Carlo and quasi-Monte Carlo methods: MCQMC, Leuven, Belgium, April 2014, pp. 407–422. arXiv:1411.1966.
Li, D. 2016. Reliable quasi-Monte Carlo with control variates, Master's Thesis.
Meng, X. 2017+. Statistical paradises and paradoxes in big data, in preparation.
Niederreiter, H. 1992. Random number generation and quasi-Monte Carlo methods, CBMS-NSF Regional Conference Series in Applied Mathematics, SIAM, Philadelphia.
Novak, E. and H. Woźniakowski. 2010. Tractability of multivariate problems, Volume II: Standard information for functionals, EMS Tracts in Mathematics, European Mathematical Society, Zürich.
O'Hagan, A. 1991. Bayes-Hermite quadrature, J. Statist. Plann. Inference 29, 245–260.
Owen, A. B. 1997. Scrambled net variance for integrals of smooth functions, Ann. Stat. 25, 1541–1562.
Paskov, S. and J. Traub. 1995. Faster valuation of financial derivatives, J. Portfolio Management 22, 113–120.
Rasmussen, C. E. and Z. Ghahramani. 2003. Bayesian Monte Carlo, Advances in Neural Information Processing Systems, pp. 489–496.
Ritter, K. 2000. Average-case analysis of numerical problems, Lecture Notes in Mathematics, vol. 1733, Springer-Verlag, Berlin.
Sloan, I. H. and H. Woźniakowski. 1998. When are quasi-Monte Carlo algorithms efficient for high dimensional integrals?, J. Complexity 14, 1–33.
 

Recently uploaded

Sha'Carri Richardson Presentation 202345
Sha'Carri Richardson Presentation 202345Sha'Carri Richardson Presentation 202345
Sha'Carri Richardson Presentation 202345
beazzy04
 
The geography of Taylor Swift - some ideas
The geography of Taylor Swift - some ideasThe geography of Taylor Swift - some ideas
The geography of Taylor Swift - some ideas
GeoBlogs
 
Introduction to Quality Improvement Essentials
Introduction to Quality Improvement EssentialsIntroduction to Quality Improvement Essentials
Introduction to Quality Improvement Essentials
Excellence Foundation for South Sudan
 
Supporting (UKRI) OA monographs at Salford.pptx
Supporting (UKRI) OA monographs at Salford.pptxSupporting (UKRI) OA monographs at Salford.pptx
Supporting (UKRI) OA monographs at Salford.pptx
Jisc
 
Unit 8 - Information and Communication Technology (Paper I).pdf
Unit 8 - Information and Communication Technology (Paper I).pdfUnit 8 - Information and Communication Technology (Paper I).pdf
Unit 8 - Information and Communication Technology (Paper I).pdf
Thiyagu K
 
The approach at University of Liverpool.pptx
The approach at University of Liverpool.pptxThe approach at University of Liverpool.pptx
The approach at University of Liverpool.pptx
Jisc
 
How to Split Bills in the Odoo 17 POS Module
How to Split Bills in the Odoo 17 POS ModuleHow to Split Bills in the Odoo 17 POS Module
How to Split Bills in the Odoo 17 POS Module
Celine George
 
Additional Benefits for Employee Website.pdf
Additional Benefits for Employee Website.pdfAdditional Benefits for Employee Website.pdf
Additional Benefits for Employee Website.pdf
joachimlavalley1
 
Digital Tools and AI for Teaching Learning and Research
Digital Tools and AI for Teaching Learning and ResearchDigital Tools and AI for Teaching Learning and Research
Digital Tools and AI for Teaching Learning and Research
Vikramjit Singh
 
Sectors of the Indian Economy - Class 10 Study Notes pdf
Sectors of the Indian Economy - Class 10 Study Notes pdfSectors of the Indian Economy - Class 10 Study Notes pdf
Sectors of the Indian Economy - Class 10 Study Notes pdf
Vivekanand Anglo Vedic Academy
 
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
siemaillard
 
Synthetic Fiber Construction in lab .pptx
Synthetic Fiber Construction in lab .pptxSynthetic Fiber Construction in lab .pptx
Synthetic Fiber Construction in lab .pptx
Pavel ( NSTU)
 
Instructions for Submissions thorugh G- Classroom.pptx
Instructions for Submissions thorugh G- Classroom.pptxInstructions for Submissions thorugh G- Classroom.pptx
Instructions for Submissions thorugh G- Classroom.pptx
Jheel Barad
 
Language Across the Curriculm LAC B.Ed.
Language Across the  Curriculm LAC B.Ed.Language Across the  Curriculm LAC B.Ed.
Language Across the Curriculm LAC B.Ed.
Atul Kumar Singh
 
Operation Blue Star - Saka Neela Tara
Operation Blue Star   -  Saka Neela TaraOperation Blue Star   -  Saka Neela Tara
Operation Blue Star - Saka Neela Tara
Balvir Singh
 
Ethnobotany and Ethnopharmacology ......
Ethnobotany and Ethnopharmacology ......Ethnobotany and Ethnopharmacology ......
Ethnobotany and Ethnopharmacology ......
Ashokrao Mane college of Pharmacy Peth-Vadgaon
 
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
siemaillard
 
Phrasal Verbs.XXXXXXXXXXXXXXXXXXXXXXXXXX
Phrasal Verbs.XXXXXXXXXXXXXXXXXXXXXXXXXXPhrasal Verbs.XXXXXXXXXXXXXXXXXXXXXXXXXX
Phrasal Verbs.XXXXXXXXXXXXXXXXXXXXXXXXXX
MIRIAMSALINAS13
 
special B.ed 2nd year old paper_20240531.pdf
special B.ed 2nd year old paper_20240531.pdfspecial B.ed 2nd year old paper_20240531.pdf
special B.ed 2nd year old paper_20240531.pdf
Special education needs
 
Template Jadual Bertugas Kelas (Boleh Edit)
Template Jadual Bertugas Kelas (Boleh Edit)Template Jadual Bertugas Kelas (Boleh Edit)
Template Jadual Bertugas Kelas (Boleh Edit)
rosedainty
 

Recently uploaded (20)

Sha'Carri Richardson Presentation 202345
Sha'Carri Richardson Presentation 202345Sha'Carri Richardson Presentation 202345
Sha'Carri Richardson Presentation 202345
 
The geography of Taylor Swift - some ideas
The geography of Taylor Swift - some ideasThe geography of Taylor Swift - some ideas
The geography of Taylor Swift - some ideas
 
Introduction to Quality Improvement Essentials
Introduction to Quality Improvement EssentialsIntroduction to Quality Improvement Essentials
Introduction to Quality Improvement Essentials
 
Supporting (UKRI) OA monographs at Salford.pptx
Supporting (UKRI) OA monographs at Salford.pptxSupporting (UKRI) OA monographs at Salford.pptx
Supporting (UKRI) OA monographs at Salford.pptx
 
Unit 8 - Information and Communication Technology (Paper I).pdf
Unit 8 - Information and Communication Technology (Paper I).pdfUnit 8 - Information and Communication Technology (Paper I).pdf
Unit 8 - Information and Communication Technology (Paper I).pdf
 
The approach at University of Liverpool.pptx
The approach at University of Liverpool.pptxThe approach at University of Liverpool.pptx
The approach at University of Liverpool.pptx
 
How to Split Bills in the Odoo 17 POS Module
How to Split Bills in the Odoo 17 POS ModuleHow to Split Bills in the Odoo 17 POS Module
How to Split Bills in the Odoo 17 POS Module
 
Additional Benefits for Employee Website.pdf
Additional Benefits for Employee Website.pdfAdditional Benefits for Employee Website.pdf
Additional Benefits for Employee Website.pdf
 
Digital Tools and AI for Teaching Learning and Research
Digital Tools and AI for Teaching Learning and ResearchDigital Tools and AI for Teaching Learning and Research
Digital Tools and AI for Teaching Learning and Research
 
Sectors of the Indian Economy - Class 10 Study Notes pdf
Sectors of the Indian Economy - Class 10 Study Notes pdfSectors of the Indian Economy - Class 10 Study Notes pdf
Sectors of the Indian Economy - Class 10 Study Notes pdf
 
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
 
Synthetic Fiber Construction in lab .pptx
Synthetic Fiber Construction in lab .pptxSynthetic Fiber Construction in lab .pptx
Synthetic Fiber Construction in lab .pptx
 
Instructions for Submissions thorugh G- Classroom.pptx
Instructions for Submissions thorugh G- Classroom.pptxInstructions for Submissions thorugh G- Classroom.pptx
Instructions for Submissions thorugh G- Classroom.pptx
 
Language Across the Curriculm LAC B.Ed.
Language Across the  Curriculm LAC B.Ed.Language Across the  Curriculm LAC B.Ed.
Language Across the Curriculm LAC B.Ed.
 
Operation Blue Star - Saka Neela Tara
Operation Blue Star   -  Saka Neela TaraOperation Blue Star   -  Saka Neela Tara
Operation Blue Star - Saka Neela Tara
 
Ethnobotany and Ethnopharmacology ......
Ethnobotany and Ethnopharmacology ......Ethnobotany and Ethnopharmacology ......
Ethnobotany and Ethnopharmacology ......
 
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
 
Phrasal Verbs.XXXXXXXXXXXXXXXXXXXXXXXXXX
Phrasal Verbs.XXXXXXXXXXXXXXXXXXXXXXXXXXPhrasal Verbs.XXXXXXXXXXXXXXXXXXXXXXXXXX
Phrasal Verbs.XXXXXXXXXXXXXXXXXXXXXXXXXX
 
special B.ed 2nd year old paper_20240531.pdf
special B.ed 2nd year old paper_20240531.pdfspecial B.ed 2nd year old paper_20240531.pdf
special B.ed 2nd year old paper_20240531.pdf
 
Template Jadual Bertugas Kelas (Boleh Edit)
Template Jadual Bertugas Kelas (Boleh Edit)Template Jadual Bertugas Kelas (Boleh Edit)
Template Jadual Bertugas Kelas (Boleh Edit)
 

Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applied Mathematics Opening Workshop, Error Analysis for Quasi-Monte Carlo Methods - Fred Hickernell, Aug 28, 2017

  • 1. Error Analysis for Quasi-Monte Carlo Methods
  Fred J. Hickernell, Department of Applied Mathematics, Illinois Institute of Technology
  hickernell@iit.edu mypages.iit.edu/~hickernell
  Thanks to the Guaranteed Automatic Integration Library (GAIL) team and friends. Supported by NSF-DMS-1522687. Thanks to SAMSI and the QMC Program organizers.
  SAMSI-QMC Opening Workshop, August 28, 2017
  • 2. Integration Examples
  $\mu = \int_{\mathbb{R}^d} f(x)\,\nu(dx)$ = option price, where $f(x)$ = discounted payoff determined by market forces $x$
  $\mu = \int_{\mathcal{X}} g(x)\,dx = \mathbb{P}(X \in \mathcal{X})$ = probability, where $g$ = probability density function
  $\mu_j/\mu_0 = \int_{\mathcal{X}} b_j L(b \mid \text{data})\,\varrho(b)\,db \big/ \int_{\mathcal{X}} L(b \mid \text{data})\,\varrho(b)\,db$ = Bayesian posterior expectation of $\beta_j$
  $\mu = \mathbb{E}[f(X)] = \int_{\mathcal{X}} f(x)\,\nu(dx) \approx \hat{\mu} = \sum_{i=1}^n f(x_i)\,w_i = \int_{\mathcal{X}} f(x)\,\hat{\nu}(dx)$; $\mu - \hat{\mu} = {}?$
  • 7. Error Analysis for Quasi-Monte Carlo Methods
  $\mu := \mathbb{E}[f(X)] := \int_{\mathcal{X}} f(x)\,\nu(dx) \approx \hat{\mu} := \sum_{i=1}^n f(x_i)\,w_i = \int_{\mathcal{X}} f(x)\,\hat{\nu}(dx)$
  $\mu - \hat{\mu} = \int_{\mathcal{X}} f(x)\,(\nu - \hat{\nu})(dx) = {}?$
  Reality: what is. Error analysis (theory): what we think it should be by rigorous argument; we are not certain that our assumptions hold.
  Quasi-Monte Carlo methods: $x_1, x_2, \ldots$ chosen more carefully than IID, dependent
  • 8. Error Decomposition in Reproducing Kernel Hilbert Spaces
  Let the integrand lie in a Hilbert space, $\mathcal{H}$, with reproducing kernel $K : \mathcal{X} \times \mathcal{X} \to \mathbb{R}$:
  $K(\cdot, t) \in \mathcal{H}, \quad f(t) = \langle K(\cdot, t), f \rangle \quad \forall t \in \mathcal{X},\ f \in \mathcal{H}$
  The cubature error is given by the inner product with the representer, $\eta_{\mathrm{err}}$:
  $\mu - \hat{\mu} = \int_{\mathcal{X}} f(x)\,(\nu - \hat{\nu})(dx) = \langle \eta_{\mathrm{err}}, f \rangle$, where $\eta_{\mathrm{err}}(t) = \langle K(\cdot, t), \eta_{\mathrm{err}} \rangle = \int_{\mathcal{X}} K(x, t)\,(\nu - \hat{\nu})(dx)$
  The Cauchy-Schwarz inequality says that (H., 2000)
  $\mu - \hat{\mu} = \langle \eta_{\mathrm{err}}, f \rangle = \cos(f, \eta_{\mathrm{err}})\,\mathrm{DSC}(\nu - \hat{\nu})\,\|f\|_{\mathcal{H}}$, with discrepancy $\mathrm{DSC}(\nu - \hat{\nu}) := \|\eta_{\mathrm{err}}\|_{\mathcal{H}}$
  $\mathrm{DSC}^2 = \langle \eta_{\mathrm{err}}, \eta_{\mathrm{err}} \rangle = \int_{\mathcal{X} \times \mathcal{X}} K(x, t)\,(\nu - \hat{\nu})(dx)\,(\nu - \hat{\nu})(dt)$
  $\phantom{\mathrm{DSC}^2} = \int_{\mathcal{X}^2} K(x, t)\,\nu(dx)\,\nu(dt) - 2 \sum_{i=1}^n w_i \int_{\mathcal{X}} K(x_i, t)\,\nu(dt) + \sum_{i,j=1}^n w_i w_j K(x_i, x_j) = k_0 - 2 k^T w + w^T \mathsf{K} w$
  $w = \mathsf{K}^{-1} k$ is optimal but expensive to compute
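To make the quadratic form concrete, here is a minimal Python sketch of $\mathrm{DSC}^2 = k_0 - 2 k^T w + w^T \mathsf{K} w$ and the optimal weights $w = \mathsf{K}^{-1} k$. The 1-D kernel $K(x,t) = 1 + \min(x,t)$, the data sites, and the sample size are illustrative assumptions (not prescribed by the slides), chosen so that $k(x) = 1 + x - x^2/2$ and $k_0 = 4/3$ have simple closed forms.

    import numpy as np

    # Sketch: squared discrepancy DSC^2 = k0 - 2 k^T w + w^T K w and the
    # optimal weights w = K^{-1} k, for the assumed 1-D kernel
    # K(x,t) = 1 + min(x,t) on [0,1], for which
    #   k(x) = int_0^1 K(x,t) dt = 1 + x - x^2/2   and   k0 = 4/3.

    def gram(x):
        return 1.0 + np.minimum.outer(x, x)

    def dsc2(x, w):
        k0 = 4.0 / 3.0
        k = 1.0 + x - 0.5 * x**2
        return k0 - 2.0 * k @ w + w @ gram(x) @ w

    x = np.linspace(0.1, 0.9, 9)                  # illustrative data sites
    w_eq = np.full(x.size, 1.0 / x.size)          # equal weights 1/n
    w_opt = np.linalg.solve(gram(x), 1.0 + x - 0.5 * x**2)
    print(dsc2(x, w_eq), dsc2(x, w_opt))          # optimal w never does worse

The solve for w_opt is the $O(n^3)$ cost alluded to above; the point of the later slides on filtered kernels is to avoid it.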
  • 11. $L^2$-Discrepancy
  $\nu$ = uniform on $\mathcal{X} = [0, 1]^d$, $\hat{\nu}$ = uniform on $\{x_i\}_{i=1}^n$, $K(x, t) = \prod_{k=1}^d [2 - \max(x_k, t_k)]$ (H., 1998)
  $\|f\|_{\mathcal{H}} = \left\| \left( \|\partial^u f\|_{L^2} \right)_{u \subseteq 1{:}d} \right\|_2, \quad \partial^u f := \frac{\partial^{|u|} f}{\partial x_u}\Big|_{x_{s_u} = 1}, \quad 1{:}d := \{1, \ldots, d\}, \quad s_u := 1{:}d \setminus u$
  $\mathrm{DSC}^2 = \left( \tfrac{4}{3} \right)^d - \frac{2}{n} \sum_{i=1}^n \prod_{k=1}^d \frac{3 - x_{ik}^2}{2} + \frac{1}{n^2} \sum_{i,j=1}^n \prod_{k=1}^d [2 - \max(x_{ik}, x_{jk})] = \left\| \left( \|\nu([0, \cdot_u]) - \hat{\nu}([0, \cdot_u])\|_{L^2} \right)_{\emptyset \ne u \subseteq 1{:}d} \right\|_2^2$ (geometric interpretation)
  Example: $x = (\ldots, 0.6, \ldots, 0.4, \ldots)$, $\nu([0, x_{\{5,8\}}]) - \hat{\nu}([0, x_{\{5,8\}}]) = 0.24 - 7/32 = 0.02125$
  $\mu - \hat{\mu} = \cos(f, \eta_{\mathrm{err}})\,\mathrm{DSC}(\nu - \hat{\nu})\,\|f - f(1)\|_{\mathcal{H}}$
  $\mathrm{DSC}(\nu - \hat{\nu})$ requires $O(d n^2)$ operations to evaluate; it is $O(n^{-1/2})$ on average for IID Monte Carlo, $O(n^{-1+\epsilon})$ for digital nets, integration lattices, ... (Niederreiter, 1992; Dick and Pillichshammer, 2010), and no better than $O(n^{-1})$ for any $\hat{\nu}$ because this $K$ has limited smoothness
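A short sketch of the $O(dn^2)$ closed-form $L^2$-discrepancy above, comparing scrambled Sobol' points against IID points; the sample size, dimension, and seeds are arbitrary choices for illustration.

    import numpy as np
    from scipy.stats import qmc

    # Sketch of the slide's equal-weight L2-discrepancy:
    # DSC^2 = (4/3)^d - (2/n) sum_i prod_k (3 - x_ik^2)/2
    #         + (1/n^2) sum_{i,j} prod_k [2 - max(x_ik, x_jk)]

    def l2_discrepancy(X):
        n, d = X.shape
        term1 = (4.0 / 3.0) ** d
        term2 = np.mean(np.prod((3.0 - X**2) / 2.0, axis=1))
        gram = np.prod(2.0 - np.maximum(X[:, None, :], X[None, :, :]), axis=2)
        return np.sqrt(term1 - 2.0 * term2 + gram.sum() / n**2)

    n, d = 256, 4
    sobol = qmc.Sobol(d, scramble=True, seed=11).random(n)
    iid = np.random.default_rng(11).random((n, d))
    print(l2_discrepancy(sobol), l2_discrepancy(iid))

On a typical run the Sobol' value should be noticeably smaller, consistent with the $O(n^{-1+\epsilon})$ versus $O(n^{-1/2})$ rates quoted above.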
  • 15. Multivariate Normal Probability
  $\mu = \int_{[a,b]} \frac{\exp\left(-\tfrac{1}{2} t^T \Sigma^{-1} t\right)}{\sqrt{(2\pi)^d \det(\Sigma)}}\,dt \overset{\text{Genz (1993)}}{=} \int_{[0,1]^{d-1}} f(x)\,dx$, with $\hat{\nu}$ = uniform on $\{x_i\}_{i=1}^n$
  $\mu - \hat{\mu} = \cos(f, \eta_{\mathrm{err}})\,\mathrm{DSC}(\nu - \hat{\nu})\,\|f\|_{\mathcal{H}}$
  $\mathrm{DSC}(\nu - \hat{\nu}) = O(n^{-1/2})$ for IID MC, $O(n^{-1+\epsilon})$ for Sobol', and $O(n^{-3/2+\epsilon})$ for scrambled Sobol' with a smoother kernel (Owen, 1997; H. and Yue, 2000; Heinrich et al., 2004)
  For some typical choice of $a$, $b$, $\Sigma$, and $d = 3$, $\mu \approx 0.6763$. Low discrepancy sampling reduces the cubature error; randomization can help even more.
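The slide computes $\mu$ through the Genz (1993) transformation to $[0,1]^{d-1}$; as a simpler stand-in, the sketch below estimates the box probability directly by averaging indicators of QMC-generated normal vectors. The particular $a$, $b$, and $\Sigma$ here are hypothetical, so the answer will not be the slide's $\mu \approx 0.6763$.

    import numpy as np
    from scipy.stats import qmc

    # Sketch: estimate mu = P(a <= X <= b), X ~ N(0, Sigma), with scrambled
    # Sobol' points versus IID sampling. a, b, Sigma are illustrative.
    d = 3
    Sigma = np.array([[1.0, 0.5, 0.25],
                      [0.5, 1.0, 0.5],
                      [0.25, 0.5, 1.0]])
    a, b = -np.ones(d), 2.0 * np.ones(d)

    def estimate(sampler, n):
        Z = sampler(n)                         # n draws from N(0, Sigma)
        return np.mean(np.all((Z >= a) & (Z <= b), axis=1))

    qmc_normal = qmc.MultivariateNormalQMC(mean=np.zeros(d), cov=Sigma, seed=3)
    rng = np.random.default_rng(3)
    n = 2**12
    print("QMC:", estimate(qmc_normal.random, n))
    print("IID:", estimate(lambda n: rng.multivariate_normal(np.zeros(d), Sigma, n), n))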
  • 17. Mean Square Discrepancy for Randomized Low Discrepancy Sets
  Popular low discrepancy sets on $[0, 1)^d$ (digital nets and lattice node sets) can be randomized while preserving their low discrepancy properties. Let $\psi : [0, 1)^d \to [0, 1)^d$ be a random mapping with $\psi(x) \sim \mathcal{U}[0, 1)^d$, and define the filtered kernel $K_{\mathrm{f}}(x, t) := \mathbb{E}[K(\psi(x), \psi(t))]$, for which $\mathsf{K}_{\mathrm{f}} \mathbf{1} = (\mathbf{1}^T \mathsf{K}_{\mathrm{f},1})\,\mathbf{1}$ and $k_0 = 1$, where the vector $\mathsf{K}_{\mathrm{f},1}$ is the first column of the matrix $\mathsf{K}_{\mathrm{f}}$, the analog of $\mathsf{K}$ with the original sequence but for the filtered kernel $K_{\mathrm{f}}$. Taking equal weights $w = \mathbf{1}/n$,
  $\mathbb{E}\{[\mathrm{DSC}(\nu - \hat{\nu}; K)]^2\} = \mathbb{E}[k_0 - 2 k^T w + w^T \mathsf{K} w] = -1 + \frac{\mathbf{1}^T \mathsf{K}_{\mathrm{f},1}}{n} = [\mathrm{DSC}(\nu - \hat{\nu}; K_{\mathrm{f}})]^2$
  - $\mathrm{DSC}(\nu - \hat{\nu}; K_{\mathrm{f}})$ only requires $O(n)$ operations to compute
  - This suggests using Hilbert spaces with reproducing kernels of the form $K_{\mathrm{f}}(x, t) = \sum_l \lambda_l\, \varphi_l(x)\, \overline{\varphi_l(t)}$, where the $\varphi_l$ are Walsh functions for digital nets and $\varphi_l(x) = \exp(2\pi\sqrt{-1}\, l^T x)$ for lattice node sets, with $\lambda_l$ chosen carefully so that the series sums in closed form and $K_{\mathrm{f}}$ has the desired smoothness properties
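A sketch of the $O(n)$ computation for a rank-1 lattice. The shift-invariant product kernel $K_{\mathrm{f}}(x, t) = \prod_k [1 + B_2(\{x_k - t_k\})]$, with Bernoulli polynomial $B_2(u) = u^2 - u + 1/6$, is an assumed example of a filtered kernel (it satisfies $k_0 = 1$ because $B_2$ integrates to zero over a period); the generating vector below is hypothetical.

    import numpy as np

    # Sketch: O(n) mean square discrepancy of a rank-1 lattice under the
    # assumed shift-invariant kernel Kf(x,t) = prod_k [1 + B2({x_k - t_k})].
    # Since the lattice is a group and x_1 = 0, the double sum collapses to
    # the first Gram column: DSC^2 = -1 + (1/n) sum_i Kf(x_i, x_1).

    def B2(u):
        return u**2 - u + 1.0 / 6.0

    def lattice_points(n, z):
        i = np.arange(n)[:, None]
        return (i * np.asarray(z)[None, :] / n) % 1.0

    def mean_square_dsc(X):
        col1 = np.prod(1.0 + B2(X), axis=1)   # Kf(x_i, 0): O(nd) work, not O(n^2)
        return -1.0 + col1.mean()

    z = [1, 433, 279]                          # hypothetical generating vector
    X = lattice_points(2**10, z)
    print(mean_square_dsc(X))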
  • 18. The Effect of Dimension on the Decay of the Discrepancy
  What happens when the dimension, $d$, is large but the sample size, $n$, is modest?
  $L^2$-discrepancy and variation:
  $\mathrm{DSC}^2 = \left( \tfrac{4}{3} \right)^d - \frac{2}{n} \sum_{i=1}^n \prod_{k=1}^d \frac{3 - x_{ik}^2}{2} + \frac{1}{n^2} \sum_{i,j=1}^n \prod_{k=1}^d [2 - \max(x_{ik}, x_{jk})], \qquad \|f\|_{\mathcal{H}} = \left\| \left( \left\| \frac{\partial^{|u|} f}{\partial x_u}\Big|_{x_{s_u}=1} \right\|_{L^2} \right)_{u \subseteq 1{:}d} \right\|_2$
  For scrambled Sobol' points, the onset of the $O(n^{-1+\epsilon})$ convergence of $\mathrm{DSC}$ is $d$-dependent.
  • 19. Pricing an Asian Option
  $\mu = \int_{\mathbb{R}^d} \mathrm{payoff}(t)\, \frac{\exp\left(-\tfrac{1}{2} t^T \Sigma^{-1} t\right)}{\sqrt{(2\pi)^d \det(\Sigma)}}\,dt = \int_{[0,1]^d} f(x)\,dx, \qquad \mu - \hat{\mu} = \cos(f, \eta_{\mathrm{err}})\,\mathrm{DSC}(\nu - \hat{\nu})\,\|f\|_{\mathcal{H}}$
  Asian arithmetic mean option, $d = 12$, $\mu \approx \$13.1220$
  - The error converges like $O(n^{-1+\epsilon})$ for (scrambled) Sobol' even though the discrepancy does not. This phenomenon was first seen by Paskov and Traub (1995) for a collateralized mortgage obligation with $d = 360$
  - Scrambled Sobol' does not achieve $O(n^{-3/2+\epsilon})$ convergence because $f$ is not smooth enough; $f$ is not even smooth enough for $O(n^{-1+\epsilon})$ convergence, except by a delicate argument (Griebel et al., 2010; 2016+)
  • 20. Overcoming the Curse of Dimensionality with Weights
  - Sloan and Woźniakowski (1998) show how inserting decaying coordinate weights yields $O(n^{-1+\epsilon})$ convergence independent of dimension
  - Caflisch et al. (1997) explain how quasi-Monte Carlo methods work well for integrands of low effective dimension
  - Novak and Woźniakowski (2010) cover tractability comprehensively
  $K(x, t) = \prod_{k=1}^d [1 + \gamma_k^2 (1 - \max(x_k, t_k))]$
  $\mathrm{DSC}^2 = \prod_{k=1}^d \left( 1 + \frac{\gamma_k^2}{3} \right) - \frac{2}{n} \sum_{i=1}^n \prod_{k=1}^d \left[ 1 + \frac{\gamma_k^2 (1 - x_{ik}^2)}{2} \right] + \frac{1}{n^2} \sum_{i,j=1}^n \prod_{k=1}^d [1 + \gamma_k^2 (1 - \max(x_{ik}, x_{jk}))]$
  $\|f\|_{\mathcal{H}} = \left\| \left( \frac{1}{\gamma_u} \left\| \frac{\partial^{|u|} f}{\partial x_u}\Big|_{x_{s_u}=1} \right\|_{L^2} \right)_{u \subseteq 1{:}d} \right\|_2, \qquad \gamma_u = \prod_{k \in u} \gamma_k$
  For scrambled Sobol' points, take $\gamma_k^2 = k^{-3}$ (a sketch of this weighted discrepancy follows).
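The weighted discrepancy differs from the unweighted one only through the factors $\gamma_k^2$; here is a sketch with the slide's choice $\gamma_k^2 = k^{-3}$ and equal cubature weights (the point set and sizes are arbitrary). With decaying coordinate weights the value stays moderate even for larger $d$, which is the point of the slide.

    import numpy as np

    # Sketch of the coordinate-weighted discrepancy with gamma_k^2 = k^{-3}:
    # DSC^2 = prod_k (1 + g_k/3) - (2/n) sum_i prod_k [1 + g_k (1 - x_ik^2)/2]
    #         + (1/n^2) sum_{i,j} prod_k [1 + g_k (1 - max(x_ik, x_jk))]

    def weighted_dsc2(X):
        n, d = X.shape
        g = np.arange(1, d + 1, dtype=float) ** -3.0   # g_k = gamma_k^2
        term1 = np.prod(1.0 + g / 3.0)
        term2 = np.mean(np.prod(1.0 + g * (1.0 - X**2) / 2.0, axis=1))
        gram = np.prod(1.0 + g * (1.0 - np.maximum(X[:, None, :], X[None, :, :])),
                       axis=2)
        return term1 - 2.0 * term2 + gram.sum() / n**2

    X = np.random.default_rng(5).random((128, 20))     # modest n, larger d
    print(weighted_dsc2(X))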
  • 21. Comments So Far
  $\mu := \mathbb{E}[f(X)] := \int_{\mathcal{X}} f(x)\,\nu(dx) \approx \hat{\mu} := \sum_{i=1}^n f(x_i)\,w_i = \int_{\mathcal{X}} f(x)\,\hat{\nu}(dx), \qquad \mu - \hat{\mu} = \cos(f, \eta_{\mathrm{err}})\,\mathrm{DSC}(\nu - \hat{\nu})\,\|f\|_{\mathcal{H}}$
  - The cubature error is represented as a trio identity (Meng, 2017+; H., 2017+); without the $\cos(\cdot)$ term it is a Koksma-Hlawka inequality
  - There are other versions than the deterministic one shown here
  - RKHS is an oft-used tool, not only for cubature error (Fasshauer and McCourt, 2015), and the analysis may be extended to Banach spaces (H., 1998; 2000)
  - Error analysis can also be attacked directly through Fourier-Walsh or complex exponential series
  - If the kernel has special form, then $\mathrm{DSC}(\nu - \hat{\nu})$ and a lower bound on $\|f\|_{\mathcal{H}}$ can be computed in $O(n \log n)$ operations; otherwise they are expensive
  - For this error decomposition the adaptive choice of the position of the next data site $x_{n+1}$ does not depend on $f$
  • 22. Comments So Far
  The discrepancy measures the sample quality and is independent of the integrand:
  - Its decay rate may be improved by samples chosen more carefully than IID
  - Clever randomization of low discrepancy points can reduce the discrepancy even further
  - Its definition depends on the reproducing kernel (and the corresponding Hilbert space of integrands)
  - Working with filtered kernels may expedite computation of the discrepancy and of the optimal weights; what is the appropriate filtered kernel for higher order nets?
  - Non-uniform low discrepancy points are often transformations of uniform low discrepancy points
  - There are other quality measures for points than the discrepancy, e.g., the covering radius, the minimum distance between two points, and the power function in RKHS approximation
  • 23. Comments So Far
  The discrepancy measures the sample quality and is independent of the integrand:
  - The strategy for choosing points that optimize the quality measure depends on whether you want to be asymptotically optimal or pre-asymptotically optimal; see WG IV, Representative Points for Small-data and Big-data Problems, and WG V, Sampling and Analysis in High Dimensions When Samples Are Expensive
  - The size of the discrepancy may be affected by the dimension unless the space of integrands is defined to de-emphasize higher dimensions; for how to choose the weights and sampling optimally for your problem, see WG AO, Tuning QMC to the Specific Integrand of Interest
  • 25. Comments So Far
  - The term $\|f\|_{\mathcal{H}}$ is often called the variation. It can sometimes be made smaller by using a change of measure (variable transformation) to rewrite the integral with a different integrand.
  - The term $\cos(f, \eta_{\mathrm{err}})$ is called the confounding. It cannot exceed one in magnitude. It could be quite small if your integrand is atypically nice.
  • 26. Bayesian Cubature
  Random $f$ postulated by Diaconis (1988), O'Hagan (1991), Ritter (2000), Rasmussen and Ghahramani (2003) and others: $f \sim \mathcal{GP}(0, s^2 C_\theta)$, a Gaussian process from the sample space $\mathcal{F}$ with zero mean and covariance kernel $s^2 C_\theta$, $C_\theta : \mathcal{X} \times \mathcal{X} \to \mathbb{R}$. Then
  $c_0 = \int_{\mathcal{X}^2} C_\theta(x, t)\,\nu(dx)\,\nu(dt), \quad c = \left( \int_{\mathcal{X}} C_\theta(x_i, t)\,\nu(dt) \right)_{i=1}^n, \quad \mathsf{C} = \left( C_\theta(x_i, x_j) \right)_{i,j=1}^n, \quad w = (w_i)_{i=1}^n = \mathsf{C}^{-1} c \text{ is optimal}$
  $\mu - \hat{\mu} = N(0, 1) \times \sqrt{c_0 - c^T \mathsf{C}^{-1} c} \times s$
  - The scale parameter, $s$, and shape parameter, $\theta$, should be estimated, e.g., by maximum likelihood estimation
  - $\Pr\left[ |\mu - \hat{\mu}| \le 2.58 \sqrt{c_0 - c^T \mathsf{C}^{-1} c}\; s \right] = 99\%$ allows probabilistic error bounds
  - Requires $O(n^3)$ operations to compute $\mathsf{C}_\theta^{-1}$, but see Anitescu et al. (2016)
  - Ill-conditioning for smoother kernels (faster convergence)
  - $\mathrm{DSC}(\nu - \hat{\nu}; K) = \sqrt{c_0 - c^T \mathsf{C}^{-1} c}$ for optimal weights and $K = C_\theta$
  - See WG II, Probabilistic Numerics
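A 1-D sketch of the computation: form $\mathsf{C}$, solve for $w = \mathsf{C}^{-1} c$, and report the 99% credible interval. The test integrand, the fixed $\theta$, and the grid-based approximation of $c$ and $c_0$ are all simplifying assumptions; in practice $s$ and $\theta$ would be estimated by maximum likelihood as stated above (here $s^2$ is the zero-mean MLE plug-in $f^T \mathsf{C}^{-1} f / n$).

    import numpy as np

    # Sketch of 1-D Bayesian cubature with the Matern-type kernel from the
    # next slide, C_theta(x,t) = (1 + theta|x-t|) exp(-theta|x-t|).
    # c and c0 are approximated on a fine grid instead of in closed form.

    def C(x, t, theta=10.0):
        r = theta * np.abs(np.subtract.outer(x, t))
        return (1.0 + r) * np.exp(-r)

    f = lambda x: np.exp(x)                    # test integrand; mu = e - 1
    x = np.linspace(0.03, 0.97, 16)            # data sites
    grid = np.linspace(0.0, 1.0, 1001)         # quadrature grid for c and c0

    c = C(x, grid).mean(axis=1)                # c_i ~= int_0^1 C(x_i, t) dt
    c0 = C(grid, grid).mean()                  # c0 ~= double integral of C
    w = np.linalg.solve(C(x, x), c)            # optimal weights C^{-1} c
    mu_hat = w @ f(x)
    s2 = f(x) @ np.linalg.solve(C(x, x), f(x)) / x.size
    half = 2.58 * np.sqrt(max(c0 - c @ w, 0.0) * s2)   # 99% half-width
    print(f"mu_hat = {mu_hat:.5f} +/- {half:.1e}, true = {np.e - 1:.5f}")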
  • 29. Multivariate Normal Probability
  $\mu = \int_{[a,b]} \frac{\exp\left(-\tfrac{1}{2} t^T \Sigma^{-1} t\right)}{\sqrt{(2\pi)^d \det(\Sigma)}}\,dt \overset{\text{Genz (1993)}}{=} \int_{[0,1]^{d-1}} f(x)\,dx, \qquad \mu - \hat{\mu} = N(0, 1) \times \sqrt{c_0 - c^T \mathsf{C}^{-1} c} \times s$
  Use a product Matérn kernel with modest smoothness: $C_\theta(x, t) = \prod_{k=1}^d (1 + \theta |x_k - t_k|)\, e^{-\theta |x_k - t_k|}$
  Smaller error using Bayesian cubature with scrambled Sobol' data sites; confidence intervals succeed $\approx 83\%$ of the time
  • 30. How Many Samples Are Needed?
  We know that $\mu - \hat{\mu} = \cos(f, \eta_{\mathrm{err}})\,\mathrm{DSC}(\nu - \hat{\nu})\,\|f\|_{\mathcal{H}}$. But, given an error tolerance, $\varepsilon$, how do we decide how many samples, $n$, are needed to make $|\mu - \hat{\mu}| \le \varepsilon$?
  - Bayesian cubature provides data-based confidence intervals for $\hat{\mu}$
  - For digital net (e.g., Sobol') and integration lattice sampling, independent replications present a trade-off between the number of replications and the number of samples
  - But rigorous stopping criteria for a single sequence have been constructed in terms of the discrete Fourier-Walsh/complex exponential coefficients of $f$, assuming that the true Fourier coefficients decay steadily (H. and Jiménez Rugama, 2016; Jiménez Rugama and H., 2016; Li, 2016)
  - Functions of integrals and relative error criteria may also be handled (H. and Jiménez Rugama, 2016)
  - These automatic cubature rules are part of the Guaranteed Automatic Integration Library (GAIL) (Choi et al., 2013-2017); a naive sketch of the sample-doubling strategy follows
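The sketch below is a deliberately naive stand-in for those stopping rules: it doubles a scrambled Sobol' sample until successive estimates agree to within $\varepsilon$. GAIL's criteria instead bound the error through the decay of the discrete Fourier-Walsh coefficients and come with guarantees under stated assumptions; this heuristic does not.

    import numpy as np
    from scipy.stats import qmc

    # Naive adaptive QMC: double the scrambled Sobol' sample until two
    # successive estimates differ by at most eps. Heuristic only; it is NOT
    # GAIL's guaranteed stopping criterion.

    def naive_adaptive_qmc(f, d, eps, m_init=8, m_max=20, seed=0):
        sobol = qmc.Sobol(d, scramble=True, seed=seed)
        y = f(sobol.random_base2(m_init))          # first 2^m_init points
        mu_old = y.mean()
        for m in range(m_init, m_max):
            y = np.concatenate([y, f(sobol.random_base2(m))])  # next 2^m
            mu_new = y.mean()
            if abs(mu_new - mu_old) <= eps:
                return mu_new, len(y)
            mu_old = mu_new
        return mu_old, len(y)

    f = lambda X: np.exp(X.sum(axis=1))            # test integrand on [0,1]^3
    print(naive_adaptive_qmc(f, d=3, eps=1e-4))    # true value (e-1)^3 ~ 5.073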
  • 31. Option Pricing
  $\mu = \text{fair price} = \int_{\mathbb{R}^d} e^{-rT} \max\left( \frac{1}{d} \sum_{j=1}^d S_j - K,\, 0 \right) \frac{e^{-z^T z/2}}{(2\pi)^{d/2}}\,dz \approx \$13.12$
  $S_j = S_0\, e^{(r - \sigma^2/2) jT/d + \sigma x_j}$ = stock price at time $jT/d$, $\quad x = Az$, $\quad AA^T = \Sigma = \left( \min(i, j)\, T/d \right)_{i,j=1}^d$, $\quad T = 1/4$, $d = 13$ here

  Abs. Error Tolerance | Method               | A    | Median Error | Worst 10% Accuracy | Worst 10% n | Time (s)
  1E-2                 | IID                  | diff | 2E-3         | 100%               | 6.1E7       | 33.000
  1E-2                 | Scr. Sobol'          | PCA  | 1E-3         | 100%               | 1.6E4       | 0.040
  1E-2                 | Scr. Sob. cont. var. | PCA  | 2E-3         | 100%               | 4.1E3       | 0.019
  1E-2                 | Bayes. Latt.         | PCA  | 2E-3         | 99%                | 1.6E4       | 0.051
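A sketch of the scrambled Sobol' + PCA rows of the table: the PCA construction takes $A = V \Lambda^{1/2}$ from the eigendecomposition of $\Sigma$, so the leading coordinates of each Sobol' point carry most of the Brownian path's variance. The market parameters $S_0$, $K$, $r$, $\sigma$ below are hypothetical (only $T = 1/4$ and $d = 13$ come from the slide), so the computed price need not match $13.12.

    import numpy as np
    from scipy.stats import norm, qmc

    # Sketch: arithmetic-mean Asian call priced with scrambled Sobol' points
    # and the PCA path construction A A^T = Sigma = (min(i,j) T/d).
    S0, K, r, sigma, T, d = 120.0, 100.0, 0.05, 0.5, 0.25, 13   # hypothetical

    tt = np.arange(1, d + 1) * T / d
    Sigma = np.minimum.outer(tt, tt)
    lam, V = np.linalg.eigh(Sigma)                 # ascending eigenvalues
    A = V[:, ::-1] * np.sqrt(lam[::-1])            # largest eigenvalue first

    n = 2**14
    U = qmc.Sobol(d, scramble=True, seed=4).random(n)
    Z = norm.ppf(U)                                # QMC standard normals
    X = Z @ A.T                                    # Brownian paths at times tt
    S = S0 * np.exp((r - 0.5 * sigma**2) * tt + sigma * X)
    payoff = np.exp(-r * T) * np.maximum(S.mean(axis=1) - K, 0.0)
    print(payoff.mean())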
  • 32. Gaussian Probability
  $\mu = \int_{[a,b]} \frac{\exp\left(-\tfrac{1}{2} t^T \Sigma^{-1} t\right)}{\sqrt{(2\pi)^d \det(\Sigma)}}\,dt \overset{\text{Genz (1993)}}{=} \int_{[0,1]^{d-1}} f(x)\,dx$
  For some typical choice of $a$, $b$, $\Sigma$, $d = 3$; $\mu \approx 0.6763$

  Rel. Error Tolerance | Method            | Median Error | Worst 10% Accuracy | Worst 10% n | Time (s)
  1E-2                 | IID Genz          | 5E-4         | 100%               | 8.1E4       | 0.020
  1E-2                 | Scr. Sobol' Genz  | 4E-5         | 100%               | 1.0E3       | 0.005
  1E-2                 | Bayes. Latt. Genz | 5E-5         | 100%               | 4.1E3       | 0.023
  1E-3                 | IID Genz          | 9E-5         | 100%               | 2.0E6       | 0.400
  1E-3                 | Scr. Sobol' Genz  | 2E-5         | 100%               | 2.0E3       | 0.006
  1E-3                 | Bayes. Latt. Genz | 3E-7         | 100%               | 6.6E4       | 0.076
  1E-4                 | Scr. Sobol' Genz  | 4E-7         | 100%               | 1.6E4       | 0.018
  1E-4                 | Bayes. Latt. Genz | 6E-9         | 100%               | 5.2E5       | 0.580
  1E-4                 | Bayes. Latt. Smth. Genz | 1E-7   | 100%               | 3.3E4       | 0.047
  • 33. Bayesian Inference for Logistic Regression
  $\frac{\mu_j}{\mu_0} = \frac{\int_{\mathcal{X}} b_j\, L(b \mid \text{data})\,\varrho(b)\,db}{\int_{\mathcal{X}} L(b \mid \text{data})\,\varrho(b)\,db}$ = Bayesian posterior expectation of $\beta_j$
  $y_i \sim \mathrm{Ber}\left( \frac{\exp(\beta_1 + \beta_2 x_i)}{1 + \exp(\beta_1 + \beta_2 x_i)} \right), \qquad \varrho(b) = \frac{\exp(-(b_1^2 + b_2^2)/2)}{2\pi}$
  $\varepsilon = 0.001$, $n = 9{,}000$-$17{,}000$
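A sketch of the ratio estimator for $\mu_j / \mu_0$: draw $b$ from the $N(0, I_2)$ prior $\varrho$ via scrambled Sobol' points and weight by the likelihood. The data set below is a synthetic placeholder; GAIL's automatic rules would additionally choose $n$ to meet the tolerance $\varepsilon$.

    import numpy as np
    from scipy.stats import norm, qmc

    # Sketch: QMC ratio estimator of the posterior mean of (beta1, beta2)
    # under a standard normal prior, with synthetic logistic-regression data.
    rng = np.random.default_rng(1)
    x_data = rng.uniform(-2, 2, 30)
    y_data = rng.binomial(1, 1.0 / (1.0 + np.exp(-(0.5 + 1.0 * x_data))))

    def log_likelihood(b):                     # b has shape (n, 2)
        eta = b[:, :1] + b[:, 1:] * x_data[None, :]   # beta1 + beta2 * x_i
        return (y_data * eta - np.log1p(np.exp(eta))).sum(axis=1)

    n = 2**13
    b = norm.ppf(qmc.Sobol(2, scramble=True, seed=9).random(n))  # prior draws
    ll = log_likelihood(b)
    L = np.exp(ll - ll.max())                  # stabilized likelihood weights
    post_mean = (b * L[:, None]).sum(axis=0) / L.sum()   # mu_j / mu_0
    print(post_mean)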
  • 34. Summary
  - Reproducing kernel Hilbert spaces are a powerful tool for characterizing the error of quasi-Monte Carlo methods
  - Digital nets and lattice node sets have faster decaying discrepancies, which suggests faster decaying cubature errors
  - How well these bounds fit what is observed depends on whether the integrand is typical within the space of integrands
  - Practical stopping criteria are an area of active research
  • 35. Thank you! Slides available at www.slideshare.net/fjhickernell/hickernell-fred-samsi-tutorial-aug-2017
  • 36. References I
  - Anitescu, M., J. Chen, and M. Stein. 2016. An inversion-free estimating equation approach for Gaussian process models, J. Comput. Graph. Statist.
  - Caflisch, R. E., W. Morokoff, and A. Owen. 1997. Valuation of mortgage backed securities using Brownian bridges to reduce effective dimension, J. Comput. Finance 1, 27-46.
  - Choi, S.-C. T., Y. Ding, F. J. H., L. Jiang, Ll. A. Jiménez Rugama, X. Tong, Y. Zhang, and X. Zhou. 2013-2017. GAIL: Guaranteed Automatic Integration Library (versions 1.0-2.2).
  - Diaconis, P. 1988. Bayesian numerical analysis, Statistical Decision Theory and Related Topics IV, Papers from the 4th Purdue Symp., West Lafayette, Indiana 1986, pp. 163-175.
  - Dick, J. and F. Pillichshammer. 2010. Digital Nets and Sequences: Discrepancy Theory and Quasi-Monte Carlo Integration, Cambridge University Press, Cambridge.
  - Fasshauer, G. E. and M. McCourt. 2015. Kernel-Based Approximation Methods Using MATLAB, Interdisciplinary Mathematical Sciences, vol. 19, World Scientific Publishing Co., Singapore.
  - Genz, A. 1993. Comparison of methods for the computation of multivariate normal probabilities, Computing Science and Statistics 25, 400-405.
  - Griebel, M., F. Y. Kuo, and I. H. Sloan. 2010. The smoothing effect of the ANOVA decomposition, J. Complexity 26, 523-551.
  - Griebel, M., F. Y. Kuo, and I. H. Sloan. 2016+. The ANOVA decomposition of a non-smooth function of infinitely many variables can have every term smooth, Math. Comp., in press.
  • 37. References II
  - Heinrich, S., F. J. H., and R. X. Yue. 2004. Optimal quadrature for Haar wavelet spaces, Math. Comp. 73, 259-277.
  - H., F. J. 1998. A generalized discrepancy and quadrature error bound, Math. Comp. 67, 299-322.
  - H., F. J. 2000. What affects the accuracy of quasi-Monte Carlo quadrature?, Monte Carlo and Quasi-Monte Carlo Methods 1998, pp. 16-55.
  - H., F. J. 2017+. The trio identity for quasi-Monte Carlo error analysis, Monte Carlo and Quasi-Monte Carlo Methods: MCQMC, Stanford, USA, August 2016, submitted for publication, arXiv:1702.01487.
  - H., F. J. and Ll. A. Jiménez Rugama. 2016. Reliable adaptive cubature using digital sequences, Monte Carlo and Quasi-Monte Carlo Methods: MCQMC, Leuven, Belgium, April 2014, pp. 367-383, arXiv:1410.8615 [math.NA].
  - H., F. J. and R. X. Yue. 2000. The mean square discrepancy of scrambled (t, s)-sequences, SIAM J. Numer. Anal. 38, 1089-1112.
  - Jiménez Rugama, Ll. A. and F. J. H. 2016. Adaptive multidimensional integration based on rank-1 lattices, Monte Carlo and Quasi-Monte Carlo Methods: MCQMC, Leuven, Belgium, April 2014, pp. 407-422, arXiv:1411.1966.
  - Li, D. 2016. Reliable Quasi-Monte Carlo with Control Variates, Master's thesis.
  - Meng, X. 2017+. Statistical paradises and paradoxes in big data, in preparation.
  - Niederreiter, H. 1992. Random Number Generation and Quasi-Monte Carlo Methods, CBMS-NSF Regional Conference Series in Applied Mathematics, SIAM, Philadelphia.
  - Novak, E. and H. Woźniakowski. 2010. Tractability of Multivariate Problems, Volume II: Standard Information for Functionals, EMS Tracts in Mathematics, European Mathematical Society, Zürich.
  • 38. References III
  - O'Hagan, A. 1991. Bayes-Hermite quadrature, J. Statist. Plann. Inference 29, 245-260.
  - Owen, A. B. 1997. Scrambled net variance for integrals of smooth functions, Ann. Stat. 25, 1541-1562.
  - Paskov, S. and J. Traub. 1995. Faster valuation of financial derivatives, J. Portfolio Management 22, 113-120.
  - Rasmussen, C. E. and Z. Ghahramani. 2003. Bayesian Monte Carlo, Advances in Neural Information Processing Systems, pp. 489-496.
  - Ritter, K. 2000. Average-Case Analysis of Numerical Problems, Lecture Notes in Mathematics, vol. 1733, Springer-Verlag, Berlin.
  - Sloan, I. H. and H. Woźniakowski. 1998. When are quasi-Monte Carlo algorithms efficient for high dimensional integrals?, J. Complexity 14, 1-33.