Department of Computing + Mathematical Sciences
California Institute of Technology
About ∞-Dimensional Geometric MCMC
Shiwei Lan
December 14, 2017 · SAMSI, Duke

a. Beskos, Alexandros, Mark Girolami, Shiwei Lan, Patrick E. Farrell, and Andrew M. Stuart (2017). Geometric MCMC for Infinite-Dimensional Inverse Problems. Journal of Computational Physics, 335:327–351.
b. Holbrook, Andrew, Shiwei Lan, Jeffrey Streets, and Babak Shahbaba. The non-parametric Fisher information geometry and the chi-square process density prior. arXiv:1707.03117.
Table of contents
1. Introduction
2. Geometric Monte Carlo on Infinite Dimensions
3. Dimension Reduction
4. ∞-dimensional Spherical Hamiltonian Monte Carlo
5. Conclusion
S.Lan | ∞-Dimensional Geometric MCMC
2
Geometric Monte Carlo
powerful
[Figure: sample paths of RWM, HMC, RHMC and LMC on a 2-d target in the (θ1, θ2) plane, each axis over (−2, 2).]
Dimension-Independent MCMC
robust
[Figure: acceptance rate versus dimension (20–200) for HMC, LMC, ∞-HMC and ∞-mHMC.]
Bayesian Inverse Problems
Let X, Y be separable Banach spaces, equipped with their Borel σ-algebras, and G : X → Y measurable. We want to find u from y, where
\[
y = G(u) + \eta
\]
Prior: u ∼ µ0 on X. Noise: η ∼ Q0 on Y with η ⊥ u.
Assume y|u ∼ Q_u ≪ Q0 for u µ0-a.s., and define the likelihood through
\[
\frac{dQ_u}{dQ_0}(y) = \exp(-\Phi(u; y))
\]
Theorem (Bayes' Theorem)
Assume for y Q0-a.s.: Z := \int_X \exp(-\Phi(u; y))\,\mu_0(du) > 0. Then the posterior u|y exists under ν, denoted by µ^y. Furthermore, µ^y ≪ µ0 and, for y ν-a.s.,
\[
\frac{d\mu^{y}}{d\mu_0}(u) = \frac{1}{Z}\exp(-\Phi(u; y)).
\]
Metropolis-Hastings in general state space
(Tierney, 1998)
Target µ(du), e.g. µ^y(du) ∝ exp(−Φ(u; y)) µ0(du), with µ0 = N(0, C).
Denote by Q(u, du′) the proposal probability kernel.
Denote the transition measure and its transpose as follows:
\[
\nu(du, du') = \mu(du)\,Q(u, du'), \qquad \nu^{T}(du, du') = \nu(du', du)
\]
When ν ∼ ν^T (i.e. ν ≪ ν^T and ν^T ≪ ν) with density
\[
r(u, u') = \frac{d\nu}{d\nu^{T}}(u, u')
\]
the acceptance probability is
\[
a(u, u') = 1 \wedge r(u, u')
\]
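The accept/reject rule above can be sketched in a few lines. A minimal illustration, using a symmetric random walk on a standard-normal target, for which log r(u, u′) = Φ(u) − Φ(u′) with Φ(u) = u²/2; all names here are illustrative, not from the slides:

```python
import numpy as np

def mh_step(u, propose, log_r, rng):
    """One Metropolis-Hastings step with acceptance a(u, u') = 1 ∧ r(u, u')."""
    u_new = propose(u, rng)
    if np.log(rng.uniform()) < min(0.0, log_r(u, u_new)):
        return u_new, True          # accepted
    return u, False                 # rejected: stay at u

rng = np.random.default_rng(0)
propose = lambda u, rng: u + 0.5 * rng.standard_normal()   # symmetric kernel
log_r = lambda u, v: 0.5 * u**2 - 0.5 * v**2               # log r for N(0,1) target

u, samples = 0.0, []
for _ in range(5000):
    u, _ = mh_step(u, propose, log_r, rng)
    samples.append(u)
samples = np.array(samples)
```

Because the kernel is symmetric, r reduces to the ratio of target densities, and the chain's samples approximate N(0, 1).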
Random Walk
preconditioned Crank-Nicolson
(Cotter et al., 2013)
A finite-difference method used for numerically solving PDEs (Richtmyer and Morton, 1994).
Modified Random Walk Metropolis (RWM):
Given u, sample ξ ∼ N(0, C).
Make proposal
\[
u' = \sqrt{1-\beta^{2}}\,u + \beta\xi \tag{1}
\]
Accept u′ with probability
\[
a(u, u') = 1 \wedge r(u, u'), \qquad r(u, u') = \exp(-\Phi(u') + \Phi(u)) \tag{2}
\]
Well-defined independently of dimension; mixes faster than RWM.
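In K-L coordinates, where C is diagonal, the pCN update is a one-liner; a minimal sketch (the prior decay, toy Gaussian potential, and all variable names are illustrative assumptions). Note that the acceptance ratio (2) involves only the potential Φ, never the prior density:

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 1.0 / (1.0 + np.arange(50))            # assumed prior std devs; C = diag(lam^2)
y = rng.standard_normal(50)                   # synthetic observations
Phi = lambda u: 0.5 * np.sum((y - u) ** 2)    # toy Gaussian potential

beta = 0.2
u = lam * rng.standard_normal(50)             # start from a prior draw
n_accept = 0
for _ in range(2000):
    xi = lam * rng.standard_normal(50)        # xi ~ N(0, C)
    u_prop = np.sqrt(1.0 - beta**2) * u + beta * xi      # proposal (1)
    if np.log(rng.uniform()) < Phi(u) - Phi(u_prop):     # acceptance (2)
        u, n_accept = u_prop, n_accept + 1
accept_rate = n_accept / 2000
```

Because the proposal preserves the prior N(0, C) exactly, the acceptance rate stays bounded away from zero as more K-L modes are added.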
+ Gradient
∞-dimensional MALA
(Beskos et al., 2008)
Consider the Langevin SDE
\[
\frac{du}{dt} = -\frac{1}{2}K\left(C^{-1}u + D\Phi(u)\right) + \sqrt{K}\,\frac{dW}{dt} \tag{3}
\]
A semi-implicit scheme to discretize the above SDE,
\[
\frac{u' - u}{h} = -\frac{1}{2}KC^{-1}\left[(1-\theta)u + \theta u'\right] - \frac{\alpha}{2}KD\Phi(u) + \sqrt{\frac{K}{h}}\,\xi, \qquad \xi \sim N(0, I)
\]
may be simplified to give
\[
u' = A_\theta u + B_\theta v, \qquad v = \sqrt{C}\,\xi - \frac{\alpha}{2}\sqrt{hKC}\,D\Phi(u)
\]
\[
A_\theta = \left(I + \tfrac{\theta}{2}hKC^{-1}\right)^{-1}\left(I - \tfrac{1-\theta}{2}hKC^{-1}\right), \qquad B_\theta = \left(I + \tfrac{\theta}{2}hKC^{-1}\right)^{-1}\sqrt{hKC^{-1}} \tag{4}
\]
where α = 1; α = 0 ⟹ pCN.
K = I (IA) or K = C (PIA).
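With diagonal C in K-L coordinates, the operators in (4) act elementwise, so the proposal is a few lines of array arithmetic. The sketch below assumes K = C (PIA) and θ = 1/2 (illustrative names throughout); the sanity check verifies that setting α = 0 recovers pCN, i.e. the proposal preserves the prior N(0, C):

```python
import numpy as np

def inf_mala_proposal(u, dPhi, c, h, theta=0.5, alpha=1.0, rng=None):
    """One proposal of scheme (4) with diagonal C = diag(c) and K = C,
    so that K C^{-1} = I and all operators are scalars."""
    rng = rng or np.random.default_rng()
    xi = rng.standard_normal(u.shape)
    A = (1.0 - (1.0 - theta) * h / 2.0) / (1.0 + theta * h / 2.0)   # A_theta
    B = np.sqrt(h) / (1.0 + theta * h / 2.0)                        # B_theta
    v = np.sqrt(c) * xi - 0.5 * alpha * np.sqrt(h) * c * dPhi(u)    # v, with sqrt(hKC) = sqrt(h) c
    return A * u + B * v

# sanity check: with alpha = 0 this is pCN and preserves u ~ N(0, C)
rng = np.random.default_rng(2)
c = np.array([1.0, 0.25])
u0 = np.sqrt(c) * rng.standard_normal(2)
draws = np.array([inf_mala_proposal(u0, lambda u: 0.0 * u, c, h=0.5,
                                    alpha=0.0, rng=rng) for _ in range(5000)])
```

For θ = 1/2 the coefficients satisfy A² + B² = 1 exactly, which is what makes the α = 0 proposal prior-preserving in any dimension.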
∞-dimensional HMC
(Beskos et al., 2011)
Consider the Hamiltonian differential equation
\[
\frac{d^{2}u}{dt^{2}} + K\left(C^{-1}u + D\Phi(u)\right) = 0, \qquad v := \frac{du}{dt}\Big|_{t=0} \sim N(0, K) \tag{5}
\]
Let K = C and f(u) := −DΦ(u). A Störmer-Verlet/splitting scheme (Verlet, 1967; Neal, 2010) is used to discretize (5):
\[
v^{-} = v + \frac{t}{2}\,Cf(u), \qquad
\begin{pmatrix} u' \\ v^{+} \end{pmatrix} =
\begin{pmatrix} \cos t & \sin t \\ -\sin t & \cos t \end{pmatrix}
\begin{pmatrix} u \\ v^{-} \end{pmatrix}, \qquad
v' = v^{+} + \frac{t}{2}\,Cf(u') \tag{6}
\]
This defines a mapping Ψt : (u, v) → (u′, v′).
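On whitened coordinates (C = I) one step of the splitting (6) is: a half "kick" with f(u) = −DΦ(u), an exact rotation for the Gaussian part, and another half kick. A minimal sketch (toy potential and names are illustrative) that checks the time-reversibility of the resulting map Ψt, i.e. that negating the velocity and stepping again undoes a step:

```python
import numpy as np

def stormer_verlet_step(u, v, f, t):
    """One step of scheme (6) with C = I: kick / exact rotation / kick."""
    v = v + 0.5 * t * f(u)
    u, v = (np.cos(t) * u + np.sin(t) * v,
            -np.sin(t) * u + np.cos(t) * v)
    v = v + 0.5 * t * f(u)
    return u, v

rng = np.random.default_rng(3)
f = lambda u: -u**3                      # f = -DPhi for the toy potential Phi = u^4/4
u0, v0 = rng.standard_normal(4), rng.standard_normal(4)
u1, v1 = stormer_verlet_step(u0, v0, f, 0.1)
u2, v2 = stormer_verlet_step(u1, -v1, f, 0.1)   # flip velocity and step again
```

Reversibility (together with volume arguments) is what allows the simple Metropolis correction 1 ∧ exp(−∆H) for HMC-type proposals.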
+ Metric
∞-dimensional manifold MALA
(Beskos, 2014)
Choose K = G(u)^{-1} in (3) (Girolami and Calderhead, 2011):
\[
\frac{du}{dt} = -\frac{1}{2}G(u)^{-1}\left(C^{-1}u + D\Phi(u)\right) + \sqrt{G(u)^{-1}}\,\frac{dW}{dt} \tag{7}
\]
Define G : X → L(X, X) and g : X → X by
\[
g(u) = -G(u)^{-1}\left((C^{-1} - G(u))u + D\Phi(u)\right) \tag{8}
\]
A similar semi-implicit scheme (θ = 1/2) yields
\[
u' = a_{\frac12}\,u + b_{\frac12}\,v, \qquad v = \frac{\sqrt{h}}{2}\,g(u) + G(u)^{-\frac12}\xi
\]
\[
a_{\frac12} = \frac{1 - h/4}{1 + h/4}, \qquad b_{\frac12} = \frac{\sqrt{h}}{1 + h/4} \tag{9}
\]
Note a²_{1/2} + b²_{1/2} = 1, following the Crank-Nicolson scheme (Richtmyer and Morton, 1994).
∞-dimensional manifold MALA
(Beskos, 2014)
Assumption (1)
N(0, G(u)^{-1}) ∼ N(0, C) (equivalence of measures) for µ0-a.s. u.
Assumption (2)
For µ0-a.s. u, we have g(u) ∈ Im(C^{1/2}), i.e. N(g(u), C) ∼ N(0, C).
∞-dimensional manifold MALA
(Beskos, 2014)
Theorem (Beskos (2014))
Denote ν(du, du′) = µ(du)Q(u, du′) with Q(u, du′) the transition kernel of (9). Under Assumptions 1 and 2, ν ∼ ν^T, and the acceptance probability has the following form:
\[
a(u, u') = 1 \wedge \frac{d\nu^{T}}{d\nu}(u, u'), \qquad
\frac{d\nu^{T}}{d\nu}(u, u') = \frac{\exp\{-\Phi(u')\}\,\lambda(\rho^{-1}(u; u'); u')}{\exp\{-\Phi(u)\}\,\lambda(\rho^{-1}(u'; u); u)} \tag{10}
\]
where ρ^{-1}(u′; u) = [(1 + h/4)u′ − (1 − h/4)u]/√h and λ(w; u) is calculated as follows:
\[
\lambda(w; u) = \frac{dN\!\left(\tfrac{\sqrt{h}}{2}g(u),\, G(u)^{-1}\right)}{dN(0, C)}(w)
= \frac{dN\!\left(\tfrac{\sqrt{h}}{2}g(u),\, G(u)^{-1}\right)}{dN(0, G(u)^{-1})}(w)\;
\frac{dN(0, G(u)^{-1})}{dN(0, C)}(w)
\]
\[
= \exp\left\{\left\langle \tfrac{\sqrt{h}}{2}G^{\frac12}(u)g(u),\, G^{\frac12}(u)w \right\rangle - \frac{h}{8}\left|G^{\frac12}(u)g(u)\right|^{2}\right\}
\cdot \exp\left\{-\frac{1}{2}\left|G^{\frac12}(u)w\right|^{2} + \frac{1}{2}\left|C^{-\frac12}w\right|^{2}\right\}
\cdot \left|C^{\frac12}G^{\frac12}(u)\right|
\]
∞-dimensional manifold HMC
Let K = G(u)^{-1} in (5) (Girolami and Calderhead, 2011):
\[
\frac{d^{2}u}{dt^{2}} + u = g(u), \qquad v := \frac{du}{dt}\Big|_{t=0} \sim N(0, G(u)^{-1}) \tag{11}
\]
Discretize (11) by a splitting method of the form
\[
v^{-} = v + \frac{t}{2}\,g(u), \qquad
\begin{pmatrix} u' \\ v^{+} \end{pmatrix} =
\begin{pmatrix} \cos\tau & \sin\tau \\ -\sin\tau & \cos\tau \end{pmatrix}
\begin{pmatrix} u \\ v^{-} \end{pmatrix}, \qquad
v' = v^{+} + \frac{t}{2}\,g(u') \tag{12}
\]
where τ(t)/t → 1 as t → 0.
Evolving (u, v) through k folds of (12) defines Ψ^k_t : (u0, v0) → (uk, vk).
∞-dimensional manifold HMC
Theorem
Denote ν(du, du′) = µ(du)Q(u, du′) with Q(u, du′) the transition kernel of Ψ^k_t. Under Assumptions 1 and 2, ν ∼ ν^T, and the acceptance probability has the following form:
\[
1 \wedge \exp(-\Delta H(u, v)) \tag{13}
\]
where
\[
\begin{aligned}
\Delta H(u, v) &= H(\Psi^{(k)}_{t}(u, v)) - H(u, v) \\
&= \Phi(u_k) - \Phi(u_0) + \frac{1}{2}\Big(\big|(G(u_k) - C^{-1})^{\frac12} v_k\big|^{2} - \big|(G(u_0) - C^{-1})^{\frac12} v_0\big|^{2} - \log|G(u_k)| + \log|G(u_0)|\Big) \\
&\quad + \frac{t^{2}}{8}\Big(\big|C^{-\frac12} g(u_0)\big|^{2} - \big|C^{-\frac12} g(u_k)\big|^{2}\Big) + \frac{t}{2}\sum_{\ell=0}^{k-1}\Big(\langle v_\ell,\, C^{-1} g(u_\ell)\rangle + \langle v_{\ell+1},\, C^{-1} g(u_{\ell+1})\rangle\Big)
\end{aligned}
\]
Connections between ∞-dimensional MCMC
Remark
As a direct corollary, when only one integration step is used (I = 1), ∞-(m)HMC reduces to ∞-(m)MALA.

∞-MALA --[position-dependent pre-conditioner K(u)]--> ∞-mMALA --[h = 4]--> SN
∞-MALA --[multiple steps (k > 1)]--> ∞-HMC
∞-mMALA --[multiple steps (k > 1)]--> ∞-mHMC
∞-HMC --[position-dependent pre-conditioner K(u)]--> ∞-mHMC
Geometric Monte Carlo
weight (computation) control
[Figure: sample paths of RWM, HMC, RHMC and LMC on a 2-d target in the (θ1, θ2) plane, each axis over (−2, 2).]
Dimension Reduction
intrinsic low-dimensional subspace
Dimension Reduction
intrinsic low-dimensional subspace
A particular choice of K(u) = G(u)^{-1} could be
\[
K(u) = (C^{-1} + H(u))^{-1}, \qquad H(u) = \mathbb{E}^{Y|u}\left[D\Phi(u) \otimes D\Phi(u)\right] \tag{14}
\]
It is computationally infeasible to update the Fisher metric H(u) at each u.
Given an eigenbasis {φi(x)}, we define the projection operator Pr as follows:
\[
P_r : X \to X_r, \qquad u \mapsto u_r := \sum_{i=1}^{r} \varphi_i \langle \varphi_i, u \rangle \tag{15}
\]
Truncate H(u) on the r-dimensional subspace Xr ⊂ X (X = Xr ⊕ X⊥):
\[
H_r(u)(v, w) = \left\langle P_r v,\ \mathbb{E}^{Y|u}\left[D_r\Phi(u)\,D_r\Phi(u)^{T}\right] P_r w \right\rangle, \qquad \forall v, w \in X \tag{16}
\]
K(u)^{-1} can then be approximated:
\[
K(u)^{-1} \approx H_r(u) + C^{-1} \tag{17}
\]
Prior-Based Dimension Reduction
With the eigen-pairs {λi, ui(x)}, the prior covariance operator C can be written and approximated as
\[
C = U\Lambda U^{*} \approx U_r \Lambda_r U_r^{*} \tag{18}
\]
Then we can approximate the posterior covariance:
\[
K(u) = (C^{-1} + H(u))^{-1} \approx C + U_r \Lambda_r^{\frac12}(D_r - I_r)\Lambda_r^{\frac12} U_r^{*} \tag{19}
\]
where D_r := (\hat{H}_r(u) + I_r)^{-1} and \hat{H}_r(u) := \Lambda_r^{\frac12} U_r^{*} H(u) U_r \Lambda_r^{\frac12}.
By applying U_r^{*} and U_⊥^{*} to the Langevin SDE (3) respectively, we get
\[
du_r = -\frac{1}{2}D_r u_r\,dt - \frac{\gamma_r}{2}D_r \nabla_{u_r}\Phi(u; y)\,dt + \sqrt{D_r}\,dW_r \tag{20a}
\]
\[
du_\perp = -\frac{1}{2}u_\perp\,dt - \frac{\gamma_\perp}{2}\nabla_{u_\perp}\Phi(u; y)\,dt + dW_\perp \tag{20b}
\]
where u_r = \Lambda_r^{-\frac12} U_r^{*} u and u_\perp = \Lambda_\perp^{-\frac12} U_\perp^{*} u.
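The low-rank formula (19) corrects the prior covariance only along the r leading prior directions. A small linear-algebra sketch (illustrative sizes and names; here H is deliberately constructed to be supported on span(U_r), in which case (19) is exact, as the check confirms):

```python
import numpy as np

rng = np.random.default_rng(4)
d, r = 30, 5
lam2 = 1.0 / (1.0 + np.arange(d)) ** 2             # assumed prior eigenvalue decay
U = np.linalg.qr(rng.standard_normal((d, d)))[0]   # prior eigenvectors
C = U @ np.diag(lam2) @ U.T
Ur, Lr = U[:, :r], np.diag(lam2[:r])

J = rng.standard_normal((3, r)) @ Ur.T             # Hessian supported on span(Ur)
H = J.T @ J                                        # stand-in for E[DPhi DPhi^T]

# hat(H)_r = Lambda_r^{1/2} U_r^* H U_r Lambda_r^{1/2},  D_r = (hat(H)_r + I_r)^{-1}
Hr_hat = np.sqrt(Lr) @ Ur.T @ H @ Ur @ np.sqrt(Lr)
Dr = np.linalg.inv(Hr_hat + np.eye(r))
K_approx = C + Ur @ np.sqrt(Lr) @ (Dr - np.eye(r)) @ np.sqrt(Lr) @ Ur.T   # (19)
K_exact = np.linalg.inv(np.linalg.inv(C) + H)
```

When H also acts outside span(U_r), (19) simply leaves those directions at their prior covariance, which is the point of the dimension reduction.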
Karhunen-Loève Expansion
Let X be the Hilbert space H = L²(D; R) on a bounded open D ⊂ R^d with Lipschitz boundary, with inner product ⟨·, ·⟩ and norm ‖·‖.
Consider the following covariance operator C:
\[
C := \sigma^{2}(\alpha I - \Delta)^{-s} \tag{21}
\]
Let {λ²i} and {φi(x)} denote the eigenvalues and eigenfunctions of C. If s > d/2 and λi ≍ i^{−s/d}, then C defines a Gaussian measure N(0, C) such that each draw u(·) ∼ N(0, C) admits the Karhunen-Loève (K-L) expansion (Adler, 1981; Bogachev, 1998; Dashti and Stuart, 2015):
\[
u(x) = \sum_{i=0}^{+\infty} u_i \lambda_i \varphi_i(x), \qquad u_i \overset{iid}{\sim} N(0, 1) \tag{22}
\]
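A prior draw via a truncated K-L expansion can be sketched directly. Below D = (0, 1) with a Dirichlet Laplacian is assumed (an illustrative choice, not fixed by the slide), so φ_i(x) = √2 sin(iπx) and −∆φ_i = (iπ)²φ_i; with d = 1 any s > 1/2 gives a well-defined draw:

```python
import numpy as np

def kl_draw(x, n_terms=200, sigma=1.0, alpha=1.0, s=1.0, rng=None):
    """Truncated K-L draw u(x) = sum_i u_i lam_i phi_i(x), u_i ~ N(0,1),
    for C = sigma^2 (alpha I - Laplacian)^{-s} on (0,1), Dirichlet BCs."""
    rng = rng or np.random.default_rng()
    i = np.arange(1, n_terms + 1)
    lam = sigma * (alpha + (np.pi * i) ** 2) ** (-s / 2.0)  # lam_i = sqrt(eigenvalue of C)
    ui = rng.standard_normal(n_terms)                        # u_i iid N(0, 1)
    phi = np.sqrt(2.0) * np.sin(np.pi * np.outer(i, x))      # Dirichlet eigenfunctions
    return (ui * lam) @ phi

x = np.linspace(0.0, 1.0, 101)
u = kl_draw(x, rng=np.random.default_rng(5))
```

The decay λ_i ≍ i^{−s} is what makes the sum converge in L², so truncation at a few hundred terms is usually adequate for such priors.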
Laminar Jet
formulation of the inverse problem
We consider the following 2-d Navier-Stokes equations.
Momentum equation:
\[
-\operatorname{div}\left(\nu(\nabla u + \nabla u^{T})\right) + u \cdot \nabla u + \nabla p = 0 \tag{23}
\]
where u = (u, v) is the velocity field and p the pressure.
Continuity equation:
\[
\operatorname{div} u = 0 \tag{24}
\]
Denote by σ_n = −p n + ν(∇u + ∇u^T) · n the boundary traction. The boundary conditions are
\[
\begin{aligned}
u \cdot n = -\theta(y),\ \ \sigma_n \times n = 0 &\quad \text{on } I := \{x = 0,\ y \in (-\tfrac12 L_y, \tfrac12 L_y)\} \\
\sigma_n + \beta (u \cdot n)_{-}\, u = 0 &\quad \text{on } O := \{x = L_x,\ y \in (-\tfrac12 L_y, \tfrac12 L_y)\} \\
u \cdot n = 0,\ \ \sigma_n \times n = 0 &\quad \text{on } B := \{x \in (0, L_x),\ y = \pm\tfrac12 L_y\}
\end{aligned}
\]
where (u · n)_− = (u · n − |u · n|)/2, and β ∈ (0, 1] is the backflow stabilization parameter.
Laminar Jet
results
Laminar Jet problem: the locations of observations (left) and the forward PDE solution at the true unknown (right). Fluid viscosity ν = 3 × 10⁻².
Laminar Jet
results
Laminar Jet test problem: true inflow velocity profiles with different numbers of modes (left) and posterior estimates given by different algorithms (right). The shaded region shows the 95% credible band estimated with samples from ∞-mHMC.
Laminar Jet
results
Laminar Jet problem: trace plots of the data-misfit function evaluated at each sample (left; values have been offset for easier comparison) and the autocorrelation of the data-misfits as a function of lag (right).
Laminar Jet
results
Method AP s/iter ESS(min,med,max) minESS/s spdup PDEsolns
pCN 0.61 1.29 ( 5.24, 6.66, 13.33) 4.05E-04 1.00 22004
∞-MALA 0.66 1.68 ( 5.38, 6.62, 19.53) 3.21E-04 0.79 33005
∞-HMC 0.72 3.81 ( 5.41, 7.43, 16.44) 1.42E-04 0.35 82466
∞-mMALA 0.68 5.97 ( 1075.24, 2851.22, 3867.08) 1.80E-02 44.47 2233205
∞-mHMC 0.58 13.33 ( 2058.42, 3394.17, 4560.03) 1.54E-02 38.13 5575696
split ∞-mMALA 0.57 3.66 ( 1079.55, 1805.89, 2395.13) 2.95E-02 72.82 693065
split ∞-mHMC 0.60 6.88 ( 2749.63, 3974.36, 5498.03) 4.00E-02 98.67 1721694
Sampling efficiency in the Laminar Jet problem. AP: acceptance probability; s/iter: seconds per iteration; ESS(min,med,max): effective sample size (minimum, median, maximum); minESS/s: minimum ESS per second. Dimension = 100. The HMC algorithms use a number of leapfrog steps randomly chosen between 1 and 4. The split algorithms truncate the K-L expansion at r = 30.
Likelihood-Informed Dimension Reduction
By whitening the coordinates, v_i(x) = C^{-1/2} u_i(x), the generalized eigenproblem
\[
H(u)\,u_i(x) = \lambda_i C^{-1} u_i(x) \tag{25}
\]
can be shown to be equivalent to the eigenproblem for the prior-preconditioned Gauss-Newton Hessian (ppGNH) H(v):
\[
H(v)\,v_i(x) = C^{\frac12} H(u)\, C^{\frac12} v_i(x) = \lambda_i v_i(x) \tag{26}
\]
The local posterior covariance is approximated (with D_r := (I_r + Λ_r)^{-1}) by
\[
K(v) = (I + H(v))^{-1} \approx I + V_r (D_r - I_r) V_r^{*} \tag{27}
\]
By applying V_r^{*} and V_⊥^{*} to the whitened Langevin SDE respectively,
\[
dv_r = -\frac{1}{2}D_r v_r\,dt - \frac{\gamma_r}{2}D_r \nabla_{v_r}\Phi(v; y)\,dt + \sqrt{D_r}\,dW_r \tag{28a}
\]
\[
dv_\perp = -\frac{1}{2}v_\perp\,dt - \frac{\gamma_\perp}{2}\nabla_{v_\perp}\Phi(v; y)\,dt + dW_\perp \tag{28b}
\]
where v_r = V_r^{*} v and v_\perp = V_\perp^{*} v.
Connection to DILI
(Cui et al., 2016)
By considering the approximation H(v) ≈ V_r Λ_r V_r^{*}, we have the posterior covariance projected onto the r-dimensional subspace:
\[
K_r = \mathrm{Cov}_\mu[V_r^{*} v] = V_r^{*}(I + H(v))^{-1} V_r \approx D_r := (I_r + \Lambda_r)^{-1} \tag{29}
\]
DILI, in contrast, computes the posterior covariance in the subspace, K_r := Cov_µ[V_r^{*} v], empirically.
Both approaches use the approximate posterior covariance
\[
K_v = \mathrm{Cov}_\mu[v] \approx V_r K_r V_r^{*} + I - V_r V_r^{*} \tag{30}
\]
Since we work directly with (29), the empirical calculation of K_r is avoided.
The approximation (29) is already in diagonal form, so the rotation Ψ_r = V_r W_r used in DILI is not needed.
Both capture similar geometric features of the subspace.
Elliptic Inverse Problem
formulation of the inverse problem
Consider the elliptic inverse problem as in DILI (Cui et al., 2016), defined on the unit square domain Ω = [0, 1]²:
\[
\begin{aligned}
-\nabla \cdot (k(s)\,\nabla p(s)) &= f(s), && s \in \Omega \\
\langle k(s)\,\nabla p(s),\, n(s) \rangle &= 0, && s \in \partial\Omega \\
\int_{\partial\Omega} p(s)\,dl(s) &= 0
\end{aligned} \tag{31}
\]
where k(s) is the transmissivity field, p(s) is the potential function, f(s) is the forcing term, and n(s) is the outward normal to the boundary.
The inverse problem is to infer u = log k from observations {y_n}.
25 observations arise from the solutions (solved on an 80 × 80 mesh) contaminated by additive Gaussian error:
\[
y_n = p(x_n) + \varepsilon_n, \qquad \varepsilon_n \sim N(0, \sigma_y^{2}), \qquad \mathrm{SNR} := \max_s\{u(s)\}/\sigma_y = 100
\]
Elliptic Inverse Problem
results
Elliptic inverse problem (SNR = 100): Bayesian posterior mean estimates of the
log-transmissivity field u(s) based on 2000 samples by various MCMC algorithms;
the upper-left corner shows the MAP estimate.
Elliptic Inverse Problem
results
Method h AP s/iter ESS(min,med,max) minESS/s spdup PDEsolns
pCN 0.01 0.57 0.99 (2.67,6.95,37.79) 0.0013 1.00 2501
∞-MALA 0.04 0.61 1.62 (4.32,15.34,51.45) 0.0013 0.99 5002
∞-HMC 0.04 0.59 3.52 (24.36,92.13,184.84) 0.0035 2.57 12342
DR-∞-mMALA 0.52 0.67 8.85 (127.25,210.84,460.07) 0.0072 5.34 80032
DR-∞-mHMC 0.25 0.56 22.97 (190.2,322.29,687.11) 0.0041 3.08 198176
DILI (0.1, 0.2) 0.69 1.59 (30.52,133.67,221.97) 0.0096 7.13 6612
aDR-∞-mMALA 0.25 0.71 1.61 (12.09,89.17,174.36) 0.0037 2.79 6612
aDR-∞-mHMC 0.10 0.69 3.63 (70.99,234.42,364.31) 0.0098 7.26 14056
Sampling efficiency in elliptic inverse problem (SNR=100). Column labels are as
follows. h: step size(s) used for making MCMC proposal; AP: average acceptance
probability; s/iter: average seconds per iteration; ESS(min,med,max): minimum,
median, maximum of Effective Sample Size across all posterior coordinates;
min(ESS)/s: minimum ESS per second; spdup: speed-up relative to base pCN
algorithm; PDEsolns: number of PDE solutions during execution.
∞-dimensional Spherical Hamiltonian Monte Carlo
[Figure: MCMC samples in the original θ coordinates (panel A) and on the sphere (panel B). The sphere embedding augments θ with one extra coordinate: θ̃ = (θ, θ_{D+1}) with θ_{D+1} = (1 − ‖θ‖₂²)^{0.5}.]
Representation of Probability Densities
Consider probability distributions over a smooth manifold D. Having fixed a background measure µ, let
\[
\mathcal{P} := \left\{ p : D \to \mathbb{R} \,\middle|\, p \ge 0,\ \int_D p(x)\,\mu(dx) = 1 \right\} \tag{32}
\]
Define the following nonparametric Fisher metric on the tangent space T_p\mathcal{P} := \{\varphi \in C^\infty(D) \mid \int_D \varphi(x)\,\mu(dx) = 0\}:
\[
g_F(\varphi, \psi)_p := \int_D \frac{\varphi(x)\psi(x)}{p(x)}\,\mu(dx) \tag{33}
\]
The square-root mapping S : (\mathcal{P}, g_F) \to (\mathcal{Q}, \langle\cdot,\cdot\rangle_2), S(p) = q = 2\sqrt{p}, is a Riemannian isometry, where \mathcal{Q} is the ∞-dimensional sphere in L²(D):
\[
\mathcal{Q} := \left\{ q : D \to \mathbb{R} \,\middle|\, \int_D q(x)^{2}\,\mu(dx) = 1 \right\}, \qquad \langle f, h \rangle_2 = \int_D f\,h\,d\mu(x) \tag{34}
\]
Nonparametric Density Modeling
It is easier to work with the root density q ∈ Q (e.g. it has a clean geodesic flow).
Restrict the Gaussian process prior q(·) ∼ GP(0, K(·)) to Q, where the covariance operator K = σ²(α − ∆)^{−s} has eigen-pairs {λ²_i, φ_i(x)}, i = 1, 2, ....
Then ‖q(x)‖² = ‖Σ_{i=1}^{∞} q_i φ_i(x)‖² = 1 with q_i ∼ N(0, λ²_i) implies
\[
\|q\|_2^{2} := \sum_{i=1}^{\infty} q_i^{2} = 1, \quad \text{i.e. } q := (q_i) \in \mathcal{S}^{\infty} \tag{35}
\]
Given data x = \{x_n \in D\}_{n=1}^{N}, we have the posterior density
\[
\pi(q\,|\,x) \propto \pi(q)\,\pi(x\,|\,q) = \prod_{i=1}^{\infty} \exp\!\left(-\,q_i^{2}/(2\lambda_i^{2})\right)\,\delta_{\|q\|_2}(1)\,\prod_{n=1}^{N} q^{2}(x_n) \tag{36}
\]
Sampling q = (q_i) can be done by spherical HMC (Lan et al., 2014).
S.Lan | ∞-Dimensional Geometric MCMC
37
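A draw from the truncated prior underlying (35)–(36) can be sketched as follows. This is illustrative code, not from the paper: the truncation level, hyperparameters (σ², α, s), and the cosine eigenbasis on [0, 1] (as for a Neumann Laplacian) are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

I, sigma2, alpha, s = 100, 1.0, 1.0, 1.2        # truncation and hypothetical hyperparameters

i = np.arange(1, I + 1)
lam2 = sigma2 * (alpha + (i * np.pi) ** 2) ** (-s)   # eigenvalues of K = sigma^2 (alpha - Laplacian)^{-s}

q = rng.normal(0.0, np.sqrt(lam2))   # q_i ~ N(0, lambda_i^2)
q /= np.linalg.norm(q)               # project onto the sphere: sum_i q_i^2 = 1, as in (35)

x = np.linspace(0.0, 1.0, 501)
phi = np.sqrt(2.0) * np.cos(np.outer(i, np.pi * x))  # orthonormal cosine eigenbasis on [0, 1]
p = (q @ phi) ** 2                   # chi^2-process density draw: p(x) = q(x)^2

dx = x[1] - x[0]
print(np.sum(p) * dx)                # ≈ 1, up to truncation/discretization error
```

The squared field p = q² is automatically nonnegative and, by orthonormality of the basis, integrates to ‖q‖² = 1, which is why the sphere constraint makes q² a valid density.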
Spherical Hamiltonian Monte Carlo
(Lan et al., 2014)

Truncate q := (q_i) at I, q := (q_i)_{i=1}^I. Define the total energy

    E(q, v) := U(q) + K(v; q) = Ũ(q) + K₀(v; q)                             (37)
    Ũ(q) := U(q) − ½ log|G(q_{−I})| = −log π(q|x) + log|q_I|                (38)
    K₀(v; q) := ½ v_{−I}ᵀ G(q_{−I}) v_{−I} = ½ vᵀv                          (39)

Discretizing the Hamiltonian equation results in

    v⁻ = v − (h/2) P(q) g(q)

    [ q  ]   [ r      0    ] [  cos(‖v⁻‖₂ r⁻¹ h)   sin(‖v⁻‖₂ r⁻¹ h) ] [ r⁻¹      0     ] [ q  ]
    [ v⁺ ] = [ 0   ‖v⁻‖₂   ] [ −sin(‖v⁻‖₂ r⁻¹ h)   cos(‖v⁻‖₂ r⁻¹ h) ] [ 0    ‖v⁻‖₂⁻¹  ] [ v⁻ ]

    v′ = v⁺ − (h/2) P(q′) g(q′)                                             (40)

where g(q) := ∇_q Ũ(q), P(q) := I_D − r⁻² qqᵀ.
S.Lan | ∞-Dimensional Geometric MCMC
38
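A runnable sketch of one integrator step of the form (40), on the unit sphere (r = 1): this is an illustration, not the authors' implementation, and the target gradient grad_U below (a von Mises–Fisher-like Ũ(q) = −µ·q) is a hypothetical stand-in.

```python
import numpy as np

def leapfrog_sphere(q, v, h, grad_U):
    """One geodesic-leapfrog step of spherical HMC on the unit sphere (r = 1)."""
    P = lambda q: np.eye(q.size) - np.outer(q, q)   # tangent projection P(q) = I - q q^T

    v = v - 0.5 * h * P(q) @ grad_U(q)              # half step in velocity
    nv = np.linalg.norm(v)                          # rotate (q, v) along the great circle
    q, v = (q * np.cos(nv * h) + (v / nv) * np.sin(nv * h),
            -q * nv * np.sin(nv * h) + v * np.cos(nv * h))
    v = v - 0.5 * h * P(q) @ grad_U(q)              # half step in velocity
    return q, v

mu = np.array([1.0, 2.0, 2.0]) / 3.0                # hypothetical target: U(q) = -mu . q
grad_U = lambda q: -mu

rng = np.random.default_rng(1)
q = np.array([0.0, 0.0, 1.0])
v = (np.eye(3) - np.outer(q, q)) @ rng.normal(size=3)  # velocity in the tangent space at q

for _ in range(20):
    q, v = leapfrog_sphere(q, v, h=0.05, grad_U=grad_U)

print(np.linalg.norm(q))   # stays ≈ 1: the update keeps q on the sphere
```

The middle rotation is the exact geodesic flow of the free dynamics, which is why ‖q‖ is preserved to machine precision regardless of step size — the feature the matrix in (40) encodes.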
Nonparametric Density Modeling
simulation result
(Plot-point residue from a multi-panel scatter figure removed; axes: Dimension 1, Dimension 2.)
Figure: The contours (black) of the posterior median from 1,000 draws of the χ²-process
density sampler. Each posterior is conditioned on 1,000 data points (red).
S.Lan | ∞-Dimensional Geometric MCMC
39
Nonparametric Density Modeling
results on real data
(Plot-point residue removed; panel titles: "Pointwise posterior mean", "Posterior predictive sample"; axes: Dimension 1, Dimension 2.)
Figure: Hutchings' bramble canes data: the first figure depicts the 823 bramble canes (red),
a heatmap of the pointwise posterior mean (black is low, white is high), and a single contour
at density 0.3 (blue) including all but a few points. The second figure shows 823 draws from
the χ²-process density posterior predictive distribution, obtained using a rejection
sampling scheme.
S.Lan | ∞-Dimensional Geometric MCMC
40
Conclusion

Geometric information (gradient, metric) helps MCMC mixing, but comes at a
computational cost that can be alleviated by dimension reduction.

Key: find an intrinsic finite-dimensional subspace where most information
concentrates and manifold MCMC can be applied.

Light-computation algorithms, e.g. pCN and preconditioned MALA, can be
applied in the less informative complementary subspace.

MCMC defined on manifolds can help solve challenging statistical problems,
e.g. density estimation and modeling covariance/correlation matrices.
S.Lan | ∞-Dimensional Geometric MCMC
40
References I
Adler, R. J. (1981). The geometry of random fields, volume 62. SIAM.
Beskos, A. (2014). A stable manifold MCMC method for high dimensions. Statistics & Probability Letters,
90:46–52.
Beskos, A., Pinski, F. J., Sanz-Serna, J. M., and Stuart, A. M. (2011). Hybrid Monte-Carlo on Hilbert spaces.
Stochastic Processes and their Applications, 121:2201–2230.
Beskos, A., Roberts, G., Stuart, A., and Voss, J. (2008). MCMC methods for diffusion bridges. Stochastics and
Dynamics, 8(03):319–350.
Bogachev, V. I. (1998). Gaussian measures. Number 62. American Mathematical Soc.
Cotter, S. L., Roberts, G. O., Stuart, A., White, D., et al. (2013). MCMC methods for functions: modifying old
algorithms to make them faster. Statistical Science, 28(3):424–446.
Cui, T., Law, K. J., and Marzouk, Y. M. (2016). Dimension-independent likelihood-informed MCMC. Journal of
Computational Physics, 304:109–137.
Dashti, M. and Stuart, A. M. (2015). The Bayesian Approach To Inverse Problems. ArXiv e-prints.
Girolami, M. and Calderhead, B. (2011). Riemann manifold Langevin and Hamiltonian Monte Carlo methods.
Journal of the Royal Statistical Society, Series B, (with discussion) 73(2):123–214.
Lan, S., Zhou, B., and Shahbaba, B. (2014). Spherical Hamiltonian Monte Carlo for constrained target
distributions. In Proceedings of The 31st International Conference on Machine Learning, pages 629–637,
Beijing, China.
Neal, R. M. (2010). MCMC using Hamiltonian dynamics. In Brooks, S., Gelman, A., Jones, G., and Meng, X. L.,
editors, Handbook of Markov Chain Monte Carlo. Chapman and Hall/CRC.
S.Lan | ∞-Dimensional Geometric MCMC
41
References II
Richtmyer, R. D. and Morton, K. W. (1994). Difference methods for initial-value problems. Malabar, Fla.:
Krieger Publishing Co., 2nd ed.
Tierney, L. (1998). A note on Metropolis–Hastings kernels for general state spaces. Annals of Applied
Probability, pages 1–9.
Verlet, L. (1967). Computer "Experiments" on Classical Fluids. I. Thermodynamical Properties of
Lennard-Jones Molecules. Phys. Rev., 159(1):98–103.
S.Lan | ∞-Dimensional Geometric MCMC
Thank you !
 
Unbiased Bayes for Big Data
Unbiased Bayes for Big DataUnbiased Bayes for Big Data
Unbiased Bayes for Big Data
 
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
 
Nested sampling
Nested samplingNested sampling
Nested sampling
 
ABC in Venezia
ABC in VeneziaABC in Venezia
ABC in Venezia
 
Bayesian hybrid variable selection under generalized linear models
Bayesian hybrid variable selection under generalized linear modelsBayesian hybrid variable selection under generalized linear models
Bayesian hybrid variable selection under generalized linear models
 
2018 MUMS Fall Course - Bayesian inference for model calibration in UQ - Ralp...
2018 MUMS Fall Course - Bayesian inference for model calibration in UQ - Ralp...2018 MUMS Fall Course - Bayesian inference for model calibration in UQ - Ralp...
2018 MUMS Fall Course - Bayesian inference for model calibration in UQ - Ralp...
 
Rao-Blackwellisation schemes for accelerating Metropolis-Hastings algorithms
Rao-Blackwellisation schemes for accelerating Metropolis-Hastings algorithmsRao-Blackwellisation schemes for accelerating Metropolis-Hastings algorithms
Rao-Blackwellisation schemes for accelerating Metropolis-Hastings algorithms
 
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
 
Maximum likelihood estimation of regularisation parameters in inverse problem...
Maximum likelihood estimation of regularisation parameters in inverse problem...Maximum likelihood estimation of regularisation parameters in inverse problem...
Maximum likelihood estimation of regularisation parameters in inverse problem...
 
ABC with Wasserstein distances
ABC with Wasserstein distancesABC with Wasserstein distances
ABC with Wasserstein distances
 
Approximate Bayesian Computation with Quasi-Likelihoods
Approximate Bayesian Computation with Quasi-LikelihoodsApproximate Bayesian Computation with Quasi-Likelihoods
Approximate Bayesian Computation with Quasi-Likelihoods
 
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
 
accurate ABC Oliver Ratmann
accurate ABC Oliver Ratmannaccurate ABC Oliver Ratmann
accurate ABC Oliver Ratmann
 
MCMC and likelihood-free methods
MCMC and likelihood-free methodsMCMC and likelihood-free methods
MCMC and likelihood-free methods
 
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
 

Similar to QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop, About Infinite-Dimensional Geometric MCMC - Shiwei Lan, Dec 14, 2017

Adaptive Restore algorithm & importance Monte Carlo
Adaptive Restore algorithm & importance Monte CarloAdaptive Restore algorithm & importance Monte Carlo
Adaptive Restore algorithm & importance Monte Carlo
Christian Robert
 
Trilinear embedding for divergence-form operators
Trilinear embedding for divergence-form operatorsTrilinear embedding for divergence-form operators
Trilinear embedding for divergence-form operators
VjekoslavKovac1
 
Computational Information Geometry on Matrix Manifolds (ICTP 2013)
Computational Information Geometry on Matrix Manifolds (ICTP 2013)Computational Information Geometry on Matrix Manifolds (ICTP 2013)
Computational Information Geometry on Matrix Manifolds (ICTP 2013)
Frank Nielsen
 
On approximating the Riemannian 1-center
On approximating the Riemannian 1-centerOn approximating the Riemannian 1-center
On approximating the Riemannian 1-center
Frank Nielsen
 
From planar maps to spatial topology change in 2d gravity
From planar maps to spatial topology change in 2d gravityFrom planar maps to spatial topology change in 2d gravity
From planar maps to spatial topology change in 2d gravity
Timothy Budd
 
Litvinenko low-rank kriging +FFT poster
Litvinenko low-rank kriging +FFT  posterLitvinenko low-rank kriging +FFT  poster
Litvinenko low-rank kriging +FFT poster
Alexander Litvinenko
 
QMC: Operator Splitting Workshop, Proximal Algorithms in Probability Spaces -...
QMC: Operator Splitting Workshop, Proximal Algorithms in Probability Spaces -...QMC: Operator Splitting Workshop, Proximal Algorithms in Probability Spaces -...
QMC: Operator Splitting Workshop, Proximal Algorithms in Probability Spaces -...
The Statistical and Applied Mathematical Sciences Institute
 
Computing f-Divergences and Distances of\\ High-Dimensional Probability Densi...
Computing f-Divergences and Distances of\\ High-Dimensional Probability Densi...Computing f-Divergences and Distances of\\ High-Dimensional Probability Densi...
Computing f-Divergences and Distances of\\ High-Dimensional Probability Densi...
Alexander Litvinenko
 
PCA on graph/network
PCA on graph/networkPCA on graph/network
PCA on graph/network
Daisuke Yoneoka
 
Hierarchical matrices for approximating large covariance matries and computin...
Hierarchical matrices for approximating large covariance matries and computin...Hierarchical matrices for approximating large covariance matries and computin...
Hierarchical matrices for approximating large covariance matries and computin...
Alexander Litvinenko
 
Quasistatic Fracture using Nonliner-Nonlocal Elastostatics with an Analytic T...
Quasistatic Fracture using Nonliner-Nonlocal Elastostatics with an Analytic T...Quasistatic Fracture using Nonliner-Nonlocal Elastostatics with an Analytic T...
Quasistatic Fracture using Nonliner-Nonlocal Elastostatics with an Analytic T...
Patrick Diehl
 
Tucker tensor analysis of Matern functions in spatial statistics
Tucker tensor analysis of Matern functions in spatial statistics Tucker tensor analysis of Matern functions in spatial statistics
Tucker tensor analysis of Matern functions in spatial statistics
Alexander Litvinenko
 
Natalini nse slide_giu2013
Natalini nse slide_giu2013Natalini nse slide_giu2013
Natalini nse slide_giu2013Madd Maths
 
Accelerating Pseudo-Marginal MCMC using Gaussian Processes
Accelerating Pseudo-Marginal MCMC using Gaussian ProcessesAccelerating Pseudo-Marginal MCMC using Gaussian Processes
Accelerating Pseudo-Marginal MCMC using Gaussian Processes
Matt Moores
 
On maximal and variational Fourier restriction
On maximal and variational Fourier restrictionOn maximal and variational Fourier restriction
On maximal and variational Fourier restriction
VjekoslavKovac1
 
Low rank tensor approximation of probability density and characteristic funct...
Low rank tensor approximation of probability density and characteristic funct...Low rank tensor approximation of probability density and characteristic funct...
Low rank tensor approximation of probability density and characteristic funct...
Alexander Litvinenko
 
QMC Error SAMSI Tutorial Aug 2017
QMC Error SAMSI Tutorial Aug 2017QMC Error SAMSI Tutorial Aug 2017
QMC Error SAMSI Tutorial Aug 2017
Fred J. Hickernell
 
QMC: Transition Workshop - Applying Quasi-Monte Carlo Methods to a Stochastic...
QMC: Transition Workshop - Applying Quasi-Monte Carlo Methods to a Stochastic...QMC: Transition Workshop - Applying Quasi-Monte Carlo Methods to a Stochastic...
QMC: Transition Workshop - Applying Quasi-Monte Carlo Methods to a Stochastic...
The Statistical and Applied Mathematical Sciences Institute
 
dhirota_hone_corrected
dhirota_hone_correcteddhirota_hone_corrected
dhirota_hone_correctedAndy Hone
 
Bayesian inference on mixtures
Bayesian inference on mixturesBayesian inference on mixtures
Bayesian inference on mixtures
Christian Robert
 

Similar to QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop, About Infinite-Dimensional Geometric MCMC - Shiwei Lan, Dec 14, 2017 (20)

Adaptive Restore algorithm & importance Monte Carlo
Adaptive Restore algorithm & importance Monte CarloAdaptive Restore algorithm & importance Monte Carlo
Adaptive Restore algorithm & importance Monte Carlo
 
Trilinear embedding for divergence-form operators
Trilinear embedding for divergence-form operatorsTrilinear embedding for divergence-form operators
Trilinear embedding for divergence-form operators
 
Computational Information Geometry on Matrix Manifolds (ICTP 2013)
Computational Information Geometry on Matrix Manifolds (ICTP 2013)Computational Information Geometry on Matrix Manifolds (ICTP 2013)
Computational Information Geometry on Matrix Manifolds (ICTP 2013)
 
On approximating the Riemannian 1-center
On approximating the Riemannian 1-centerOn approximating the Riemannian 1-center
On approximating the Riemannian 1-center
 
From planar maps to spatial topology change in 2d gravity
From planar maps to spatial topology change in 2d gravityFrom planar maps to spatial topology change in 2d gravity
From planar maps to spatial topology change in 2d gravity
 
Litvinenko low-rank kriging +FFT poster
Litvinenko low-rank kriging +FFT  posterLitvinenko low-rank kriging +FFT  poster
Litvinenko low-rank kriging +FFT poster
 
QMC: Operator Splitting Workshop, Proximal Algorithms in Probability Spaces -...
QMC: Operator Splitting Workshop, Proximal Algorithms in Probability Spaces -...QMC: Operator Splitting Workshop, Proximal Algorithms in Probability Spaces -...
QMC: Operator Splitting Workshop, Proximal Algorithms in Probability Spaces -...
 
Computing f-Divergences and Distances of\\ High-Dimensional Probability Densi...
Computing f-Divergences and Distances of\\ High-Dimensional Probability Densi...Computing f-Divergences and Distances of\\ High-Dimensional Probability Densi...
Computing f-Divergences and Distances of\\ High-Dimensional Probability Densi...
 
PCA on graph/network
PCA on graph/networkPCA on graph/network
PCA on graph/network
 
Hierarchical matrices for approximating large covariance matries and computin...
Hierarchical matrices for approximating large covariance matries and computin...Hierarchical matrices for approximating large covariance matries and computin...
Hierarchical matrices for approximating large covariance matries and computin...
 
Quasistatic Fracture using Nonliner-Nonlocal Elastostatics with an Analytic T...
Quasistatic Fracture using Nonliner-Nonlocal Elastostatics with an Analytic T...Quasistatic Fracture using Nonliner-Nonlocal Elastostatics with an Analytic T...
Quasistatic Fracture using Nonliner-Nonlocal Elastostatics with an Analytic T...
 
Tucker tensor analysis of Matern functions in spatial statistics
Tucker tensor analysis of Matern functions in spatial statistics Tucker tensor analysis of Matern functions in spatial statistics
Tucker tensor analysis of Matern functions in spatial statistics
 
Natalini nse slide_giu2013
Natalini nse slide_giu2013Natalini nse slide_giu2013
Natalini nse slide_giu2013
 
Accelerating Pseudo-Marginal MCMC using Gaussian Processes
Accelerating Pseudo-Marginal MCMC using Gaussian ProcessesAccelerating Pseudo-Marginal MCMC using Gaussian Processes
Accelerating Pseudo-Marginal MCMC using Gaussian Processes
 
On maximal and variational Fourier restriction
On maximal and variational Fourier restrictionOn maximal and variational Fourier restriction
On maximal and variational Fourier restriction
 
Low rank tensor approximation of probability density and characteristic funct...
Low rank tensor approximation of probability density and characteristic funct...Low rank tensor approximation of probability density and characteristic funct...
Low rank tensor approximation of probability density and characteristic funct...
 
QMC Error SAMSI Tutorial Aug 2017
QMC Error SAMSI Tutorial Aug 2017QMC Error SAMSI Tutorial Aug 2017
QMC Error SAMSI Tutorial Aug 2017
 
QMC: Transition Workshop - Applying Quasi-Monte Carlo Methods to a Stochastic...
QMC: Transition Workshop - Applying Quasi-Monte Carlo Methods to a Stochastic...QMC: Transition Workshop - Applying Quasi-Monte Carlo Methods to a Stochastic...
QMC: Transition Workshop - Applying Quasi-Monte Carlo Methods to a Stochastic...
 
dhirota_hone_corrected
dhirota_hone_correcteddhirota_hone_corrected
dhirota_hone_corrected
 
Bayesian inference on mixtures
Bayesian inference on mixturesBayesian inference on mixtures
Bayesian inference on mixtures
 

More from The Statistical and Applied Mathematical Sciences Institute

Causal Inference Opening Workshop - Latent Variable Models, Causal Inference,...
Causal Inference Opening Workshop - Latent Variable Models, Causal Inference,...Causal Inference Opening Workshop - Latent Variable Models, Causal Inference,...
Causal Inference Opening Workshop - Latent Variable Models, Causal Inference,...
The Statistical and Applied Mathematical Sciences Institute
 
2019 Fall Series: Special Guest Lecture - 0-1 Phase Transitions in High Dimen...
2019 Fall Series: Special Guest Lecture - 0-1 Phase Transitions in High Dimen...2019 Fall Series: Special Guest Lecture - 0-1 Phase Transitions in High Dimen...
2019 Fall Series: Special Guest Lecture - 0-1 Phase Transitions in High Dimen...
The Statistical and Applied Mathematical Sciences Institute
 
Causal Inference Opening Workshop - Causal Discovery in Neuroimaging Data - F...
Causal Inference Opening Workshop - Causal Discovery in Neuroimaging Data - F...Causal Inference Opening Workshop - Causal Discovery in Neuroimaging Data - F...
Causal Inference Opening Workshop - Causal Discovery in Neuroimaging Data - F...
The Statistical and Applied Mathematical Sciences Institute
 
Causal Inference Opening Workshop - Smooth Extensions to BART for Heterogeneo...
Causal Inference Opening Workshop - Smooth Extensions to BART for Heterogeneo...Causal Inference Opening Workshop - Smooth Extensions to BART for Heterogeneo...
Causal Inference Opening Workshop - Smooth Extensions to BART for Heterogeneo...
The Statistical and Applied Mathematical Sciences Institute
 
Causal Inference Opening Workshop - A Bracketing Relationship between Differe...
Causal Inference Opening Workshop - A Bracketing Relationship between Differe...Causal Inference Opening Workshop - A Bracketing Relationship between Differe...
Causal Inference Opening Workshop - A Bracketing Relationship between Differe...
The Statistical and Applied Mathematical Sciences Institute
 
Causal Inference Opening Workshop - Testing Weak Nulls in Matched Observation...
Causal Inference Opening Workshop - Testing Weak Nulls in Matched Observation...Causal Inference Opening Workshop - Testing Weak Nulls in Matched Observation...
Causal Inference Opening Workshop - Testing Weak Nulls in Matched Observation...
The Statistical and Applied Mathematical Sciences Institute
 
Causal Inference Opening Workshop - Difference-in-differences: more than meet...
Causal Inference Opening Workshop - Difference-in-differences: more than meet...Causal Inference Opening Workshop - Difference-in-differences: more than meet...
Causal Inference Opening Workshop - Difference-in-differences: more than meet...
The Statistical and Applied Mathematical Sciences Institute
 
Causal Inference Opening Workshop - New Statistical Learning Methods for Esti...
Causal Inference Opening Workshop - New Statistical Learning Methods for Esti...Causal Inference Opening Workshop - New Statistical Learning Methods for Esti...
Causal Inference Opening Workshop - New Statistical Learning Methods for Esti...
The Statistical and Applied Mathematical Sciences Institute
 
Causal Inference Opening Workshop - Bipartite Causal Inference with Interfere...
Causal Inference Opening Workshop - Bipartite Causal Inference with Interfere...Causal Inference Opening Workshop - Bipartite Causal Inference with Interfere...
Causal Inference Opening Workshop - Bipartite Causal Inference with Interfere...
The Statistical and Applied Mathematical Sciences Institute
 
Causal Inference Opening Workshop - Bridging the Gap Between Causal Literatur...
Causal Inference Opening Workshop - Bridging the Gap Between Causal Literatur...Causal Inference Opening Workshop - Bridging the Gap Between Causal Literatur...
Causal Inference Opening Workshop - Bridging the Gap Between Causal Literatur...
The Statistical and Applied Mathematical Sciences Institute
 
Causal Inference Opening Workshop - Some Applications of Reinforcement Learni...
Causal Inference Opening Workshop - Some Applications of Reinforcement Learni...Causal Inference Opening Workshop - Some Applications of Reinforcement Learni...
Causal Inference Opening Workshop - Some Applications of Reinforcement Learni...
The Statistical and Applied Mathematical Sciences Institute
 
Causal Inference Opening Workshop - Bracketing Bounds for Differences-in-Diff...
Causal Inference Opening Workshop - Bracketing Bounds for Differences-in-Diff...Causal Inference Opening Workshop - Bracketing Bounds for Differences-in-Diff...
Causal Inference Opening Workshop - Bracketing Bounds for Differences-in-Diff...
The Statistical and Applied Mathematical Sciences Institute
 
Causal Inference Opening Workshop - Assisting the Impact of State Polcies: Br...
Causal Inference Opening Workshop - Assisting the Impact of State Polcies: Br...Causal Inference Opening Workshop - Assisting the Impact of State Polcies: Br...
Causal Inference Opening Workshop - Assisting the Impact of State Polcies: Br...
The Statistical and Applied Mathematical Sciences Institute
 
Causal Inference Opening Workshop - Experimenting in Equilibrium - Stefan Wag...
Causal Inference Opening Workshop - Experimenting in Equilibrium - Stefan Wag...Causal Inference Opening Workshop - Experimenting in Equilibrium - Stefan Wag...
Causal Inference Opening Workshop - Experimenting in Equilibrium - Stefan Wag...
The Statistical and Applied Mathematical Sciences Institute
 
Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...
Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...
Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...
The Statistical and Applied Mathematical Sciences Institute
 
Causal Inference Opening Workshop - Bayesian Nonparametric Models for Treatme...
Causal Inference Opening Workshop - Bayesian Nonparametric Models for Treatme...Causal Inference Opening Workshop - Bayesian Nonparametric Models for Treatme...
Causal Inference Opening Workshop - Bayesian Nonparametric Models for Treatme...
The Statistical and Applied Mathematical Sciences Institute
 
2019 Fall Series: Special Guest Lecture - Adversarial Risk Analysis of the Ge...
2019 Fall Series: Special Guest Lecture - Adversarial Risk Analysis of the Ge...2019 Fall Series: Special Guest Lecture - Adversarial Risk Analysis of the Ge...
2019 Fall Series: Special Guest Lecture - Adversarial Risk Analysis of the Ge...
The Statistical and Applied Mathematical Sciences Institute
 
2019 Fall Series: Professional Development, Writing Academic Papers…What Work...
2019 Fall Series: Professional Development, Writing Academic Papers…What Work...2019 Fall Series: Professional Development, Writing Academic Papers…What Work...
2019 Fall Series: Professional Development, Writing Academic Papers…What Work...
The Statistical and Applied Mathematical Sciences Institute
 
2019 GDRR: Blockchain Data Analytics - Machine Learning in/for Blockchain: Fu...
2019 GDRR: Blockchain Data Analytics - Machine Learning in/for Blockchain: Fu...2019 GDRR: Blockchain Data Analytics - Machine Learning in/for Blockchain: Fu...
2019 GDRR: Blockchain Data Analytics - Machine Learning in/for Blockchain: Fu...
The Statistical and Applied Mathematical Sciences Institute
 
2019 GDRR: Blockchain Data Analytics - QuTrack: Model Life Cycle Management f...
2019 GDRR: Blockchain Data Analytics - QuTrack: Model Life Cycle Management f...2019 GDRR: Blockchain Data Analytics - QuTrack: Model Life Cycle Management f...
2019 GDRR: Blockchain Data Analytics - QuTrack: Model Life Cycle Management f...
The Statistical and Applied Mathematical Sciences Institute
 

More from The Statistical and Applied Mathematical Sciences Institute (20)

Causal Inference Opening Workshop - Latent Variable Models, Causal Inference,...
Causal Inference Opening Workshop - Latent Variable Models, Causal Inference,...Causal Inference Opening Workshop - Latent Variable Models, Causal Inference,...
Causal Inference Opening Workshop - Latent Variable Models, Causal Inference,...
 
2019 Fall Series: Special Guest Lecture - 0-1 Phase Transitions in High Dimen...
2019 Fall Series: Special Guest Lecture - 0-1 Phase Transitions in High Dimen...2019 Fall Series: Special Guest Lecture - 0-1 Phase Transitions in High Dimen...
2019 Fall Series: Special Guest Lecture - 0-1 Phase Transitions in High Dimen...
 
Causal Inference Opening Workshop - Causal Discovery in Neuroimaging Data - F...
Causal Inference Opening Workshop - Causal Discovery in Neuroimaging Data - F...Causal Inference Opening Workshop - Causal Discovery in Neuroimaging Data - F...
Causal Inference Opening Workshop - Causal Discovery in Neuroimaging Data - F...
 
Causal Inference Opening Workshop - Smooth Extensions to BART for Heterogeneo...
Causal Inference Opening Workshop - Smooth Extensions to BART for Heterogeneo...Causal Inference Opening Workshop - Smooth Extensions to BART for Heterogeneo...
Causal Inference Opening Workshop - Smooth Extensions to BART for Heterogeneo...
 
Causal Inference Opening Workshop - A Bracketing Relationship between Differe...
Causal Inference Opening Workshop - A Bracketing Relationship between Differe...Causal Inference Opening Workshop - A Bracketing Relationship between Differe...
Causal Inference Opening Workshop - A Bracketing Relationship between Differe...
 
Causal Inference Opening Workshop - Testing Weak Nulls in Matched Observation...
Causal Inference Opening Workshop - Testing Weak Nulls in Matched Observation...Causal Inference Opening Workshop - Testing Weak Nulls in Matched Observation...
Causal Inference Opening Workshop - Testing Weak Nulls in Matched Observation...
 
Causal Inference Opening Workshop - Difference-in-differences: more than meet...
Causal Inference Opening Workshop - Difference-in-differences: more than meet...Causal Inference Opening Workshop - Difference-in-differences: more than meet...
Causal Inference Opening Workshop - Difference-in-differences: more than meet...
 
Causal Inference Opening Workshop - New Statistical Learning Methods for Esti...
Causal Inference Opening Workshop - New Statistical Learning Methods for Esti...Causal Inference Opening Workshop - New Statistical Learning Methods for Esti...
Causal Inference Opening Workshop - New Statistical Learning Methods for Esti...
 
Causal Inference Opening Workshop - Bipartite Causal Inference with Interfere...
Causal Inference Opening Workshop - Bipartite Causal Inference with Interfere...Causal Inference Opening Workshop - Bipartite Causal Inference with Interfere...
Causal Inference Opening Workshop - Bipartite Causal Inference with Interfere...
 
Causal Inference Opening Workshop - Bridging the Gap Between Causal Literatur...
Causal Inference Opening Workshop - Bridging the Gap Between Causal Literatur...Causal Inference Opening Workshop - Bridging the Gap Between Causal Literatur...
Causal Inference Opening Workshop - Bridging the Gap Between Causal Literatur...
 
Causal Inference Opening Workshop - Some Applications of Reinforcement Learni...
Causal Inference Opening Workshop - Some Applications of Reinforcement Learni...Causal Inference Opening Workshop - Some Applications of Reinforcement Learni...
Causal Inference Opening Workshop - Some Applications of Reinforcement Learni...
 
Causal Inference Opening Workshop - Bracketing Bounds for Differences-in-Diff...
Causal Inference Opening Workshop - Bracketing Bounds for Differences-in-Diff...Causal Inference Opening Workshop - Bracketing Bounds for Differences-in-Diff...
Causal Inference Opening Workshop - Bracketing Bounds for Differences-in-Diff...
 
Causal Inference Opening Workshop - Assisting the Impact of State Polcies: Br...
Causal Inference Opening Workshop - Assisting the Impact of State Polcies: Br...Causal Inference Opening Workshop - Assisting the Impact of State Polcies: Br...
Causal Inference Opening Workshop - Assisting the Impact of State Polcies: Br...
 
Causal Inference Opening Workshop - Experimenting in Equilibrium - Stefan Wag...
Causal Inference Opening Workshop - Experimenting in Equilibrium - Stefan Wag...Causal Inference Opening Workshop - Experimenting in Equilibrium - Stefan Wag...
Causal Inference Opening Workshop - Experimenting in Equilibrium - Stefan Wag...
 
Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...
Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...
Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...
 
Causal Inference Opening Workshop - Bayesian Nonparametric Models for Treatme...
Causal Inference Opening Workshop - Bayesian Nonparametric Models for Treatme...Causal Inference Opening Workshop - Bayesian Nonparametric Models for Treatme...
Causal Inference Opening Workshop - Bayesian Nonparametric Models for Treatme...
 
2019 Fall Series: Special Guest Lecture - Adversarial Risk Analysis of the Ge...
2019 Fall Series: Special Guest Lecture - Adversarial Risk Analysis of the Ge...2019 Fall Series: Special Guest Lecture - Adversarial Risk Analysis of the Ge...
2019 Fall Series: Special Guest Lecture - Adversarial Risk Analysis of the Ge...
 
2019 Fall Series: Professional Development, Writing Academic Papers…What Work...
2019 Fall Series: Professional Development, Writing Academic Papers…What Work...2019 Fall Series: Professional Development, Writing Academic Papers…What Work...
2019 Fall Series: Professional Development, Writing Academic Papers…What Work...
 
2019 GDRR: Blockchain Data Analytics - Machine Learning in/for Blockchain: Fu...
2019 GDRR: Blockchain Data Analytics - Machine Learning in/for Blockchain: Fu...2019 GDRR: Blockchain Data Analytics - Machine Learning in/for Blockchain: Fu...
2019 GDRR: Blockchain Data Analytics - Machine Learning in/for Blockchain: Fu...
 
2019 GDRR: Blockchain Data Analytics - QuTrack: Model Life Cycle Management f...
2019 GDRR: Blockchain Data Analytics - QuTrack: Model Life Cycle Management f...2019 GDRR: Blockchain Data Analytics - QuTrack: Model Life Cycle Management f...
2019 GDRR: Blockchain Data Analytics - QuTrack: Model Life Cycle Management f...
 

Recently uploaded

PART A. Introduction to Costumer Service
PART A. Introduction to Costumer ServicePART A. Introduction to Costumer Service
PART A. Introduction to Costumer Service
PedroFerreira53928
 
How to Create Map Views in the Odoo 17 ERP
How to Create Map Views in the Odoo 17 ERPHow to Create Map Views in the Odoo 17 ERP
How to Create Map Views in the Odoo 17 ERP
Celine George
 
Supporting (UKRI) OA monographs at Salford.pptx
Supporting (UKRI) OA monographs at Salford.pptxSupporting (UKRI) OA monographs at Salford.pptx
Supporting (UKRI) OA monographs at Salford.pptx
Jisc
 
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop, About Infinite-Dimensional Geometric MCMC - Shiwei Lan, Dec 14, 2017

  • 1. Department of Computing + Mathematical Sciences California Institute of Technology About ∞-Dimensional Geometric MCMC a b Shiwei Lan DEC 14, 2017 SAMSI, DUKE aBeskos, Alexandros, Mark Girolami, Shiwei Lan, Patrick E. Farrell, and Andrew M. Stuart (2017), Geometric MCMC for Infinite-Dimensional Inverse Problems. Journal of Computational Physics, 335:327–351. bHolbrook, Andrew, Shiwei Lan, Jeffrey Streets, and Babak Shahbaba. The non-parametric Fisher information geometry and the chi-square process density prior. arXiv:1707.03117.
  • 2. 1 Table of contents 1. Introduction 2. Geometric Monte Carlo on Infinite Dimensions 3. Dimension Reduction 4. ∞-dimensional Spherical Hamiltonian Monte Carlo 5. Conclusion S.Lan | ∞-Dimensional Geometric MCMC
  • 3. 2 Geometric Monte Carlo — powerful
[Figure: sample paths of RWM, HMC, RHMC, and LMC on a 2-d target; θ1 and θ2 each range over [−2, 2]]
  • 4. 3 Dimension-Independent MCMC — robust
[Figure: acceptance rate versus dimension (20–200) for HMC, LMC, ∞-HMC, and ∞-mHMC]
  • 5. 4 Bayesian Inverse Problems
Let X, Y be separable Banach spaces equipped with their Borel σ-algebras, and G : X → Y measurable. We want to recover u from y, where
y = G(u) + η
Prior: u ∼ µ0 on X. Noise: η ∼ Q0 on Y, with η ⊥ u.
Assume y|u ∼ Qu ≪ Q0 for µ0-a.s. u, and define the likelihood via
dQu/dQ0 (y) = exp(−Φ(u; y))
Theorem (Bayes' Theorem)
Assume for Q0-a.s. y: Z := ∫_X exp(−Φ(u; y)) µ0(du) > 0. Then the conditional u|y exists under the joint measure ν, and is denoted µ^y. Furthermore, µ^y ≪ µ0 and, for ν-a.s. y,
dµ^y/dµ0 (u) = (1/Z) exp(−Φ(u; y))
  • 6. 5 Metropolis–Hastings in General State Spaces (Tierney, 1998)
Target µ(du), e.g. µ^y(du) ∝ exp(−Φ(u; y)) µ0(du) with µ0 = N(0, C).
Denote by Q(u, du′) the proposal probability kernel, and define the transition measure and its transpose
ν(du, du′) = µ(du) Q(u, du′),   ν^T(du, du′) = ν(du′, du)
When ν and ν^T are equivalent (i.e. ν ≪ ν^T and ν^T ≪ ν) with density
r(u, u′) = dν^T/dν (u, u′)
the acceptance probability is
a(u, u′) = 1 ∧ r(u, u′)
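The accept step on this slide is the same in any state space once the Radon–Nikodym ratio r(u, u′) is computable. A minimal Python sketch (the helper names `propose` and `log_ratio` are illustrative, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)

def mh_step(u, propose, log_ratio):
    """One Metropolis-Hastings step in a general state space.

    `propose(u)` draws u' ~ Q(u, .); `log_ratio(u, u_new)` returns
    log r(u, u') = log (d nu^T / d nu)(u, u').
    Accept u' with probability 1 ^ r(u, u')."""
    u_new = propose(u)
    if np.log(rng.uniform()) < min(0.0, log_ratio(u, u_new)):
        return u_new, True
    return u, False
```

For a symmetric random-walk proposal, log r reduces to the log-density difference of the target, recovering ordinary RWM.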
  • 7. 6 Random Walk S.Lan | ∞-Dimensional Geometric MCMC
  • 8. 7 preconditioned Crank–Nicolson (Cotter et al., 2013)
Crank–Nicolson is a finite-difference method for numerically solving PDEs (Richtmyer and Morton, 1994). pCN is a correspondingly modified Random Walk Metropolis (RWM):
Given u, sample ξ ∼ N(0, C)
Make the proposal u′ = √(1 − β²) u + β ξ   (1)
Accept u′ with probability a(u, u′) = 1 ∧ r(u, u′), r(u, u′) = exp(−Φ(u′) + Φ(u))   (2)
The acceptance probability is independent of dimension, and pCN mixes faster than RWM.
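One pCN transition under formulas (1)–(2) can be sketched as follows; the `C_sqrt` factor and `phi` callable are assumptions standing in for a concrete prior covariance and data misfit:

```python
import numpy as np

rng = np.random.default_rng(1)

def pcn_step(u, phi, C_sqrt, beta=0.2):
    """One preconditioned Crank-Nicolson step for mu^y ~ exp(-Phi(u)) mu_0,
    mu_0 = N(0, C); `C_sqrt @ xi` draws from N(0, C) for xi ~ N(0, I).

    The proposal u' = sqrt(1 - beta^2) u + beta xi preserves the prior,
    so the accept ratio involves only the misfit Phi (formula (2))."""
    xi = C_sqrt @ rng.standard_normal(u.shape)
    u_new = np.sqrt(1.0 - beta**2) * u + beta * xi
    log_r = phi(u) - phi(u_new)        # r(u, u') = exp(-Phi(u') + Phi(u))
    if np.log(rng.uniform()) < min(0.0, log_r):
        return u_new, True
    return u, False
```

When Φ is constant the chain samples the prior exactly and every proposal is accepted, which is the source of the dimension-independence.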
  • 9. 8 + Gradient S.Lan | ∞-Dimensional Geometric MCMC
  • 10. 9 ∞-dimensional MALA (Beskos et al., 2008)
Consider the Langevin SDE
du/dt = −(1/2) K (C⁻¹ u + DΦ(u)) + √K dW/dt   (3)
A semi-implicit scheme discretizes the above SDE:
(u′ − u)/h = −(1/2) K C⁻¹ [(1 − θ) u + θ u′] − (α/2) K DΦ(u) + √(K/h) ξ,  ξ ∼ N(0, I)
which may be simplified to give
u′ = A_θ u + B_θ v,  v = √C ξ − (α/2) √h K C DΦ(u)
A_θ = (I + (θ/2) h K C⁻¹)⁻¹ (I − ((1 − θ)/2) h K C⁻¹),  B_θ = (I + (θ/2) h K C⁻¹)⁻¹ √(h K C⁻¹)   (4)
where α = 1; α = 0 recovers pCN. K = I (IA) or K = C (PIA).
  • 11. 10 ∞-dimensional HMC (Beskos et al., 2011)
Consider the Hamiltonian differential equation
d²u/dt² + K(C⁻¹ u + DΦ(u)) = 0,  (v := du/dt)|_{t=0} ∼ N(0, K)   (5)
Let K = C and f(u) := −DΦ(u). A Störmer–Verlet / splitting scheme (Verlet, 1967; Neal, 2010) is used to discretize (5):
v⁻ = v + (∆t/2) C f(u)
(u′, v⁺) = (u cos ∆t + v⁻ sin ∆t, −u sin ∆t + v⁻ cos ∆t)
v′ = v⁺ + (∆t/2) C f(u′)   (6)
This defines a mapping Ψ_∆t : (u, v) → (u′, v′).
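The splitting scheme (6) can be sketched in finite dimensions, where C is an ordinary covariance matrix; the middle step is the exact rotation solving d²u/dt² + u = 0, and the two half-kicks apply the preconditioned force C f(u):

```python
import numpy as np

def inf_hmc_leapfrog(u, v, grad_phi, C, dt, k):
    """Stoermer-Verlet splitting (6) for the flow
    d^2u/dt^2 + u = C f(u), with f(u) = -DPhi(u), iterated k times.
    `grad_phi` computes DPhi; C is a finite-dimensional stand-in for
    the prior covariance operator."""
    c, s = np.cos(dt), np.sin(dt)
    for _ in range(k):
        v = v - 0.5 * dt * (C @ grad_phi(u))   # half kick: v^-
        u, v = c * u + s * v, -s * u + c * v   # exact rotation of (u, v)
        v = v - 0.5 * dt * (C @ grad_phi(u))   # half kick: v'
    return u, v
```

With Φ ≡ 0 the map is a pure rotation and conserves |u|² + |v|² exactly, which is why the scheme is well defined even as the dimension grows.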
  • 12. 11 + Metric S.Lan | ∞-Dimensional Geometric MCMC
  • 13. 12 ∞-dimensional manifold MALA (Beskos, 2014)
Choose K = G(u)⁻¹ in (3) (Girolami and Calderhead, 2011):
du/dt = −(1/2) G(u)⁻¹ (C⁻¹ u + DΦ(u)) + √(G(u)⁻¹) dW/dt   (7)
Define G : X → L(X, X) and g : X → X by
g(u) = −G(u)⁻¹ ((C⁻¹ − G(u)) u + DΦ(u))   (8)
A similar semi-implicit scheme (θ = 1/2) yields
u′ = a_{1/2} u + b_{1/2} v,  v = (√h/2) g(u) + G(u)^{−1/2} ξ
a_{1/2} = (1 − h/4)/(1 + h/4),  b_{1/2} = √h/(1 + h/4)   (9)
Note a²_{1/2} + b²_{1/2} = 1, following the Crank–Nicolson scheme (Richtmyer and Morton, 1994).
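Proposal (9) is a one-liner once g(u) and G(u)^{−1/2} are available; a finite-dimensional sketch (the arguments `g_u` and `G_inv_sqrt` are assumed to be precomputed for the current state):

```python
import numpy as np

def inf_mmala_proposal(u, g_u, G_inv_sqrt, h, rng):
    """Semi-implicit (theta = 1/2) proposal (9) for inf-mMALA:
    u' = a u + b v, with v = (sqrt(h)/2) g(u) + G(u)^{-1/2} xi.
    The coefficients satisfy a^2 + b^2 = 1 (Crank-Nicolson structure)."""
    a = (1.0 - h / 4.0) / (1.0 + h / 4.0)
    b = np.sqrt(h) / (1.0 + h / 4.0)
    v = 0.5 * np.sqrt(h) * g_u + G_inv_sqrt @ rng.standard_normal(u.shape)
    return a * u + b * v
```

The identity a² + b² = 1 is what makes the proposal prior-reversible in the flat case, mirroring pCN.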
  • 14. 13 ∞-dimensional manifold MALA (Beskos, 2014)
Assumption (1): N(0, G(u)⁻¹) is equivalent to N(0, C) for µ0-a.s. u.
Assumption (2): For µ0-a.s. u, g(u) ∈ Im(C^{1/2}), i.e. N(g(u), C) is equivalent to N(0, C).
  • 15. 14 ∞-dimensional manifold MALA (Beskos, 2014)
Theorem (Beskos (2014))
Denote ν(du, du′) = µ(du) Q(u, du′), with Q(u, du′) the transition kernel of (9). Under Assumptions 1 and 2, ν and ν^T are equivalent, and the acceptance probability has the form
a(u, u′) = 1 ∧ dν^T/dν (u, u′),  dν^T/dν (u, u′) = [exp{−Φ(u′)} λ(ρ⁻¹(u; u′); u′)] / [exp{−Φ(u)} λ(ρ⁻¹(u′; u); u)]   (10)
where ρ⁻¹(u′; u) = [(1 + h/4) u′ − (1 − h/4) u]/√h, and λ(w; u) is calculated as follows:
λ(w; u) = dN(√h/2 g(u), G(u)⁻¹)/dN(0, C)
        = [dN(√h/2 g(u), G(u)⁻¹)/dN(0, G(u)⁻¹)] · [dN(0, G(u)⁻¹)/dN(0, C)]
        = exp{⟨(√h/2) G^{1/2}(u) g(u), G^{1/2}(u) w⟩ − (h/8) |G^{1/2}(u) g(u)|²} · exp{−(1/2) |G^{1/2}(u) w|² + (1/2) |C^{−1/2} w|²} · |C^{1/2} G^{1/2}(u)|
  • 16. 15 ∞-dimensional manifold HMC
Let K = G(u)⁻¹ in (5) (Girolami and Calderhead, 2011):
d²u/dt² + u = g(u),  (v := du/dt)|_{t=0} ∼ N(0, G(u)⁻¹)   (11)
Discretize (11) by a splitting method of the form
v⁻ = v + (∆t/2) g(u)
(u′, v⁺) = (u cos τ + v⁻ sin τ, −u sin τ + v⁻ cos τ)
v′ = v⁺ + (∆t/2) g(u′)   (12)
where τ(∆t)/∆t → 1 as ∆t → 0. Evolving (u, v) through k folds of (12) defines Ψ^k_∆t : (u0, v0) → (u_k, v_k).
  • 17. 16 ∞-dimensional manifold HMC
Theorem
Denote ν(du, du′) = µ(du) Q(u, du′), with Q(u, du′) the transition kernel of Ψ^k_∆t. Under Assumptions 1 and 2, ν and ν^T are equivalent, and the acceptance probability has the form
1 ∧ exp(−∆H(u, v))   (13)
where
∆H(u, v) = H(Ψ^k_∆t(u, v)) − H(u, v)
= Φ(u_k) − Φ(u_0) + (1/2)(|(G(u_k) − C⁻¹)^{1/2} v_k|² − |(G(u_0) − C⁻¹)^{1/2} v_0|² − log|G(u_k)| + log|G(u_0)|)
+ (∆t²/8)(|C^{−1/2} g(u_0)|² − |C^{−1/2} g(u_k)|²) + (∆t/2) Σ_{ℓ=0}^{k−1} (⟨v_ℓ, C⁻¹ g(u_ℓ)⟩ + ⟨v_{ℓ+1}, C⁻¹ g(u_{ℓ+1})⟩)
  • 18. 17 Connections between ∞-dimensional MCMC
Remark: As a direct corollary, when k = 1, ∞-(m)HMC reduces to ∞-(m)MALA.
∞-MALA —[position-dependent preconditioner K(u)]→ ∞-mMALA —[h = 4]→ SN
   | multiple steps (k > 1)                 | multiple steps (k > 1)
   ↓                                        ↓
∞-HMC —[position-dependent preconditioner K(u)]→ ∞-mHMC
  • 19. 18 Geometric Monte Carlo — weight (computation) control
[Figure: sample paths of RWM, HMC, RHMC, and LMC on a 2-d target; θ1 and θ2 each range over [−2, 2]]
  • 20. 19 Dimension Reduction intrinsic low-dimensional subspace S.Lan | ∞-Dimensional Geometric MCMC
  • 21. 20 Dimension Reduction — intrinsic low-dimensional subspace
A particular choice of K(u) = G(u)⁻¹ is
K(u) = (C⁻¹ + H(u))⁻¹,  H(u) = E^{Y|u}⟨DΦ(u), DΦ(u)⟩   (14)
It is computationally infeasible to update the Fisher metric H(u) at every u.
Given an eigenbasis {φ_i(x)}, we define the projection operator P_r as follows:
P_r : X → X_r,  u ↦ u_r := Σ_{i=1}^r φ_i ⟨φ_i, u⟩   (15)
Truncate H(u) on the r-dimensional subspace X_r ⊂ X (X = X_r + X_⊥):
H_r(u)(v, w) = ⟨P_r v, E^{Y|u}[D_rΦ(u) D_rΦ(u)^T] P_r w⟩,  ∀ v, w ∈ X   (16)
K(u)⁻¹ can then be approximated by
K(u)⁻¹ ≈ H_r(u) + C⁻¹   (17)
  • 22. 21 Prior-Based Dimension Reduction
With the eigenpairs {λ_i, u_i(x)}, the prior covariance operator C can be written and approximated as
C = U Λ U* ≈ U_r Λ_r U_r*   (18)
Then we can approximate the posterior covariance
K(u) = (C⁻¹ + H(u))⁻¹ ≈ C + U_r Λ_r^{1/2} (D_r − I_r) Λ_r^{1/2} U_r*   (19)
where D_r := (Ĥ_r(u) + I_r)⁻¹ and Ĥ_r(u) := Λ_r^{1/2} U_r* H(u) U_r Λ_r^{1/2}.
By applying U_r* and U_⊥* to the Langevin SDE (3) respectively we get
du_r = −(1/2) D_r u_r dt − (γ_r/2) D_r ∇_{u_r} Φ(u; y) dt + √D_r dW_r   (20a)
du_⊥ = −(1/2) u_⊥ dt − (γ_⊥/2) ∇_{u_⊥} Φ(u; y) dt + dW_⊥   (20b)
where u_r = Λ_r^{−1/2} U_r* u and u_⊥ = Λ_⊥^{−1/2} U_⊥* u.
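The low-rank update (19) can be checked numerically in finite dimensions; in this sketch, taking r equal to the full dimension with (U, Λ) the exact eigendecomposition of C makes the approximation an identity:

```python
import numpy as np

def lowrank_posterior_cov(C, U_r, lam_r, H):
    """Prior-based low-rank approximation (19) of the posterior covariance:
    K(u) = (C^{-1} + H)^{-1} ~= C + U_r L^{1/2} (D_r - I_r) L^{1/2} U_r^T,
    with D_r = (H_r + I_r)^{-1} and H_r = L^{1/2} U_r^T H U_r L^{1/2},
    where L = diag(lam_r) holds the leading prior eigenvalues."""
    r = len(lam_r)
    L_half = np.diag(np.sqrt(lam_r))
    H_r = L_half @ U_r.T @ H @ U_r @ L_half      # hat-H_r
    D_r = np.linalg.inv(H_r + np.eye(r))
    return C + U_r @ L_half @ (D_r - np.eye(r)) @ L_half @ U_r.T
```

In practice r is much smaller than the discretization dimension, so only an r × r system is ever inverted.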
  • 23. 22 Karhunen–Loève Expansion
Let X be the Hilbert space H = L²(D; R) on a bounded open D ⊂ R^d with Lipschitz boundary, with inner product ⟨·, ·⟩ and norm ‖·‖.
Consider the following covariance operator C:
C := σ² (αI − ∆)^{−s}   (21)
Let {λ_i²} and {φ_i(x)} denote the eigenvalues and eigenfunctions of C. If s > d/2 and λ_i ≍ i^{−s/d}, then C defines a Gaussian measure N(0, C) such that each draw u(·) ∼ N(0, C) admits the Karhunen–Loève (K-L) expansion (Adler, 1981; Bogachev, 1998; Dashti and Stuart, 2015):
u(x) = Σ_{i=0}^{+∞} u_i λ_i φ_i(x),  u_i ~iid N(0, 1)   (22)
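A truncated version of expansion (22) is easy to sample; this sketch assumes D = [0, 1] with a Neumann Laplacian, so that φ_i(x) = √2 cos(iπx) and λ_i = σ (α + (iπ)²)^{−s/2} (a concrete choice, not fixed by the slide):

```python
import numpy as np

def kl_draw(x, n_terms=100, sigma=1.0, alpha=1.0, s=1.0, rng=None):
    """Truncated Karhunen-Loeve draw (22) from N(0, C),
    C = sigma^2 (alpha I - Laplacian)^{-s}, assuming D = [0, 1] with
    Neumann boundary: eigenfunctions phi_i(x) = sqrt(2) cos(i pi x)."""
    rng = rng or np.random.default_rng()
    u = rng.standard_normal(n_terms)                     # u_i ~iid N(0, 1)
    i = np.arange(1, n_terms + 1)
    lam = sigma * (alpha + (i * np.pi) ** 2) ** (-s / 2.0)
    phi = np.sqrt(2.0) * np.cos(np.outer(x, i * np.pi))  # phi_i(x) on grid x
    return phi @ (u * lam)
```

The truncation level n_terms plays the role of r in the dimension-reduced samplers above.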
  • 24. 23 Laminar Jet — formulation of the inverse problem
We consider the following 2-d Navier–Stokes equations.
Momentum equation:
−div(ν(∇u + ∇u^T)) + u · ∇u + ∇p = 0   (23)
where u = (u, v) is the velocity field and p the pressure.
Continuity equation:
div u = 0   (24)
Denote by σ_n = −p n + ν(∇u + ∇u^T) · n the boundary traction. Boundary conditions:
u · n = −θ(y), σ_n × n = 0 on I := {x = 0, y ∈ (−Ly/2, Ly/2)}
σ_n + β (u · n)⁻ u = 0 on O := {x = Lx, y ∈ (−Ly/2, Ly/2)}
u · n = 0, σ_n × n = 0 on B := {x ∈ (0, Lx), y = ±Ly/2}
where (u · n)⁻ = (u · n − |u · n|)/2 and β ∈ (0, 1] is the backflow stabilization parameter.
  • 25. 24 Laminar Jet results Laminar Jet problem: the location of observations (left) and the forward PDE solutions with true unknown (right). Fluid viscosity ν = 3 × 10−2 . S.Lan | ∞-Dimensional Geometric MCMC
  • 26. 25 Laminar Jet results Laminar Jet test problem: true inflow velocity profiles with different number of modes (left) and posterior estimates given by different algorithms (right). Shaded region shows the 95% credible band estimated with samples by ∞-mHMC. S.Lan | ∞-Dimensional Geometric MCMC
  • 27. 26 Laminar Jet results
Laminar Jet problem: trace plots of the data-misfit function evaluated at each sample (left; values are offset for easier comparison) and the autocorrelation of the data misfits as a function of lag (right).
  • 28. 27 Laminar Jet results
Method          AP    s/iter  ESS (min, med, max)           minESS/s  spdup  PDEsolns
pCN             0.61  1.29    (5.24, 6.66, 13.33)           4.05E-04  1.00   22004
∞-MALA          0.66  1.68    (5.38, 6.62, 19.53)           3.21E-04  0.79   33005
∞-HMC           0.72  3.81    (5.41, 7.43, 16.44)           1.42E-04  0.35   82466
∞-mMALA         0.68  5.97    (1075.24, 2851.22, 3867.08)   1.80E-02  44.47  2233205
∞-mHMC          0.58  13.33   (2058.42, 3394.17, 4560.03)   1.54E-02  38.13  5575696
split ∞-mMALA   0.57  3.66    (1079.55, 1805.89, 2395.13)   2.95E-02  72.82  693065
split ∞-mHMC    0.60  6.88    (2749.63, 3974.36, 5498.03)   4.00E-02  98.67  1721694
Sampling efficiency in the Laminar Jet problem. AP: acceptance probability; s/iter: seconds per iteration; ESS (min, med, max): minimum, median, maximum Effective Sample Size; minESS/s: minimum ESS per second. Dimension = 100. HMC algorithms take a number of leapfrog steps chosen randomly between 1 and 4. Split algorithms truncate the K-L expansion at r = 30.
  • 29. 28 Likelihood-Informed Dimension Reduction
By whitening the coordinates, v_i(x) = C^{−1/2} u_i(x), the generalized eigenproblem
H(u) u_i(x) = λ_i C⁻¹ u_i(x)   (25)
can be shown equivalent to the eigenproblem for the prior-preconditioned Gauss–Newton Hessian (ppGNH) H̃(v):
H̃(v) v_i(x) = C^{1/2} H(u) C^{1/2} v_i(x) = λ_i v_i(x)   (26)
The local posterior covariance is approximated (D_r := (I_r + Λ_r)⁻¹) by
K(v) = (I + H̃(v))⁻¹ ≈ I + V_r (D_r − I_r) V_r*   (27)
By applying V_r* and V_⊥* to the whitened Langevin SDE respectively,
dv_r = −(1/2) D_r v_r dt − (γ_r/2) D_r ∇_{v_r} Φ(v; y) dt + √D_r dW_r   (28a)
dv_⊥ = −(1/2) v_⊥ dt − (γ_⊥/2) ∇_{v_⊥} Φ(v; y) dt + dW_⊥   (28b)
where v_r = V_r* v and v_⊥ = V_⊥* v.
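The equivalence of the generalized eigenproblem (25) and the ppGNH eigenproblem (26) can be verified numerically; a finite-dimensional sketch with matrices H, C standing in for the operators:

```python
import numpy as np

def ppgnh_eigs(H, C):
    """Eigenvalues of the ppGNH C^{1/2} H C^{1/2} in (26).  Under the
    whitening v = C^{-1/2} u they coincide with the eigenvalues of the
    generalized problem H u = lam C^{-1} u in (25)."""
    w, V = np.linalg.eigh(C)                      # C = V diag(w) V^T
    C_half = V @ np.diag(np.sqrt(w)) @ V.T        # symmetric square root
    return np.linalg.eigvalsh(C_half @ H @ C_half)
```

The symmetric form (26) is what makes the spectrum computable by standard Hermitian eigensolvers even though C H itself is not symmetric.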
  • 30. 29 Connection to DILI (Cui et al., 2016)
By considering the approximation H̃(v) ≈ V_r Λ_r V_r*, we have the posterior covariance projected onto the r-dimensional subspace:
K_r = Cov_µ[V_r* v] = V_r* (I + H̃(v))⁻¹ V_r ≈ D_r := (I_r + Λ_r)⁻¹   (29)
DILI instead computes the posterior covariance in the subspace, K_r := Cov_µ[V_r* v], empirically. Both yield the approximate posterior covariance
K_v = Cov_µ[v] ≈ V_r K_r V_r* + I − V_r V_r*   (30)
Since we work directly with (29), the empirical calculation of K_r is avoided; moreover, the approximation (29) is already in diagonal form, so the rotation Ψ_r = V_r W_r in DILI is not needed. Both capture similar geometric features of the subspace.
  • 31. 30 Elliptic Inverse Problem — formulation of the inverse problem
Consider the elliptic inverse problem, as in DILI (Cui et al., 2016), defined on the unit square domain Ω = [0, 1]²:
−∇ · (k(s) ∇p(s)) = f(s),  s ∈ Ω
⟨k(s) ∇p(s), n(s)⟩ = 0,  s ∈ ∂Ω
∫_{∂Ω} p(s) dl(s) = 0   (31)
where k(s) is the transmissivity field, p(s) is the potential function, f(s) is the forcing term, and n(s) is the outward normal to the boundary. The inverse problem is to infer u = log k from observations {y_n}.
25 observations arise from the solutions (solved on an 80 × 80 mesh) contaminated by additive Gaussian error:
y_n = p(x_n) + ε_n,  ε_n ∼ N(0, σ_y²),  SNR := max_s{u(s)}/σ_y = 100
  • 32. 31 Elliptic Inverse Problem results S.Lan | ∞-Dimensional Geometric MCMC
  • 33. 32 Elliptic Inverse Problem results Elliptic inverse problem (SNR = 100): Bayesian posterior mean estimates of the log-transmissivity field u(s) based on 2000 samples by various MCMC algorithms; the upper-left corner shows the MAP estimate. S.Lan | ∞-Dimensional Geometric MCMC
  • 34. 33 Elliptic Inverse Problem results
Method        h           AP    s/iter  ESS (min, med, max)       minESS/s  spdup  PDEsolns
pCN           0.01        0.57  0.99    (2.67, 6.95, 37.79)       0.0013    1.00   2501
∞-MALA        0.04        0.61  1.62    (4.32, 15.34, 51.45)      0.0013    0.99   5002
∞-HMC         0.04        0.59  3.52    (24.36, 92.13, 184.84)    0.0035    2.57   12342
DR-∞-mMALA    0.52        0.67  8.85    (127.25, 210.84, 460.07)  0.0072    5.34   80032
DR-∞-mHMC     0.25        0.56  22.97   (190.2, 322.29, 687.11)   0.0041    3.08   198176
DILI          (0.1, 0.2)  0.69  1.59    (30.52, 133.67, 221.97)   0.0096    7.13   6612
aDR-∞-mMALA   0.25        0.71  1.61    (12.09, 89.17, 174.36)    0.0037    2.79   6612
aDR-∞-mHMC    0.10        0.69  3.63    (70.99, 234.42, 364.31)   0.0098    7.26   14056
Sampling efficiency in the elliptic inverse problem (SNR = 100). Column labels: h: step size(s) used for the MCMC proposal; AP: average acceptance probability; s/iter: average seconds per iteration; ESS (min, med, max): minimum, median, maximum Effective Sample Size across all posterior coordinates; minESS/s: minimum ESS per second; spdup: speed-up relative to the base pCN algorithm; PDEsolns: number of PDE solutions during execution.
  • 35. 34 ∞-dimensional Spherical Hamiltonian Monte Carlo
[Figure: samples on the unit sphere under the embedding θ̃ = (θ, θ_{D+1}) with θ_{D+1} = (1 − ‖θ‖²)^{1/2}]
q q q q q q qq q qq q q q q q q q q q qq q qq qq q q q qq q q q q q q q q q q q q q q qqqq q q q q q q q q q q q q q q q qq q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q qq q q q q q q q q qq q q q q q q q q q q q qq q q q q qq q q q q qq q q q q q q q q q qqq q q q q q q q qq q q q q q q q q q q q q q q q q q q q q qq q q q q q q q q q q q q q q q q q q q q q q q q q q q qq q q qq q q q q q q q q q q q qq q q q q q q q q q q q q q q q qq q qq q q q qq q q qq q q q q q q q qq q qq q q q q q q q q q q q q q qq q q q q q q q q q q q q q q q qq q q q q q q q q q q q q q q q q q q q q q q q q qq q q q q qq q q q qq q q q q q q q q qq q q q q qq q q q q q q q q q q q q q q q q q q q q q q q qq q q q q q q q q q q q q qq q q q q q q q q q q q q q q q q q q q q qq q q q qq q q qq q q q q q q q q qq q q q q q q q q q q q q q q q qq q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q qq q q q q qq q q q q q q qq q q q q qq q q q q qq q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q qq q q q q q q q q q q q q q q q q q q q q q qq q q q q q q q q q q q q q q qq q q q qq q q q q q q q q q q q q q qq q q q q q q q q q q q q q q q q q q q q q q q q qq q q q q q q q q q qq q q q q q q q q q q q q q q q qq q q q q q q q q q q qq q q q q qq q q q q q q q q q q q q q q q q q q q q q q q q q qq q q q q q q q q q q q q q q q q q qq q q q q q q q q q q q q q q q q q q q qq q qq q q qq q q q q q q q q q q q q q q q q q q q q q q qq q q q q q q q qq q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q qq q q q q q q q q q qq q q q q q q q q q q q q q q q q q q q q qq q q q q q q q qq q q q q q q q q q q q q q q q q q q q qq q q qq q q q q q q q q q q q qq q q q q q q q q q q qq q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q qq q qq q q qq q q q q q q q q q q q qq q q q q q q q q q q q qq q q q q q q q q q q q q q q q q q q q q q q q q q q q qq qq q q q q q q q q q q q q q q q q 
q q q q qq q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q qq q q q q q q q q q q q q q q q q q q q q q q qq q q q q q q q q q q q q q q qq q q q q qq q q q q qq q q q q q q q q q q qq q q q q q q q q q q q q q q qq q q q q q q q q q q qq q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q qq q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q q qq q q q q q q q q q q q q q q q qq q q q q q q q q q q q q q q q q q q q q q q q q q q q q qq q q q q q q q q q q q q q q q q q q q q q q q qq qq q q q q q q q q q q q q q q q q q q q q q q q q q q qq q q q q q q qq q q q q q q q q q qq q q q q q q q q qq q q qq q q q q q q q q q q q q q q q q q qq q q q q q qq q q q q q q qq q q q q q qq q q q q q qq q q q q q q q q q q qq q qq q q q q q q q q q q q q qq q qq q q q q q q q q q qq q qq qq q q q q q q q q q qq q q qqq qq q q qqq θ θD+1 q A B S.Lan | ∞-Dimensional Geometric MCMC
Representation of Probability Densities

Consider probability distributions over a smooth manifold D. Having fixed a background measure µ, let

    P := { p : D → R | p ≥ 0, ∫_D p(x) µ(dx) = 1 }.   (32)

Define the following nonparametric Fisher metric on the tangent space T_p P := { φ ∈ C^∞(D) | ∫_D φ(x) µ(dx) = 0 }:

    g_F(φ, ψ)_p := ∫_D φ(x) ψ(x) / p(x) µ(dx).   (33)

The square-root map S : (P, g_F) → (Q, ⟨·,·⟩_2), S(p) = q = 2√p, is a Riemannian isometry, where Q is the ∞-dimensional sphere of radius 2 in L²(D):

    Q := { q : D → R | ∫_D q(x)² µ(dx) = 4 },   ⟨f, h⟩_2 := ∫_D f h dµ.   (34)

(The modeling slides that follow rescale to the unit sphere, i.e. they work with q = √p so that p = q².)
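The isometry claim can be checked numerically. Below is a minimal sketch (assuming D = [0, 1], µ = Lebesgue, and a simple Riemann quadrature; the density and tangent vectors are purely illustrative): the differential of S(p) = 2√p maps φ to φ/√p, so the Fisher inner product of φ, ψ equals the L² inner product of their images, and q = 2√p satisfies ∫ q² dµ = 4 ∫ p dµ = 4.

```python
import numpy as np

# Numerical sanity check (a sketch; the grid, density, and tangent vectors
# are illustrative, not from the slides). On D = [0, 1] with mu = Lebesgue,
# dS_p(phi) = phi / sqrt(p), so g_F(phi, psi)_p equals the L2 inner
# product of the mapped vectors, and q = 2*sqrt(p) has int q^2 dmu = 4.
x = np.linspace(0.0, 1.0, 4001)
dx = x[1] - x[0]
integrate = lambda f: float(np.sum(f) * dx)   # simple Riemann quadrature

p = 1.0 + 0.5 * np.sin(2.0 * np.pi * x)       # a positive density on [0, 1]
p /= integrate(p)                              # normalize: int p dmu = 1

phi = np.cos(2.0 * np.pi * x)                  # tangent vectors (mean zero)
psi = np.sin(4.0 * np.pi * x)

fisher = integrate(phi * psi / p)              # g_F(phi, psi)_p
dS_phi, dS_psi = phi / np.sqrt(p), psi / np.sqrt(p)
l2 = integrate(dS_phi * dS_psi)                # <dS(phi), dS(psi)>_2

q = 2.0 * np.sqrt(p)
print(abs(fisher - l2))            # ~ 0: the map is a Riemannian isometry
print(abs(integrate(q**2) - 4.0))  # ~ 0: q lies on the sphere of radius 2
```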
Nonparametric Density Modeling

It is easier to work with the root density q ∈ Q (e.g. it has a clean geodesic flow). Restrict the Gaussian process prior q(·) ∼ GP(0, K) to Q, where the covariance operator K = σ²(α − ∆)^{−s} has eigenpairs {λ_i², φ_i(x)}_{i=1}^∞. Then the constraint

    ∫_D q(x)² µ(dx) = ∫_D ( Σ_{i=1}^∞ q_i φ_i(x) )² µ(dx) = 1,   with q_i ∼ N(0, λ_i²),

implies ‖q‖_2² := Σ_{i=1}^∞ q_i² = 1, i.e. q := (q_i) ∈ S^∞.   (35)

Given data x = {x_n ∈ D}_{n=1}^N, we have the posterior density

    π(q|x) ∝ π(q) π(x|q) = [ Π_{i=1}^∞ exp( −q_i²/(2λ_i²) ) ] δ_{‖q‖_2}(1) Π_{n=1}^N q²(x_n).   (36)

Sampling q = (q_i) can be done by spherical HMC (Lan et al., 2014).
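As a concrete (hypothetical) instance of the posterior above, the sketch below evaluates the unnormalized log posterior for a truncation at I terms, assuming the Neumann cosine eigenbasis of −∆ on D = [0, 1], so that K = σ²(α − ∆)^{−s} has eigenvalues λ_i² = σ²(α + (iπ)²)^{−s}; the hyperparameter values and the data are illustrative, not the paper's setup.

```python
import numpy as np

# A sketch (basis, hyperparameters, and data are illustrative assumptions)
# of the unnormalized log posterior log pi(q | x), truncated at I terms.
# On D = [0, 1] with mu = Lebesgue, the Neumann eigenbasis of -Laplacian
# is phi_0 = 1, phi_i = sqrt(2) cos(i pi x), giving eigenvalues
# lambda_i^2 = sigma^2 (alpha + (i pi)^2)^{-s} for K.
I, sigma, alpha, s = 20, 1.0, 1.0, 1.2
lam2 = sigma**2 * (alpha + (np.arange(I) * np.pi) ** 2) ** (-s)

def basis(x):
    """Evaluate the I eigenfunctions at points x; returns a len(x)-by-I matrix."""
    B = np.sqrt(2.0) * np.cos(np.outer(x, np.arange(I)) * np.pi)
    B[:, 0] = 1.0                            # phi_0 is the constant function
    return B

def log_post(q, x):
    """log pi(q | x) up to a constant, for coefficients q on the unit sphere."""
    assert abs(float(q @ q) - 1.0) < 1e-8    # the delta-function constraint
    log_prior = -0.5 * np.sum(q**2 / lam2)   # Gaussian prior on coefficients
    qx = basis(x) @ q                        # q(x_n) = sum_i q_i phi_i(x_n)
    log_like = np.sum(np.log(qx**2))         # likelihood: prod_n q(x_n)^2
    return log_prior + log_like

rng = np.random.default_rng(0)
q = rng.normal(0.0, np.sqrt(lam2))           # a draw from the GP prior ...
q /= np.linalg.norm(q)                       # ... projected onto the sphere
data = rng.uniform(size=100)                 # illustrative "observations"
print(log_post(q, data))
```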
Spherical Hamiltonian Monte Carlo (Lan et al., 2014)

Truncate q := (q_i) at I terms: q := (q_i)_{i=1}^I. Define the total energy

    E(q, v) := U(q) + K(v; q) = Ũ(q) + K_0(v; q)   (37)
    Ũ(q) := U(q) − ½ log|G(q_{−I})| = −log π(q|x) + log|q_I|   (38)
    K_0(v; q) := ½ v_{−I}^T G(q_{−I}) v_{−I} = ½ vᵀv   (39)

Discretizing the Hamiltonian equations gives the integrator (written componentwise; ‖·‖_2 is the Euclidean norm)

    v⁻ = v − (h/2) P(q) g(q)
    q' = q cos(‖v⁻‖_2 r^{−1} h) + (r/‖v⁻‖_2) v⁻ sin(‖v⁻‖_2 r^{−1} h)
    v⁺ = −(‖v⁻‖_2/r) q sin(‖v⁻‖_2 r^{−1} h) + v⁻ cos(‖v⁻‖_2 r^{−1} h)
    v' = v⁺ − (h/2) P(q') g(q')   (40)

where g(q) := ∇_q Ũ(q) and P(q) := I_D − r^{−2} q qᵀ.
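One leapfrog step of this integrator can be sketched as follows (for the unit sphere, r = 1). Here grad_U is an illustrative placeholder for g(q) = ∇_q Ũ(q), not an actual posterior gradient; the point of the sketch is that the rotation keeps q on the sphere and v in its tangent space.

```python
import numpy as np

# A sketch of one leapfrog step of spherical HMC on the unit sphere (r = 1).
# grad_U is an illustrative placeholder potential gradient, not the actual
# g(q) from a posterior.
def grad_U(q):
    return np.arange(1.0, q.size + 1.0) * q       # placeholder quadratic potential

def leapfrog(q, v, h, r=1.0):
    def project(q):                               # P(q) = I - r^{-2} q q^T
        return np.eye(q.size) - np.outer(q, q) / r**2
    v = v - 0.5 * h * project(q) @ grad_U(q)      # half step in velocity
    nv = np.linalg.norm(v)
    a = nv * h / r                                # rotation angle ||v||_2 r^{-1} h
    q, v = (np.cos(a) * q + np.sin(a) * (r / nv) * v,        # rotate (q, v-)
            -np.sin(a) * (nv / r) * q + np.cos(a) * v)
    v = v - 0.5 * h * project(q) @ grad_U(q)      # half step in velocity
    return q, v

rng = np.random.default_rng(1)
q = rng.normal(size=5); q /= np.linalg.norm(q)    # a point on the sphere
v = rng.normal(size=5); v -= (q @ v) * q          # a tangent velocity at q
q1, v1 = leapfrog(q, v, h=0.05)
print(abs(np.linalg.norm(q1) - 1.0))              # stays on the sphere
print(abs(q1 @ v1))                               # stays tangent at q1
```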
Nonparametric Density Modeling: simulation results

Figure: The contours (black) of the posterior median from 1,000 draws of the χ²-process density sampler, plotted over Dimension 1 and Dimension 2. Each posterior is conditioned on 1,000 data points (red).
Nonparametric Density Modeling: results for real data

Figure: Hutchings' bramble canes data. The first panel ("Pointwise posterior mean") depicts the 823 bramble canes (red), a heatmap of the pointwise posterior mean (black is low, white is high), and a single contour at density 0.3 (blue) enclosing all but a few points. The second panel ("Posterior predictive sample") shows 823 draws from the χ²-process density posterior predictive distribution, obtained using a rejection sampling scheme.
Conclusion

Geometric information (gradient, metric) helps MCMC mix, but it comes at a computational cost; this cost can be alleviated by dimension reduction.

Key idea: find an intrinsic finite-dimensional subspace where most of the information concentrates, and apply manifold MCMC there.

Computationally light algorithms, e.g. pCN and preconditioned MALA, can be applied in the less informative complementary subspace.

MCMC defined on manifolds can help solve challenging statistical problems, e.g. density estimation and modeling covariance/correlation matrices.
References I

Adler, R. J. (1981). The Geometry of Random Fields, volume 62. SIAM.
Beskos, A. (2014). A stable manifold MCMC method for high dimensions. Statistics & Probability Letters, 90:46–52.
Beskos, A., Pinski, F. J., Sanz-Serna, J. M., and Stuart, A. M. (2011). Hybrid Monte Carlo on Hilbert spaces. Stochastic Processes and their Applications, 121:2201–2230.
Beskos, A., Roberts, G., Stuart, A., and Voss, J. (2008). MCMC methods for diffusion bridges. Stochastics and Dynamics, 8(03):319–350.
Bogachev, V. I. (1998). Gaussian Measures. Number 62. American Mathematical Society.
Cotter, S. L., Roberts, G. O., Stuart, A. M., and White, D. (2013). MCMC methods for functions: modifying old algorithms to make them faster. Statistical Science, 28(3):424–446.
Cui, T., Law, K. J., and Marzouk, Y. M. (2016). Dimension-independent likelihood-informed MCMC. Journal of Computational Physics, 304:109–137.
Dashti, M. and Stuart, A. M. (2015). The Bayesian approach to inverse problems. arXiv e-prints.
Girolami, M. and Calderhead, B. (2011). Riemann manifold Langevin and Hamiltonian Monte Carlo methods. Journal of the Royal Statistical Society, Series B (with discussion), 73(2):123–214.
Lan, S., Zhou, B., and Shahbaba, B. (2014). Spherical Hamiltonian Monte Carlo for constrained target distributions. In Proceedings of the 31st International Conference on Machine Learning, pages 629–637, Beijing, China.
Neal, R. M. (2010). MCMC using Hamiltonian dynamics. In Brooks, S., Gelman, A., Jones, G., and Meng, X.-L., editors, Handbook of Markov Chain Monte Carlo. Chapman and Hall/CRC.
References II

Richtmyer, R. D. and Morton, K. W. (1994). Difference Methods for Initial-Value Problems, 2nd edition. Krieger Publishing Co., Malabar, FL.
Tierney, L. (1998). A note on Metropolis–Hastings kernels for general state spaces. Annals of Applied Probability, pages 1–9.
Verlet, L. (1967). Computer "experiments" on classical fluids. I. Thermodynamical properties of Lennard-Jones molecules. Physical Review, 159(1):98–103.