Hierarchical Bayesian Models for Inverse Problems and
Uncertainty Quantification
Bani K Mallick
Department of Statistics
Texas A&M University
Collaborators: Nilabja Guha, Yalchin Efendiev, Bangti Jin, Keren Yang
Bani K Mallick Bayesian Inverse problems and UQ 1 / 59
Model, Method, Computation
Massive data and tremendous scale of computational power
Ability to use highly heterogeneous data sets
Possibility of modeling more complex structures and systems
Making sense of data, in the context of the modeling of complex
systems, is a challenging task
Statistics, Applied Maths, Computation
Statistics provides a rational basis for the analysis of data
In many application areas, there is an enormous amount of information
in the form of mathematical models, often developed over decades or
centuries
Applied Maths and Stat are required to work in concert
Combination of Data driven methods as well as systematic predictive
tools (often in the form of partial differential equations)
Computer Experiment
No need for expensive lab equipment and materials; less costly than a
physical experiment
Not affected by human and environmental factors
Can study dangerous or infeasible physical experiments
Simulation
• Experiments are very expensive
• Mathematical models (huge number of partial
differential equations) to predict radiative
shock behavior
• An algorithm or code is available, and simulations
can be run from this system
• This algorithm needs some input parameters
which may need to be calibrated
Mathematical Model
• A Mathematical model for a physical experiment
is a set of equations which relate inputs to
outputs
• Inputs are variables which can be adjusted
before the experiment takes place
• Outputs represent quantities which can be
measured as a result of the experiment
• The forward problem refers to using the
mathematical model to predict the output of an
experiment from a given input
Inverse Problem
• Using the mathematical model to make inference
about input(s) to the mathematical model based on
the output data
• The output data will be noisy
• Observations may be limited in number relative to
the dimension or complexity of the model space
• Inverse problems are ill posed in the sense that small
perturbations in the data may lead to large errors in
the inversion estimates
Inverse problem
Inverse problems arise in different branches of science and engineering:
Petroleum Engineering, Aerospace Engineering, Earth Science
A physical system K(u) is parametrized by an input parameter u
We observe:
y = K(u) + ε.
Based on the observation, we want to estimate u and quantify the
uncertainty
Inverse Problem
Often K(u) is determined by a series of ordinary/partial differential
equations (ODE/PDE)
Forward problem: u → K(u)
Data: y = K(u) + error
Inverse problem: y → u
Examples
Porous media characterization:
Characterize the flow of water/oil in a reservoir in petroleum
engineering
The flow depends on the underlying permeability of the media
A highly varying spatial parameter
Heat distribution on the boundary of an object:
Infer about the temperature of the boundary from available data on
some accessible part
Re-entry of a spacecraft, where heat sensors are in the accessible
part of the boundary
Large scale inverse problems and UQ
High dimensional parameter space and large amount of
data
• Expensive forward model
• Solution of this problem as well as quantification of
uncertainty
Center for Radiative shock hydrodynamics
• Supernova: the explosion of a massive supergiant
star, happening about once a century in a galaxy
• Most energetic explosion in nature, equivalent
to the power of a 10^28 megaton bomb (a few
octillion nuclear warheads)
• In 1987, there was a supernova explosion in
the Large Magellanic Cloud, a companion
galaxy to the Milky Way.
Before and after pictures of Supernova 1987A
Shock waves
• A supernova creates radiative shock waves
which pass through a regime in which the
shock layer collapses in space because of
radiative energy losses
The shock wave’s powerful punch creates a spectacular light
show
Experiment
The shock waves emerging from
supernovae and the shock waves produced
in the laboratory experiments have specific
similarities
We would like to investigate the effect of this shock
on materials
Overview of the talk
Bayesian approaches to Inverse problems
Hierarchical Bayesian models
Structural assumptions at different stages of the hierarchy for
flexibility, sparsity or computational efficiency
Heat equation with asymmetric heavy tailed error (linear)
A separation of variable approach for porous media flow (non-linear)
Inverse Problem
y = K(u) + ε
p(y|u) ∝ exp(−(y − K(u))²/(2σ²)) if ε ∼ N(0, σ²)
Ill-posed problem: even if K(u1) − K(u2) is small, u1 − u2 can
be large.
The least squares or maximum likelihood (MLE) estimate may produce an
unstable solution: û_mle = argmax_u p(y|u)
A regularized estimate can be helpful
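The instability and the regularization fix can be seen in a few lines. This sketch is not from the talk: it uses a smoothing-kernel matrix as an ill-conditioned linear K and compares plain least squares with a small Tikhonov/ridge penalty, the optimization analogue of a Gaussian prior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ill-conditioned forward operator: a Gaussian smoothing kernel matrix,
# whose trailing singular values are essentially zero.
m = 40
x = np.linspace(0, 1, m)
K = np.exp(-(x[:, None] - x[None, :])**2 / (2 * 0.1**2))

u_true = np.sin(2 * np.pi * x)
y = K @ u_true + 1e-3 * rng.standard_normal(m)   # tiny observation noise

u_ls = np.linalg.solve(K, y)                     # unregularized "MLE" solve
lam = 1e-3                                       # ridge penalty weight
u_ridge = np.linalg.solve(K.T @ K + lam * np.eye(m), K.T @ y)

err_ls = np.linalg.norm(u_ls - u_true)
err_ridge = np.linalg.norm(u_ridge - u_true)
# The tiny noise blows up the unregularized solution, while the ridge
# estimate stays close to u_true.
```

The penalty weight lam plays exactly the role the prior will play in the Bayesian formulation on the next slides.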
Bayesian solution
Bayesian method provides a natural framework, through regularization by
a prior on u.
Likelihood: p(y|u)
Prior: π(u)
Posterior: Π(u|y) ∝ p(y|u)π(u)
Bayesian solution
Bayesian methods provide regularization through the prior
Can incorporate non-Gaussianity and nonlinearity
Can incorporate physical constraints through prior
Simulation and approximation based computational approaches are
available
Likelihood Calculation
y = K(u) + ε
p(y|u) ∝ exp(−(y − K(u))²/(2σ²)) if ε ∼ N(0, σ²)
It is like a black-box likelihood which we cannot write analytically,
although we do have a code K that will compute it.
We need to run K to compute the likelihood, which is expensive.
Hence there is no hope of having any conjugacy in the model, other than for
the error variance in the likelihood.
We need to be somewhat intelligent about the update steps during
MCMC so that we do not spend too much time computing likelihoods for
poor candidates.
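A minimal sketch of that bookkeeping, with a toy stand-in for the expensive forward code (the forward model below is hypothetical, not the one in the talk): random-walk Metropolis that caches the forward solve of the current state, so each iteration costs exactly one new evaluation of K.

```python
import numpy as np

rng = np.random.default_rng(1)

def K_forward(u):
    """Stand-in for an expensive black-box forward solve; counts its calls."""
    K_forward.calls += 1
    return np.array([u[0] + u[1], u[0] * u[1]])
K_forward.calls = 0

sigma = 0.1
u_true = np.array([1.0, 2.0])
y = K_forward(u_true) + sigma * rng.standard_normal(2)

def log_like(Ku):
    return -0.5 * np.sum((y - Ku)**2) / sigma**2

n_iter = 2000
u = np.zeros(2)
Ku = K_forward(u)          # forward solve of the current state, cached
ll = log_like(Ku)
for _ in range(n_iter):
    u_prop = u + 0.2 * rng.standard_normal(2)
    Ku_prop = K_forward(u_prop)      # the single expensive call per step
    ll_prop = log_like(Ku_prop)
    if np.log(rng.uniform()) < ll_prop - ll:   # flat prior here, for brevity
        u, Ku, ll = u_prop, Ku_prop, ll_prop
# Total forward solves: n_iter + 2 (one for the data, one for the start).
```

Smarter proposal mechanisms (or the surrogates discussed later) reduce how many of those n_iter solves are wasted on poor candidates.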
Increasing computational efficiency
Reducing the dimension: Karhunen–Loève expansion (KLE)
Variational Bayesian method: approximate the posterior through
separability
Separable solution: accelerate the MCMC through a fast forward
solution
Cases we discuss
A linearized forward model with heavy tailed asymmetric error
We provide a regularized solution with appropriate prior
We derive posterior approximation techniques and related results
A nonlinear forward model where a numerical forward solver is needed
We derive an accelerated MCMC technique based on separation of
variables
We derive an efficient posterior approximation method
A linearized case with asymmetric error
Heat equation with asymmetric error (Guha et al., 2014)
Domain Ω with boundary Γ = Γ0 ∪ Γ1, where Γ0 and Γ1 are the accessible and
inaccessible parts. The temperature field u and the flux are observed over
the subset Γ0.
−Δu = f in Ω; u = g on Γ0 and ∂u/∂n = q on Γ0,
where f is the source term, n is the unit outward normal direction to the
boundary and Δ = ∂²/∂x² + ∂²/∂y².
Numerical solution
The inverse problem is to estimate θ = u on the inaccessible boundary Γ1
from q and measured data g on a subset of Γ0.
The observation error is potentially skewed and heavy tailed
Using the finite element method, the unknown temperature θ is parameterized
by
θ(x) = Σ_{j=1}^m w_j(x) u_j
over the boundary Γ1, where the w_j(x) are finite element basis functions
defined on Γ1, m is the number of basis functions and the u_j are the unknowns
to be estimated.
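A small illustration, with made-up nodes and nodal values, of what this expansion means for piecewise-linear "hat" basis functions: θ(x) simply interpolates the unknowns u_j at the nodes.

```python
import numpy as np

nodes = np.linspace(0.0, 1.0, 5)          # m = 5 equally spaced nodes on Gamma_1
u = np.array([0.0, 1.0, 0.5, 2.0, 1.5])   # hypothetical nodal temperatures u_j

def hat(j, x):
    """Piecewise-linear basis w_j: equals 1 at nodes[j], 0 at its neighbors."""
    h = nodes[1] - nodes[0]
    return np.clip(1.0 - np.abs(x - nodes[j]) / h, 0.0, None)

def theta(x):
    """The finite element expansion theta(x) = sum_j w_j(x) u_j."""
    return sum(u[j] * hat(j, x) for j in range(len(nodes)))

# At a node the expansion returns that node's unknown; between nodes it
# interpolates linearly, which is why |u_j - u_{j+1}| controls smoothness.
```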
Ill-posed problem
This yields the model y = Ku = K(u) on Γ0, where
K (n × m) is the sensitivity matrix obtained from finite element
solutions of the variational equations
n is the number of observations on Γ0
m is the number of basis elements on the boundary Γ1.
The problem is ill-posed:
Many singular values of K are close to zero
Restriction on the parameters:
Here u_j, u_{j+1} are temperatures of adjacent points, so for all j,
|u_j − u_{j+1}| should be small
Likelihood with skewed error
y = K(u) + ε,
with the ε_i following a skew-t distribution with parameters σ², α, ν, where ν =
degrees of freedom, σ = scale parameter, α = skewness parameter.
Skew-t Distribution
Can be represented as a scale mixture of skew-normals or a
mean-scale mixture of normals
Reduces to the regular Student’s t distribution when the skewness
parameter α = 0
As the degrees of freedom ν become large, it becomes the skew-normal distribution
Hierarchical structure
We use the following representation:
y_i = K(u)_i + Δ z_i + w_i^{−1/2} τ^{1/2} N_i
where
z_i = w_i^{−1/2} |N_{0,i}| and N_{0,i} ∼ N(0, 1)
N_i ∼ N(0, 1) and N_i is independent of N_{0,i}
Δ = σδ and τ = σ²(1 − δ²), with δ = α/√(1 + α²)
and w_i ∼ Gamma(ν/2, ν/2)
Hence, introducing the variable w, we have a scale mixture
representation useful for deriving conditional distributions or moments
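This representation is easy to sanity-check by simulation. The sketch below (assumed parameter values, not the authors' code) draws w_i, z_i and N_i as above and confirms that a positive α yields right-skewed errors.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
sigma, alpha, nu = 1.0, 3.0, 8.0          # assumed values for the check
delta = alpha / np.sqrt(1 + alpha**2)
Delta, tau = sigma * delta, sigma**2 * (1 - delta**2)

w = rng.gamma(nu / 2, 2 / nu, size=n)     # Gamma(nu/2, rate nu/2): scale = 2/nu
z = np.abs(rng.standard_normal(n)) / np.sqrt(w)   # z_i = w_i^{-1/2} |N_{0,i}|
e = Delta * z + np.sqrt(tau / w) * rng.standard_normal(n)  # skew-t errors

skew = np.mean((e - e.mean())**3) / e.std()**3
# Positive alpha gives right-skewed errors; alpha = 0 would recover the
# symmetric Student-t case.
```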
Likelihood and regularization through prior
Likelihood:
p(y|u, z, w, Δ, τ) ∝ ∏_{i=1}^n (w_i^{−1} τ)^{−1/2} exp( −(w_i/2τ)(y_i − K(u)_i − Δ z_i)² ),
Prior on u:
p(u|λ) ∝ λ^m exp( −λ(|u_1| + |u_2 − u_1| + · · · + |u_m − u_{m−1}|) ).
Again, this can be represented as a scale mixture of normals!!!
Prior regularization
Let L be an m × m matrix with L(1, 1) = 1.
For i > 1, the ith row, denoted by L_i, has 1 in the ith entry and −1
in the (i − 1)th entry, and the rest of the elements of L_i are
zero.
Σ_s is an m × m diagonal matrix with diagonal elements s_i²: the local variance
components used for scale mixing
λ²: the global variance component (like the regularization parameter in the
penalized likelihood formulation)
p(u|λ) ∝ ∫ (∏_{j=1}^m s_j)^{−1} exp( −(1/2) uᵀ Lᵀ Σ_s^{−1} L u ) × ∏_{j=1}^m λ² exp( −λ² s_j²/2 ) ds_1² · · · ds_m².
(Bayesian Lasso)
Hierarchical Prior Distribution
p(u|s) ∼ MVN(0, (Lᵀ Σ_s^{−1} L)^{−1}),
p({s_j²}_{j=1}^m | λ²) ∝ ∏_{j=1}^m λ² exp( −λ² s_j²/2 ),
λ² ∼ Gamma(a_1, b_1).
Sparsity via Scale Mixing
Create different sparsity priors by choosing different mixing distributions
for the s_i²
Exponential distribution: double exponential prior or Bayesian Lasso
[Park and Casella, 2008]
Jeffreys prior: Normal/Jeffreys sparse prior [Bae and Mallick, 2004]
Inverse-Gamma distribution: Student-t prior [Tipping, 2001]
Half-Cauchy distribution: Horseshoe prior [Carvalho et al., 2010]
Posterior Distribution
p(u, z, s, w, τ, Δ, λ | y) ∝ ∏_{i=1}^n w_i τ^{−1/2} exp( −w_i z_i²/2 − (w_i/2τ)(y_i − K(u)_i − Δ z_i)² )
× p(w) p(u|s) p(s) p(Δ) p(τ) p(λ).
Posterior sampling: we use an MCMC-based sampling approach
We also use a variational Bayes posterior approximation
Posterior approximation: variational Bayes
Approximate the exact posterior p(θ|Y) by a simpler distribution q(θ),
using the Kullback–Leibler (KL) divergence
Two essential ingredients: a metric and a simplifying assumption
Metric: KL, Hellinger distance, Wasserstein distance
KL(q, p) = ∫ q log(q/p).
Simplifying assumption: mean-field variational family:
q(θ) = ∏_{i=1}^m q_i(θ_i)
Posterior approximation: Variational Bayes
We approximate the posterior under the assumption of conditional
independence:
q(u, z, w, s, Δ, τ, λ) = q_u(u) q_z(z) q_w(w) q_s(s) q_Δ(Δ) q_τ(τ) q_λ(λ).
For a likelihood p(Y|Θ) with parameter Θ and posterior p(Θ|Y), minimize
KL(Q(Θ), p(Θ|Y)), where
Q(Θ) = ∏_{i=1}^s q_i(θ_i) and ∪_{i=1}^s θ_i = Θ
Mean field Variational Updates
Coordinate Ascent Variational Inference (CAVI) (Bishop, 2006)
Iteratively optimizes each factor q_i(θ_i) of the mean-field variational density,
while holding the others fixed
The optimal factor has a closed-form solution:
q_i(θ_i) ∝ exp(E_{−i} log p(Y, Θ)).
In our model, all variational updates have closed forms
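A concrete CAVI instance with closed-form updates is the classic Gaussian mean/precision example from Bishop (2006, Sec. 10.1.3). It is not the skew-t model of this talk, but it shows the same coordinate-ascent pattern: each factor's update uses expectations under the other factor.

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(2.0, 1.0, size=500)        # data with unknown mean, precision
N, xbar = len(x), x.mean()
mu0, l0, a0, b0 = 0.0, 1.0, 1.0, 1.0      # prior: mu|tau ~ N(mu0, (l0 tau)^-1),
                                          #        tau ~ Gamma(a0, b0)
E_tau = 1.0                               # initialize E_q[tau]
for _ in range(50):                       # alternate the two closed-form updates
    # q(mu) = N(mu_N, lam_N^-1), holding q(tau) fixed
    mu_N = (l0 * mu0 + N * xbar) / (l0 + N)
    lam_N = (l0 + N) * E_tau
    E_mu, E_mu2 = mu_N, mu_N**2 + 1 / lam_N
    # q(tau) = Gamma(a_N, b_N), holding q(mu) fixed
    a_N = a0 + (N + 1) / 2
    b_N = b0 + 0.5 * (np.sum(x**2) - 2 * E_mu * (np.sum(x) + l0 * mu0)
                      + (N + l0) * E_mu2 + l0 * mu0**2)
    E_tau = a_N / b_N
# mu_N ends up near the sample mean and E_tau near 1 / sample variance.
```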
Numerical example of Heat Equation
Ω = [0, 1] × [0, 1]; the boundaries Γ1 and Γ0 are taken to be
Γ1 = (0, 1) × {1} and Γ0 = Γ \ Γ1.
u(x1, x2) = sin(πx1) e^{πx2} + x1 + x2 with the source term f = 0.
y = u + ε.
Numerical example
The number of data points is n = 2n1 on {0, 1} × (0, 1), with n1 equally
spaced points on each side of the square domain.
We use m basis functions on the boundary Γ1
The skewness parameter δ := α/√(1 + α²) = 0.8
We use n1 = 80 and m = 41
Numerical example
(a) Posterior mean of λ1 = λ2 (b) Inverse solution
The left panel shows a typical fit: the solid line shows the true values, the dotted line gives the estimates, and the dashed line
shows the fit if we ignore the skewness part. The right panel shows a typical fit: the red, blue and black lines show the MCMC
estimate, the variational Bayes estimate, and the true temperature, respectively.
Figure: (a) posterior density q(∆); (b) posterior density q(τ⁻¹)
Table 2
Table: A comparison of the computational cost between the variational
approximation and MCMC. The computational times are given in seconds.
Method        m = 40, n1 = 80   m = 60, n1 = 120   m = 80, n1 = 160
Variational   0.82              1.96               4.20
MCMC          243.20            529.27             858
Comments
Variational Bayes performs similarly to MCMC in estimation
Variational Bayes is several hundred times faster than MCMC
Nonlinear Problems
Many practical inverse problems are described by nonlinear forward
models
First-order Taylor expansion and the idea of recursive
linearization
Adding further structure to develop a fast solver
Porous Media Flow
Fluid Flow in Porous Media
Studying the flow of liquids (ground water, oil) in an aquifer (reservoir)
Applications: oil production, contaminant cleanup
Forward model: models the flow of liquid; the output is the production data,
the inputs are physical characteristics like permeability and porosity
Inverse problem: inferring the permeability from the flow
Permeability Field
Primary parameter of interest is the permeability field
Permeability is a measure of how easily liquid flows through the aquifer at
that point
These permeability values vary over space
Effective recovery procedures rely on good permeability estimates, as one
must be able to identify high permeability channels and low permeability
barriers
Fluid Flow
Fluid flow in porous media takes place due to pressure differences
This is achieved by injecting water (or other fluid) in the reservoir through
the injection wells
Fluid is driven toward the producing well due to diminished reservoir
pressure
Reservoirs having high permeability would produce with relative ease and
better recovery is usually expected
Estimation in porous media flow
Single phase flow:
−∇ · (κ(x, µ) ∇U(x, µ)) = f in Ω_x.
Input parameter: the permeability κ(x, µ) is a spatial field which is unknown.
The solution U could be fractional flow (water-cut data) or the pressure field.
Y = U_obs = U(κ) + ε_f,  ε_f ∼ N(0, σ_f²).
Given observations Y with noise, we want to estimate κ.
An illustration: porous media flow
Left: a log permeability field. Right: water-cut (proportion of water) vs the normalized time.
Observation Location and κ
Figure: observation locations and the true field
Parametrization of κ using KLE
The unknown spatial field can be modeled as a realization of a Gaussian random
field:
log(κ(x, ω̃)) = Σ_{k=1}^∞ √λ_k θ_k(ω̃) Φ_k(x),  θ_k ∼ N(0, 1).
The underlying covariance kernel is given by R(x, y) with
∫ R(x, y) φ_k(x) dx = λ_k φ_k(y);  e.g. R(x, y) = σ_κ² exp(−‖x − y‖²/(2l²)) (isotropic)
Our goal is to find the posterior of κ: Π(κ|Y)
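The truncated KLE can be sketched on a grid by eigendecomposing the covariance matrix; this is a grid-level approximation with assumed σ², l and N2 values, in which the discrete eigenpairs stand in for (λ_k, Φ_k).

```python
import numpy as np

rng = np.random.default_rng(5)
n, sigma2, ell, N2 = 100, 1.0, 0.2, 30    # assumed grid and kernel settings

x = np.linspace(0, 1, n)
# Isotropic covariance R(x, y) = sigma^2 exp(-|x - y|^2 / (2 l^2)) on the grid.
R = sigma2 * np.exp(-(x[:, None] - x[None, :])**2 / (2 * ell**2))
lam, phi = np.linalg.eigh(R)                       # ascending eigenvalues
lam, phi = lam[::-1][:N2], phi[:, ::-1][:, :N2]    # keep the N2 largest modes

theta = rng.standard_normal(N2)                    # theta_k ~ N(0, 1)
log_kappa = phi @ (np.sqrt(np.clip(lam, 0, None)) * theta)  # clip tiny negatives
kappa = np.exp(log_kappa)                          # one permeability draw
frac = lam.sum() / np.trace(R)                     # prior variance captured
```

With a smooth kernel the eigenvalues decay fast, so a modest N2 captures nearly all of the prior variance, which is what makes the KLE an effective dimension reduction.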
Reduction under separability
Separating the spatial and parametric part: U = N1
i=1 ai (µ)vi (x).
Here κ is parametrized by finite dimensional µ = {θ1, . . . , θN2 }. We use
N2 terms in KLE.
We impose ai (µ) = N2
j=1 ai,j (θj ).
U =
Nterm
i=1


N2
j=1
ai,j (θj )

 vi (x).
Assume, ai,j (.) ∼ G P(0, K ); where K is Covariance Kernel.
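The computational point of the separability assumption can be shown structurally. In this sketch the 1-D maps a_{i,j} and spatial modes v_i are made up (in the paper the a_{i,j} come from GP regression), but evaluating U for a new θ costs only N1·N2 cheap 1-D evaluations plus a weighted sum, with no PDE solve.

```python
import numpy as np

N1, N2 = 3, 2
x = np.linspace(0, 1, 50)
# Hypothetical spatial modes v_i(x).
v = [np.sin((i + 1) * np.pi * x) for i in range(N1)]

# Hypothetical fitted 1-D maps a_ij(theta_j); linear here for simplicity.
a = [[(lambda t, i=i, j=j: (i + 1) * t + 0.1 * j) for j in range(N2)]
     for i in range(N1)]

def U(theta):
    """Separable surrogate: U = sum_i [ sum_j a_ij(theta_j) ] v_i(x)."""
    coeff = [sum(a[i][j](theta[j]) for j in range(N2)) for i in range(N1)]
    return sum(c * vi for c, vi in zip(coeff, v))

u_new = U(np.array([0.5, -1.0]))   # instant evaluation at a new theta
```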
Hierarchical Model
Y = U(κ) + ε_f,  ε_f ∼ N(0, σ_f²),
U = Σ_{i=1}^{N1} a_i(µ) v_i(x),
a_i(µ) = Σ_{j=1}^{N2} a_{i,j}(θ_j),
a_{i,j}(·) ∼ GP(0, K),
log(κ(x, ω̃)) = Σ_{k=1}^∞ √λ_k θ_k(ω̃) Φ_k(x),  θ_k ∼ N(0, 1).
Due to the separability structure, it is possible to develop a variational
Bayes algorithm
Posterior Property
Let H be the probability distribution of the observation locations on the
domain Ω_x. We show:
Theorem
For any neighborhood U of f*_x(y) (the true density) given as
∫_{Ω_x} ∫_{Ω_y} |f_x(y) − f*_x(y)| dy dH_x < ε, we have Π(U | data) → 1 with
probability 1 as N (the number of observations) → ∞.
Example
An isotropic covariance kernel is used for the KLE and to generate the
permeability field
N1 = 50, N2 = 30
For the a_{i,j}'s, the Gaussian process covariance kernel is a Gaussian
kernel
Example
Figure (three panels): true field, variational Bayes solution, MCMC solution
Posterior coefficient in variational Bayes
Figure: posterior and prior density for a coefficient (θ3) of the KLE parametrization, with the posterior mean marked.
Variational Bayes convergence
Figure: convergence of the KLE coefficients over time. The x axis shows the number of iterations
and the y axis shows the values of the coefficients θ_1, . . . , θ_{N2}.
Introduction
Figure: subsurface characterization workflow, combining multiple high resolution
geologic models, rock/fluid physics, seismic input, time lapse seismic,
multiphase production data, permanent downhole sensors, fast flow simulation,
and data assimilation & model calibration into reservoir prediction.
Subsurface characterization is to identify subsurface properties,
taking into account various data sources and their precisions, and to
predict future reservoir performance.
Spatial fields with channelized structure
Geologic environments often contain distinct geologic facies.
Sharp contrasts occur across facies boundaries.
The orientation and geometry of the channels determine the fluid flow.
Non-Gaussian Field
The permeability field is non-Gaussian
It can be decomposed into subregions where each region represents a facies
Within each facies, the permeability field will be modeled using a
log-Gaussian process
Hence we can express it as k(x) = Σ_i I_{D_i}(x) k_i(x), where I_D is an indicator
function of the region D (i.e. I_D(x) = 1 if x ∈ D and I_D(x) = 0 otherwise)
We estimate the boundaries of the facies (interfaces) adaptively.
Non-Gaussian Field
The boundaries are represented by piecewise linear functions
The locations and number of pieces are unknown
That way it is capable of reproducing a wide variety of channel geometries
The shape of the channel boundaries is updated using the data
Reversible jump MCMC is required to do so
Permeability samples
Figure: permeability comparison of the true model, the initial model, and 4
updated models. After history matching, we could change the channel
connectivity.
Conclusion
There are alternative fast posterior approximation or computation
methods (ABC, multistage preconditioned MCMC, Hamiltonian Monte
Carlo)
Fast Emulators to replace expensive simulators
Integration of Data with Science
Conclusion
Production Data from unconventional reservoir (shale reservoirs)
Decline Curve analysis: oil production over time (Weibull Model, Arps
Hyperbolic Model, Duong Model)
How to integrate these models with PDEs (physical Laws)?
Integration of Machine learning methods with PDEs
References
Yang, K., Guha, N., Efendiev, Y. and Mallick, B. (2017), Bayesian and
variational Bayes approaches for flows in heterogeneous random media,
Journal of Computational Physics, 345, 275-293.
Guha, N., Wu, X., Efendiev, Y., Jin, B. and Mallick, B. (2015), A
variational approach for inverse problems with skew-t error distributions,
Journal of Computational Physics, 301, 377-393.
Efendiev, Y., Leung, W. T., Cheung, S. W., Guha, N., Hoang, V. H. and
Mallick, B. (2017), Bayesian multiscale finite element methods: modeling
missing subgrid information probabilistically, International Journal for
Multiscale Computational Engineering, 2, 1, 175-197.
References
Mondal, A., Mallick, B., Efendiev, Y. and Datta-Gupta, A. (2014),
Bayesian uncertainty quantification for subsurface inversion using a
multiscale hierarchical model, Technometrics, 56, 3, 381-392.
Chakraborty, A., Bingham, D., Dhavala, S., Kuranz, C., Drake, P.,
Grosskopf, M., Rutter, E., Holloway, J., McClarren, R. and Mallick, B.
(2017), Emulation of numerical models with over-specified basis
functions, Technometrics, 59, 2, 153-164.
Chakraborty, A., Mallick, B., McClarren, R., Kuranz, C. and Drake, P.
(2013), Spline-based emulators for radiative shock experiments with
measurement error, Journal of the American Statistical Association, 108,
411-428.

Causal Inference Opening Workshop - Bracketing Bounds for Differences-in-Diff...
The Statistical and Applied Mathematical Sciences Institute
 
Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...
Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...
Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...
The Statistical and Applied Mathematical Sciences Institute
 

More from The Statistical and Applied Mathematical Sciences Institute (20)

Causal Inference Opening Workshop - Latent Variable Models, Causal Inference,...
Causal Inference Opening Workshop - Latent Variable Models, Causal Inference,...Causal Inference Opening Workshop - Latent Variable Models, Causal Inference,...
Causal Inference Opening Workshop - Latent Variable Models, Causal Inference,...
 
2019 Fall Series: Special Guest Lecture - 0-1 Phase Transitions in High Dimen...
2019 Fall Series: Special Guest Lecture - 0-1 Phase Transitions in High Dimen...2019 Fall Series: Special Guest Lecture - 0-1 Phase Transitions in High Dimen...
2019 Fall Series: Special Guest Lecture - 0-1 Phase Transitions in High Dimen...
 
Causal Inference Opening Workshop - Causal Discovery in Neuroimaging Data - F...
Causal Inference Opening Workshop - Causal Discovery in Neuroimaging Data - F...Causal Inference Opening Workshop - Causal Discovery in Neuroimaging Data - F...
Causal Inference Opening Workshop - Causal Discovery in Neuroimaging Data - F...
 
Causal Inference Opening Workshop - Smooth Extensions to BART for Heterogeneo...
Causal Inference Opening Workshop - Smooth Extensions to BART for Heterogeneo...Causal Inference Opening Workshop - Smooth Extensions to BART for Heterogeneo...
Causal Inference Opening Workshop - Smooth Extensions to BART for Heterogeneo...
 
Causal Inference Opening Workshop - A Bracketing Relationship between Differe...
Causal Inference Opening Workshop - A Bracketing Relationship between Differe...Causal Inference Opening Workshop - A Bracketing Relationship between Differe...
Causal Inference Opening Workshop - A Bracketing Relationship between Differe...
 
Causal Inference Opening Workshop - Testing Weak Nulls in Matched Observation...
Causal Inference Opening Workshop - Testing Weak Nulls in Matched Observation...Causal Inference Opening Workshop - Testing Weak Nulls in Matched Observation...
Causal Inference Opening Workshop - Testing Weak Nulls in Matched Observation...
 
Causal Inference Opening Workshop - Difference-in-differences: more than meet...
Causal Inference Opening Workshop - Difference-in-differences: more than meet...Causal Inference Opening Workshop - Difference-in-differences: more than meet...
Causal Inference Opening Workshop - Difference-in-differences: more than meet...
 
Causal Inference Opening Workshop - New Statistical Learning Methods for Esti...
Causal Inference Opening Workshop - New Statistical Learning Methods for Esti...Causal Inference Opening Workshop - New Statistical Learning Methods for Esti...
Causal Inference Opening Workshop - New Statistical Learning Methods for Esti...
 
Causal Inference Opening Workshop - Bipartite Causal Inference with Interfere...
Causal Inference Opening Workshop - Bipartite Causal Inference with Interfere...Causal Inference Opening Workshop - Bipartite Causal Inference with Interfere...
Causal Inference Opening Workshop - Bipartite Causal Inference with Interfere...
 
Causal Inference Opening Workshop - Bridging the Gap Between Causal Literatur...
Causal Inference Opening Workshop - Bridging the Gap Between Causal Literatur...Causal Inference Opening Workshop - Bridging the Gap Between Causal Literatur...
Causal Inference Opening Workshop - Bridging the Gap Between Causal Literatur...
 
Causal Inference Opening Workshop - Some Applications of Reinforcement Learni...
Causal Inference Opening Workshop - Some Applications of Reinforcement Learni...Causal Inference Opening Workshop - Some Applications of Reinforcement Learni...
Causal Inference Opening Workshop - Some Applications of Reinforcement Learni...
 
Causal Inference Opening Workshop - Bracketing Bounds for Differences-in-Diff...
Causal Inference Opening Workshop - Bracketing Bounds for Differences-in-Diff...Causal Inference Opening Workshop - Bracketing Bounds for Differences-in-Diff...
Causal Inference Opening Workshop - Bracketing Bounds for Differences-in-Diff...
 
Causal Inference Opening Workshop - Assisting the Impact of State Polcies: Br...
Causal Inference Opening Workshop - Assisting the Impact of State Polcies: Br...Causal Inference Opening Workshop - Assisting the Impact of State Polcies: Br...
Causal Inference Opening Workshop - Assisting the Impact of State Polcies: Br...
 
Causal Inference Opening Workshop - Experimenting in Equilibrium - Stefan Wag...
Causal Inference Opening Workshop - Experimenting in Equilibrium - Stefan Wag...Causal Inference Opening Workshop - Experimenting in Equilibrium - Stefan Wag...
Causal Inference Opening Workshop - Experimenting in Equilibrium - Stefan Wag...
 
Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...
Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...
Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...
 
Causal Inference Opening Workshop - Bayesian Nonparametric Models for Treatme...
Causal Inference Opening Workshop - Bayesian Nonparametric Models for Treatme...Causal Inference Opening Workshop - Bayesian Nonparametric Models for Treatme...
Causal Inference Opening Workshop - Bayesian Nonparametric Models for Treatme...
 
2019 Fall Series: Special Guest Lecture - Adversarial Risk Analysis of the Ge...
2019 Fall Series: Special Guest Lecture - Adversarial Risk Analysis of the Ge...2019 Fall Series: Special Guest Lecture - Adversarial Risk Analysis of the Ge...
2019 Fall Series: Special Guest Lecture - Adversarial Risk Analysis of the Ge...
 
2019 Fall Series: Professional Development, Writing Academic Papers…What Work...
2019 Fall Series: Professional Development, Writing Academic Papers…What Work...2019 Fall Series: Professional Development, Writing Academic Papers…What Work...
2019 Fall Series: Professional Development, Writing Academic Papers…What Work...
 
2019 GDRR: Blockchain Data Analytics - Machine Learning in/for Blockchain: Fu...
2019 GDRR: Blockchain Data Analytics - Machine Learning in/for Blockchain: Fu...2019 GDRR: Blockchain Data Analytics - Machine Learning in/for Blockchain: Fu...
2019 GDRR: Blockchain Data Analytics - Machine Learning in/for Blockchain: Fu...
 
2019 GDRR: Blockchain Data Analytics - QuTrack: Model Life Cycle Management f...
2019 GDRR: Blockchain Data Analytics - QuTrack: Model Life Cycle Management f...2019 GDRR: Blockchain Data Analytics - QuTrack: Model Life Cycle Management f...
2019 GDRR: Blockchain Data Analytics - QuTrack: Model Life Cycle Management f...
 

Recently uploaded

Making and Justifying Mathematical Decisions.pdf
Making and Justifying Mathematical Decisions.pdfMaking and Justifying Mathematical Decisions.pdf
Making and Justifying Mathematical Decisions.pdf
Chris Hunter
 

Recently uploaded (20)

Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..
 
Key note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdfKey note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdf
 
This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.
 
Class 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdfClass 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdf
 
PROCESS RECORDING FORMAT.docx
PROCESS      RECORDING        FORMAT.docxPROCESS      RECORDING        FORMAT.docx
PROCESS RECORDING FORMAT.docx
 
Unit-V; Pricing (Pharma Marketing Management).pptx
Unit-V; Pricing (Pharma Marketing Management).pptxUnit-V; Pricing (Pharma Marketing Management).pptx
Unit-V; Pricing (Pharma Marketing Management).pptx
 
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
 
Measures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and ModeMeasures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and Mode
 
Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"
Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"
Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
 
Ecological Succession. ( ECOSYSTEM, B. Pharmacy, 1st Year, Sem-II, Environmen...
Ecological Succession. ( ECOSYSTEM, B. Pharmacy, 1st Year, Sem-II, Environmen...Ecological Succession. ( ECOSYSTEM, B. Pharmacy, 1st Year, Sem-II, Environmen...
Ecological Succession. ( ECOSYSTEM, B. Pharmacy, 1st Year, Sem-II, Environmen...
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introduction
 
Holdier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfHoldier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdf
 
ICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptxICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptx
 
Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104
 
Unit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptxUnit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptx
 
Making and Justifying Mathematical Decisions.pdf
Making and Justifying Mathematical Decisions.pdfMaking and Justifying Mathematical Decisions.pdf
Making and Justifying Mathematical Decisions.pdf
 
Grant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingGrant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy Consulting
 
Measures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDMeasures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SD
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impact
 

MUMS Opening Workshop - Hierarchical Bayesian Models for Inverse Problems and Uncertainty Quantification - Bani Mallick, August 20, 2018

  • 1. Hierarchical Bayesian Models for Inverse Problems and Uncertainty Quantification Bani K Mallick Department of Statistics Texas A&M University Collaborators: Nilbja Guha, Yalchin Efendiev, Bangti Jin, Keren Yang Bani K Mallick Bayesian Inverse problems and UQ 1 / 59
  • 2. Model, Method, Computation Massive data and tremendous scale of computational power Ability to use highly heterogeneous data sets Possibility of modeling more complex structures and systems Making sense of data, in the context of the modeling of complex systems, is a challenging task Bani K Mallick Bayesian Inverse problems and UQ 2 / 59
  • 3. Statistics, Applied Maths, Computation Statistics provides a rational basis for the analysis of data In many application areas, there is an enormous amount of information in the form of mathematical models, often developed over decades or centuries Applied Maths and Stat are required to work in concert Combination of data-driven methods as well as systematic predictive tools (often in the form of partial differential equations) Bani K Mallick Bayesian Inverse problems and UQ 3 / 59
  • 4. Computer Experiment No need for expensive lab equipment and materials, less costly than a physical experiment Not affected by human and environmental factors Study dangerous or infeasible physical experiments
  • 5. Simulation • Experiments are very expensive • Mathematical models (huge number of partial differential equations) to predict radiative shock behavior • Algorithm or code is available and simulation can be done from this system • This algorithm needs some input parameters which may need to be calibrated
  • 6. Mathematical Model • A mathematical model for a physical experiment is a set of equations which relate inputs to outputs • Inputs represent variables that can be adjusted before the experiment takes place • Outputs represent quantities which can be measured as a result of the experiment • The forward problem refers to using the mathematical model to predict the output of an experiment from a given input
  • 7. Inverse Problem • Using the mathematical model to make inference about input(s) to the mathematical model based on the output data • The output data will be noisy • Observations may be limited in number relative to the dimension or complexity of the model space • Inverse problems are ill-posed in the sense that small perturbations in the data may lead to large errors in the inversion estimates
  • 8. Inverse problem Inverse problems arise in different branches of science and engineering: Petroleum Engineering, Aerospace Engineering, Earth Science. A physical system K(u) is parametrized by the input parameter u. We observe y = K(u) + ε. Based on the observation, we want to estimate u and quantify the uncertainty Bani K Mallick Bayesian Inverse problems and UQ 4 / 59
  • 9. Inverse Problem Often K(u) is determined by a series of ordinary/partial differential equations (ODE/PDE). Forward problem: u → K(u) → K(u) + error = y. Inverse problem: y → u. Bani K Mallick Bayesian Inverse problems and UQ 5 / 59
  • 10. Examples Porous media characterization: Characterize the flow of water/oil in reservoir in petroleum engineering The flow depends on the underlying permeability of the media A highly varying spatial parameter Heat distribution on the boundary of an object: Infer about the temperature of the boundary from available data on some accessible part Re-entrance of space ship where heat sensors are in the accessible part of the boundary Bani K Mallick Bayesian Inverse problems and UQ 6 / 59
  • 11. Large scale inverse problems and UQ High dimensional parameter space and large amount of data • Expensive forward model • Solution of this problem as well as quantification of uncertainty
  • 12. Center for Radiative shock hydrodynamics • Supernova: explosion of massive supergiant stars, happens once a century in a galaxy • Most energetic explosion in nature, equivalent to the power in a 10^28 megaton bomb (a few octillion nuclear warheads) • In 1987, there was a supernova explosion in the Large Magellanic Cloud, a companion galaxy to the Milky Way.
  • 13. Before and after pictures of Supernova 1987
  • 14. Shock waves • A supernova creates radiative shock waves which pass through a regime in which the shock layer collapses in space because of radiative energy losses. The shock wave’s powerful punch creates a spectacular light show
  • 15. Experiment The shock waves emerging from supernovae and the shock waves produced in laboratory experiments have specific similarities. We would like to investigate the effect of this shock on materials
  • 16. Overview of the talk Bayesian approaches to Inverse problems Hierarchical Bayesian models Structural assumptions at different stages of the Hierarchy for flexibility, Sparsity or computational efficiency Heat equation with asymmetric heavy tailed error (linear) A separation of variable approach for porous media flow (non-linear) Bani K Mallick Bayesian Inverse problems and UQ 8 / 59
  • 17. Inverse Problem y = K(u) + ε. p(y|u) ∝ exp{−(y − K(u))²/(2σ²)} if ε ∼ N(0, σ²). Ill-posed problem: even if K(u1) − K(u2) is small, u1 − u2 can be large. The least squares or maximum-likelihood (MLE) estimate may produce an unstable solution: u_mle = argmax_u p(y|u). A regularized estimate can be helpful Bani K Mallick Bayesian Inverse problems and UQ 9 / 59
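The stabilizing effect of regularization described on this slide can be sketched with a toy linear problem (a minimal illustration, not code from the talk; the operator K, its singular-value decay, and all constants below are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
n = m = 40

# Made-up ill-conditioned forward operator: singular values decay to ~1e-8
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((m, m)))
s = np.logspace(0, -8, m)
K = (U * s) @ V.T

u_true = np.sin(np.linspace(0, np.pi, m))
y = K @ u_true + 1e-6 * rng.standard_normal(n)   # small data noise

# Plain least squares amplifies the noise by up to 1/s_min
u_ls = np.linalg.lstsq(K, y, rcond=None)[0]

# Tikhonov (ridge) regularization damps the unstable directions
lam = 1e-4
u_ridge = np.linalg.solve(K.T @ K + lam * np.eye(m), K.T @ y)

err_ls = np.linalg.norm(u_ls - u_true)
err_ridge = np.linalg.norm(u_ridge - u_true)
```

The ridge penalty plays the same role as the prior in the Bayesian formulation on the following slides: a small bias is traded for a large reduction in variance.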
  • 18. Bayesian solution Bayesian method provides a natural framework, through regularization by a prior on u. Likelihood: p(y|u) Prior: π(u) Posterior: Π(u|y) ∝ p(y|u)π(u) Bani K Mallick Bayesian Inverse problems and UQ 10 / 59
  • 19. Bayesian solution Bayesian methods provide regularization through prior Can incorporate non-Gaussianity, non linearity Can incorporate physical constraints through prior Simulation and approximation based computational approaches are available Bani K Mallick Bayesian Inverse problems and UQ 11 / 59
  • 20. Likelihood Calculation y = K(u) + ε. p(y|u) ∝ exp{−(y − K(u))²/(2σ²)} if ε ∼ N(0, σ²). It is like a black-box likelihood which we can't write analytically, although we do have a code K that will compute it. We need to run K to compute the likelihood, which is expensive. Hence, no hope of having any conjugacy in the model, other than for the error variance in the likelihood. Need to be somewhat intelligent about the update steps during MCMC so that we do not spend too much time computing likelihoods for poor candidates. Bani K Mallick Bayesian Inverse problems and UQ 12 / 59
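A random-walk Metropolis step with a cached log-posterior, so each iteration costs exactly one run of the expensive code K, can be sketched as follows (the one-parameter forward model here is a made-up stand-in, not the talk's solver):

```python
import numpy as np

rng = np.random.default_rng(1)

def forward(u):
    # Stand-in for an expensive black-box solver K(u)
    return np.sin(u) + 0.5 * u

y_obs = forward(1.3) + 0.05 * rng.standard_normal()
sigma2 = 0.05**2

def log_post(u):
    # Gaussian likelihood + N(0,1) prior; one forward solve per call
    return -(y_obs - forward(u))**2 / (2 * sigma2) - u**2 / 2

u, lp = 0.0, log_post(0.0)               # cache the current log-posterior
samples = []
for _ in range(5000):
    u_prop = u + 0.3 * rng.standard_normal()
    lp_prop = log_post(u_prop)           # the only forward solve this iteration
    if np.log(rng.uniform()) < lp_prop - lp:
        u, lp = u_prop, lp_prop
    samples.append(u)

post_mean = np.mean(samples[1000:])
```

Caching `lp` means a rejected proposal wastes only the single forward solve spent evaluating it; smarter proposal mechanisms (e.g. cheap surrogates that pre-screen candidates) reduce even that cost.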
  • 21. Increasing computational efficiency Reducing the dimension: Karhunen–Loeve expansion (KLE) Variational Bayesian method: approximate the posterior through separability Separable solution: accelerate the MCMC through a fast forward solution Bani K Mallick Bayesian Inverse problems and UQ 13 / 59
  • 22. Cases we discuss A linearized forward model with heavy tailed asymmetric error We provide a regularized solution with appropriate prior We derive posterior approximation techniques and related results A non linear forward model where a numerical forward solver is needed We derive an accelerated MCMC technique based on separation of variables We derive an efficient posterior approximation method Bani K Mallick Bayesian Inverse problems and UQ 14 / 59
  • 23. A linearized case with asymmetric error Bani K Mallick Bayesian Inverse problems and UQ 15 / 59
  • 24. Heat equation with asymmetric error (Guha et al., 2014) Domain Ω with boundary Γ = Γ0 ∪ Γ1, where Γ0 and Γ1 are the accessible and inaccessible parts. The temperature field u and the flux are observed over the subset Γ0. −∆u = f in Ω; u = g on Γ0 and ∂u/∂n = q on Γ0, where f is the source term, n is the unit outward normal direction to the boundary, and ∆ = ∂²/∂x² + ∂²/∂y². Bani K Mallick Bayesian Inverse problems and UQ 16 / 59
  • 25. Numerical solution The inverse problem is to estimate θ = u on the inaccessible boundary Γ1 from q and measured data g on a subset of Γ0. The observation error is potentially skewed and heavy tailed. Using the finite element method, the unknown temperature θ is parameterized over the boundary Γ1 as θ(x) = ∑_{j=1}^m w_j(x) u_j, where the w_j(x) are finite element basis functions defined on Γ1, m is the number of basis functions, and the u_j are the unknowns to be estimated. Bani K Mallick Bayesian Inverse problems and UQ 17 / 59
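The expansion θ(x) = ∑_j w_j(x) u_j can be illustrated with piecewise-linear ("hat") basis functions on a 1D boundary (a sketch with made-up nodal values, not the talk's finite element code):

```python
import numpy as np

# Hat basis on [0,1]: theta(x) = sum_j w_j(x) * u_j
nodes = np.linspace(0.0, 1.0, 5)          # m = 5 basis functions

def hat(j, x):
    # w_j is 1 at node j, 0 at the other nodes, linear in between
    h = nodes[1] - nodes[0]
    return np.clip(1.0 - np.abs(x - nodes[j]) / h, 0.0, None)

u = np.array([0.0, 1.0, 0.5, 2.0, 1.0])   # made-up nodal temperatures
x = np.linspace(0, 1, 101)
theta = sum(u[j] * hat(j, x) for j in range(len(nodes)))
```

Each coefficient u_j is simply the temperature at node j, which is what makes the adjacency constraint |u_j − u_{j+1}| on a later slide natural.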
  • 26. Ill-posed problem This yields the model y = Ku = K(u) on Γ0, where K (n × m) is the sensitivity matrix obtained from finite element solutions of the variational equations, n is the number of observations on Γ0, and m is the number of basis elements on the boundary Γ1. The problem is ill-posed: many singular values of K are close to zero. Restriction on the parameters: here u_j, u_{j+1} are temperatures of adjacent points, so |u_j − u_{j+1}| should be small for all j Bani K Mallick Bayesian Inverse problems and UQ 18 / 59
  • 27. Likelihood with skewed error y = K(u) + ε, with the ε_i's following a skew-t distribution with parameters σ², α, ν, where ν = degrees of freedom, σ = scale parameter, α = skewness parameter. Bani K Mallick Bayesian Inverse problems and UQ 19 / 59
  • 28. Skew-t Distribution Can be represented as a scale mixture of skew-normals or a mean-scale mixture of normals. Reduces to the regular Student's t distribution when the skewness parameter α = 0. For large degrees of freedom ν, it becomes the skew-normal distribution Bani K Mallick Bayesian Inverse problems and UQ 20 / 59
  • 29. Hierarchical structure We use the following representation: y_i = K(u)_i + ∆z_i + w_i^{−1/2} τ^{1/2} N_i, where z_i = w_i^{−1/2} |N_{0,i}| and N_{0,i} ∼ N(0, 1); N_i ∼ N(0, 1) with N_i independent of N_{0,i}; ∆ = σδ and τ = σ²(1 − δ²), with δ = α/√(1 + α²) and w_i ∼ Gamma(ν/2, ν/2). Hence, introducing the variable w, we have a scale mixture representation useful for deriving conditional distributions or moments Bani K Mallick Bayesian Inverse problems and UQ 21 / 59
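This hierarchical representation is easy to check by simulation: drawing (w_i, z_i, N_i) as on the slide and assembling the error should produce a right-skewed, heavy-tailed sample when α > 0 (the parameter values below are made up):

```python
import numpy as np

rng = np.random.default_rng(2)
nu, sigma, alpha = 8.0, 1.0, 5.0          # heavy-tailed, right-skewed
delta = alpha / np.sqrt(1 + alpha**2)
Delta, tau = sigma * delta, sigma**2 * (1 - delta**2)

n = 200_000
w = rng.gamma(nu / 2, 2 / nu, size=n)      # w_i ~ Gamma(shape nu/2, rate nu/2)
z = np.abs(rng.standard_normal(n)) / np.sqrt(w)   # z_i = w_i^{-1/2} |N_{0,i}|
eps = Delta * z + np.sqrt(tau / w) * rng.standard_normal(n)

# Right skew: the mean exceeds the median when alpha > 0
skew_gap = eps.mean() - np.median(eps)
```

The same augmentation is what makes the Gibbs and variational updates later in the talk tractable: conditionally on w and z, the model is Gaussian.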
  • 30. Likelihood and regularization through prior Likelihood: p(y|u, z, w, ∆, τ) ∝ ∏_{i=1}^n (w_i^{−1} τ)^{−1/2} exp{−(w_i/(2τ))(y_i − K(u)_i − ∆z_i)²}. Prior on u: p(u|λ) ∝ λ^m exp(−λ(|u_1| + |u_2 − u_1| + · · · + |u_m − u_{m−1}|)). Again, this can be represented as a scale mixture of normals! Bani K Mallick Bayesian Inverse problems and UQ 22 / 59
  • 31. Prior regularization Let L be an m × m matrix with L(1, 1) = 1. For i > 1, the ith row, denoted L_i, has 1 in the ith entry and −1 in the (i − 1)th entry; the rest of the elements of L_i are zero. Bani K Mallick Bayesian Inverse problems and UQ 23 / 59
  • 32. Σ_s is an m × m diagonal matrix with diagonal elements s_j²: the local variance components used for scale mixing. λ²: the global variance component (like the regularization parameter in the penalized likelihood formulation). p(u|λ) ∝ ∫ (∏_{j=1}^m s_j)^{−1} exp{−(1/2) uᵀ Lᵀ Σ_s^{−1} L u} × ∏_{j=1}^m λ² exp{−λ² s_j²/2} ds_1² · · · ds_m² (Bayesian Lasso) Bani K Mallick Bayesian Inverse problems and UQ 24 / 59
  • 33. Hierarchical Prior Distribution p(u|s) ∼ MVN(0, (Lᵀ Σ_s^{−1} L)^{−1}), p({s_j²}_{j=1}^m | λ²) ∝ ∏_{j=1}^m λ² exp{−λ² s_j²/2}, λ² ∼ Gamma(a_1, b_1). Bani K Mallick Bayesian Inverse problems and UQ 25 / 59
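The prior precision Lᵀ Σ_s⁻¹ L from these slides can be assembled directly; the quadratic form recovers the penalty on |u_1| and on the jumps |u_j − u_{j−1}| (a small sketch with made-up values of s_j² and u):

```python
import numpy as np

# First-difference matrix L from the slide: L[0,0] = 1;
# row i has 1 at position i and -1 at position i-1
m = 6
L = np.eye(m) - np.diag(np.ones(m - 1), -1)

# With local variances s_j^2, the prior precision is L^T diag(1/s^2) L,
# penalizing |u_1| and the jumps |u_j - u_{j-1}|
s2 = np.full(m, 0.5)
prec = L.T @ np.diag(1.0 / s2) @ L

u = np.array([1.0, 1.1, 1.2, 1.3, 1.4, 1.5])
quad = u @ prec @ u        # = u_1^2/s_1^2 + sum_j (u_j - u_{j-1})^2 / s_j^2
```

Note that the precision is available in closed form, while the covariance requires inverting it, which is why samplers work with Lᵀ Σ_s⁻¹ L directly.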
  • 34. Sparsity via Scale Mixing Create different sparsity priors by choosing different mixing distributions for s_j²: Exponential distribution: double exponential prior or Bayesian Lasso [Park and Casella, 2008]; Jeffreys prior: normal/Jeffreys sparse prior [Bae and Mallick, 2004]; Inverse-gamma distribution: Student-t prior [Tipping, 2001]; Half-Cauchy distribution: horseshoe prior [Carvalho et al., 2010] Bani K Mallick Bayesian Inverse problems and UQ 26 / 59
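Two of the mixing choices listed here can be compared by direct simulation: an exponential mixing density on s_j² yields the double-exponential (Bayesian Lasso) marginal, while a half-Cauchy on s_j yields the much heavier-tailed horseshoe (a sketch; the sample sizes and λ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
lam = 1.0

# Bayesian Lasso: u | s^2 ~ N(0, s^2), s^2 ~ Exp(rate lam^2/2)  =>  u ~ Laplace
s2 = rng.exponential(2 / lam**2, n)
u_lasso = np.sqrt(s2) * rng.standard_normal(n)

# Horseshoe: s ~ half-Cauchy(0, 1)  =>  far heavier tails than the Laplace
s_hs = np.abs(rng.standard_cauchy(n))
u_hs = s_hs * rng.standard_normal(n)

var_lasso = u_lasso.var()                 # Laplace variance is 2/lam^2
tail_hs = np.mean(np.abs(u_hs) > 10)      # horseshoe tail mass
tail_lasso = np.mean(np.abs(u_lasso) > 10)
```

The heavier the mixing tail, the more the marginal prior concentrates near zero while still tolerating occasional large coefficients, which is the behavior exploited by the sparsity priors cited on the slide.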
  • 35. Posterior Distribution p(u, z, s, w, τ, ∆, λ|y) ∝ ∏_{i=1}^n (w_i τ^{−1})^{1/2} exp{−w_i z_i²/2 − (w_i/(2τ))(y_i − K(u)_i − ∆z_i)²} × p(w)p(u|s)p(s)p(∆)p(τ)p(λ). Posterior sampling: we use an MCMC-based sampling approach and a variational Bayes posterior approximation Bani K Mallick Bayesian Inverse problems and UQ 27 / 59
  • 36. Posterior approximation: variational Bayes Approximate the exact posterior p(θ|Y) by a simpler distribution q(θ) using the Kullback–Leibler (KL) divergence. Two essential ingredients: a metric and a simplifying assumption. Metric: KL, Hellinger distance, Wasserstein distance; KL(q, p) = ∫ q log(q/p). Simplifying assumption: mean-field variational family q(θ) = ∏_{i=1}^m q_i(θ_i) Bani K Mallick Bayesian Inverse problems and UQ 28 / 59
  • 37. Posterior approximation: Variational Bayes We approximate the posterior under the assumption of conditional independence: q(u, z, w, s, ∆, τ, λ) = q_u(u)q_z(z)q_w(w)q_s(s)q_∆(∆)q_τ(τ)q_λ(λ). For a likelihood p(Y|Θ) with parameter Θ and posterior p(Θ|Y), minimize KL(Q(Θ), p(Θ|Y)), where Q(Θ) = ∏_{i=1}^s q_i(θ_i) and ∪_{i=1}^s θ_i = Θ Bani K Mallick Bayesian Inverse problems and UQ 29 / 59
  • 38. Mean field Variational Updates Coordinate Ascent Variational Inference (CAVI) (Bishop, 2006) iteratively optimizes each factor q_i(θ_i) of the mean-field variational density, while holding the others fixed. The optimal factor has a closed-form solution: q_i(θ_i) ∝ exp(E_{−i} log p(Y, Θ)). In our variational updates the posterior has closed form Bani K Mallick Bayesian Inverse problems and UQ 30 / 59
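A minimal CAVI loop for a conjugate toy model (Gaussian data with unknown mean and precision, as in Bishop, 2006, Sec. 10.1.3) shows the closed-form coordinate updates; the priors and data below are made up:

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(2.0, 1.0, 500)             # data from N(2, 1)
n, xbar = len(x), x.mean()

# Assumed priors: mu ~ N(mu0, (lam0*tau)^-1), tau ~ Gamma(a0, b0)
mu0, lam0, a0, b0 = 0.0, 1e-2, 1e-2, 1e-2

E_tau = 1.0                                # initialization
for _ in range(50):
    # Update q(mu) = N(mu_N, lam_N^-1), holding q(tau) fixed
    mu_N = (lam0 * mu0 + n * xbar) / (lam0 + n)
    lam_N = (lam0 + n) * E_tau
    # Update q(tau) = Gamma(a_N, b_N), holding q(mu) fixed
    a_N = a0 + (n + 1) / 2
    b_N = b0 + 0.5 * (lam0 * ((mu_N - mu0)**2 + 1 / lam_N)
                      + np.sum((x - mu_N)**2) + n / lam_N)
    E_tau = a_N / b_N
```

Each pass updates q(μ) and q(τ) in closed form, exactly the structure exploited for the inverse-problem posterior on the previous slides.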
  • 39. Numerical example of Heat Equation Ω = [0, 1] × [0, 1]; the boundaries Γ1 and Γ0 are taken to be Γ1 = (0, 1) × {1} and Γ0 = Γ \ Γ1. u(x_1, x_2) = sin(πx_1)e^{πx_2} + x_1 + x_2 with the source term f = 0. y = u + ε. Bani K Mallick Bayesian Inverse problems and UQ 31 / 59
  • 40. Numerical example The number of data points is n = 2n_1 on {0, 1} × (0, 1), with n_1 equally spaced points on each side of the square domain. We use m basis functions on the boundary Γ1. The skewness parameter δ := α/√(1 + α²) = 0.8. We use n_1 = 80 and m = 41 Bani K Mallick Bayesian Inverse problems and UQ 32 / 59
  • 41. Numerical example (a) Posterior mean of λ1 = λ2; (b) inverse solution. The left panel shows a typical fit where the solid line shows the true values, the dotted line gives the estimate, and the dashed line shows the fit if we ignore the skewness part. The right panel shows a typical fit: the red, blue and black lines show the MCMC estimate, the variational Bayes estimate and the true temperature, respectively. Bani K Mallick Bayesian Inverse problems and UQ 33 / 59
  • 42. Figure: (a) posterior density q(∆); (b) posterior density q(τ⁻¹) Bani K Mallick Bayesian Inverse problems and UQ 34 / 59
  • 43. Table 2: A comparison of the computational cost between the variational approximation and MCMC (times in seconds). Variational: 0.82 (m = 40, n_1 = 80), 1.96 (m = 60, n_1 = 120), 4.20 (m = 80, n_1 = 160). MCMC: 243.20, 529.27 and 858 for the same settings. Bani K Mallick Bayesian Inverse problems and UQ 35 / 59
  • 44. Comments Variational Bayes performs similar to MCMC in estimation Variational Bayes is several hundred times faster than MCMC Bani K Mallick Bayesian Inverse problems and UQ 36 / 59
  • 45. Nonlinear Problems Many practical inverse problems are described by nonlinear forward models. First-order Taylor expansion and the idea of recursive linearization. Adding further structure to develop a fast solver Bani K Mallick Bayesian Inverse problems and UQ 37 / 59
  • 46. Porous Media Flow Bani K Mallick Bayesian Inverse problems and UQ 38 / 59
  • 47. Fluid Flow in Porous Media Studying flow of liquids (Ground water, oil) in aquifer (reservoir) Applications: Oil production, Contaminant cleanup Forward Model: Models the flow of liquid, output is the production data, inputs are physical characteristics like permeability, porosity Inverse problem: Inferring the permeability from the flow Bani K Mallick Bayesian Inverse problems and UQ 39 / 59
  • 48. Permeability Field Primary parameter of interest is the permeability field Permeability is a measure of how easily liquid flows through the aquifer at that point This permeability values vary over space Effective recovery procedures rely on good permeability estimates, as one must be able to identify high permeability channels and low permeability barriers Bani K Mallick Bayesian Inverse problems and UQ 40 / 59
  • 49. Fluid Flow Fluid flow in a porous media takes place due to pressure difference This is achieved by injecting water (or other fluid) in the reservoir through the injection wells Fluid is driven toward the producing well due to diminished reservoir pressure Reservoirs having high permeability would produce with relative ease and better recovery is usually expected Bani K Mallick Bayesian Inverse problems and UQ 41 / 59
  • 50. Estimation in porous media flow Single phase flow: −∇ · (κ(x, µ) ∇U(x, µ)) = f in Ω_x. Input parameter: the permeability κ(x, µ) is a spatial field which is unknown. The solution U could be fractional flow (water-cut data) or the pressure field. Y = U_obs = U(κ) + ε_f, ε_f ∼ N(0, σ_f²). Given the observations Y with noise, we want to estimate κ. Bani K Mallick Bayesian Inverse problems and UQ 42 / 59
  • 51. An illustration: porous media flow. Left: a log permeability field. Right: water-cut (proportion of water) vs. the normalized time. Bani K Mallick Bayesian Inverse problems and UQ 43 / 59
  • 52. Observation Location and κ Observation locations (*) and the true field Bani K Mallick Bayesian Inverse problems and UQ 44 / 59
  • 53. Parametrization of κ using KLE The unknown spatial field can be modeled as a realization of a Gaussian random field: log κ(x, ω̃) = ∑_{k=1}^∞ √λ_k θ_k(ω̃) Φ_k(x), θ_k ∼ N(0, 1). The underlying covariance kernel R(x, y) satisfies ∫ R(x, y)φ_k(x) dx = λ_k φ_k(y); e.g. R(x, y) = σ_κ² e^{−‖x−y‖²/(2l²)} (isotropic). Our goal is to find the posterior of κ: Π(κ|Y) Bani K Mallick Bayesian Inverse problems and UQ 45 / 59
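The truncated KLE can be computed numerically by discretizing the covariance kernel and solving the resulting eigenproblem (a sketch with made-up grid size, variance, and correlation length):

```python
import numpy as np

# 1D Gaussian field with kernel R(x,y) = sigma^2 exp(-|x-y|^2 / (2 l^2))
npts, sigma_k, ell = 200, 1.0, 0.2
x = np.linspace(0, 1, npts)
R = sigma_k**2 * np.exp(-(x[:, None] - x[None, :])**2 / (2 * ell**2))

# Discrete eigenproblem approximates  int R(x,y) phi_k(x) dx = lambda_k phi_k(y)
h = x[1] - x[0]
lam, phi = np.linalg.eigh(R * h)
lam, phi = lam[::-1], phi[:, ::-1]         # sort eigenpairs in descending order

N2 = 10                                     # truncation level
rng = np.random.default_rng(5)
theta = rng.standard_normal(N2)
log_kappa = (phi[:, :N2] * np.sqrt(np.maximum(lam[:N2], 0))) @ theta / np.sqrt(h)

energy = lam[:N2].sum() / lam.sum()        # fraction of variance captured
```

A shorter correlation length needs more terms; the `energy` ratio is the usual guide for choosing the truncation level N2.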
  • 54. Reduction under separability Separating the spatial and parametric parts: U = ∑_{i=1}^{N_1} a_i(µ) v_i(x). Here κ is parametrized by the finite-dimensional µ = {θ_1, . . . , θ_{N_2}}; we use N_2 terms in the KLE. We impose a_i(µ) = ∑_{j=1}^{N_2} a_{i,j}(θ_j), so that U = ∑_{i=1}^{N_1} (∑_{j=1}^{N_2} a_{i,j}(θ_j)) v_i(x). Assume a_{i,j}(·) ∼ GP(0, K), where K is a covariance kernel. Bani K Mallick Bayesian Inverse problems and UQ 46 / 59
  • 55. Hierarchical Model Y = U(κ) + ε_f, ε_f ∼ N(0, σ_f²); U = ∑_{i=1}^{N_1} a_i(µ) v_i(x); a_i(µ) = ∑_{j=1}^{N_2} a_{i,j}(θ_j); a_{i,j}(·) ∼ GP(0, K); log κ(x, ω̃) = ∑_{k=1}^∞ √λ_k θ_k(ω̃) Φ_k(x), θ_k ∼ N(0, 1). Due to the separability structure, it is possible to develop a variational Bayes algorithm Bani K Mallick Bayesian Inverse problems and UQ 47 / 59
  • 56. Posterior Property Let H be the probability distribution of the observation locations on the domain Ωx. We show: Theorem. For any neighborhood U of the true density f*_x(y), given as {f : ∫_{Ωx} ∫_{Ωy} |f_x(y) − f*_x(y)| dy dH(x) < ε}, we have Π(U | data) → 1 with probability 1 as N (the number of observations) → ∞. Bani K Mallick Bayesian Inverse problems and UQ 48 / 59
  • 57. Example An isotropic covariance kernel is used for the KLE and to generate the permeability field. N1 = 50, N2 = 30. For the a_{i,j}'s, the Gaussian-process covariance kernel is a Gaussian kernel. Bani K Mallick Bayesian Inverse problems and UQ 49 / 59
  • 58. Example [Figure: true field, variational Bayes solution, and MCMC solution (left to right).] Bani K Mallick Bayesian Inverse problems and UQ 50 / 59
  • 59. Posterior coefficient in variational Bayes [Figure: posterior distribution, posterior mean, and prior distribution.] Posterior and prior density for a coefficient (θ3) of the KLE parametrization. Bani K Mallick Bayesian Inverse problems and UQ 51 / 59
  • 60. Variational Bayes convergence [Figure] Convergence of the KLE coefficients over time: the x-axis shows the number of iterations and the y-axis the values of the coefficients θ1, . . . , θN2. Bani K Mallick Bayesian Inverse problems and UQ 52 / 59
  • 61. Introduction [Workflow diagram: multiple high-resolution geologic models, rock/fluid physics, seismic input, time-lapse seismic, multiphase production data, permanent downhole sensors, fast flow simulation, data assimilation & model calibration, reservoir prediction.] Subsurface characterization identifies subsurface properties, taking into account the various data sources and their precisions, and predicts future reservoir performance.
  • 62. Spatial fields with channelized structure Geologic environments often contain distinct geologic facies, with sharp contrasts across facies boundaries. The orientation of a channel and the channel geometry determine the fluid flow. Bani K Mallick Bayesian Inverse problems and UQ 53 / 59
  • 63. Non-Gaussian Field The permeability field is non-Gaussian. It can be decomposed into subregions, where each region represents a facies. Within each facies, the permeability field is modeled using a log-Gaussian process. Hence we can express it as k(x) = Σ_i I_{D_i}(x) k_i(x), where I_D is the indicator function of the region D (i.e., I_D(x) = 1 if x ∈ D and I_D(x) = 0 otherwise). We estimate the boundaries of the facies (interfaces) adaptively. Bani K Mallick Bayesian Inverse problems and UQ 54 / 59
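A toy sketch of the decomposition k(x) = Σ_i I_{D_i}(x) k_i(x), assuming a single channel facies bounded by piecewise-linear curves; for simplicity each facies here gets an independent log-normal field rather than a spatially correlated log-Gaussian process, and all shapes and values are invented for illustration:

```python
import numpy as np

def channel_field(nx=50, ny=50, seed=0):
    """Toy non-Gaussian permeability field on a unit square: a horizontal
    channel with piecewise-linear lower/upper boundaries and a separate
    log-normal permeability field inside and outside the channel."""
    rng = np.random.default_rng(seed)
    X, Ygrid = np.meshgrid(np.linspace(0, 1, nx), np.linspace(0, 1, ny))
    # piecewise-linear channel boundaries with one knee at x = 0.5
    y_low = np.where(X < 0.5, 0.3 + 0.2 * X, 0.4)
    y_high = y_low + 0.25
    in_channel = (Ygrid >= y_low) & (Ygrid <= y_high)   # indicator I_D(x)
    # independent log-normal fields per facies, different mean log-permeability
    k_channel = np.exp(2.0 + 0.3 * rng.normal(size=X.shape))
    k_background = np.exp(0.0 + 0.3 * rng.normal(size=X.shape))
    # k(x) = sum_i I_{D_i}(x) k_i(x)
    return np.where(in_channel, k_channel, k_background), in_channel

k, mask = channel_field()
```

In the adaptive version described on the next slide, the knee locations and the number of linear pieces would themselves be unknowns updated from the data.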
  • 64. Non-Gaussian Field The boundaries are represented by piecewise-linear functions. The locations and the number of the pieces are unknown; in this way the model is capable of reproducing a wide variety of channel geometries. The shape of the channel boundaries is updated using the data. Because the number of pieces varies, reversible-jump MCMC is required to do so. Bani K Mallick Bayesian Inverse problems and UQ 55 / 59
  • 66. Permeability Comparison TRUE MODEL INITIAL MODEL 4 UPDATED MODELS After history matching, we could change channel connectivity
  • 67. Permeability Comparison 4 UPDATED MODELS TRUE MODEL INITIAL MODEL
  • 68. Conclusion There are alternative fast posterior approximation or computation methods (ABC, multistage preconditioned MCMC, Hamiltonian Monte Carlo). Fast emulators can replace expensive simulators. Integration of data with science. Bani K Mallick Bayesian Inverse problems and UQ 56 / 59
  • 69. Conclusion Production data from unconventional (shale) reservoirs. Decline-curve analysis: oil production over time (Weibull model, Arps hyperbolic model, Duong model). How to integrate these models with PDEs (physical laws)? Integration of machine-learning methods with PDEs. Bani K Mallick Bayesian Inverse problems and UQ 57 / 59
  • 70. References Yang, K., Guha, N., Efendiev, Y. and Mallick, B. (2017), "Bayesian and variational Bayes approaches for flows in heterogeneous random media", Journal of Computational Physics, 345, 275-293. Guha, N., Wu, X., Efendiev, Y., Jin, B. and Mallick, B. (2015), "A variational approach for inverse problems with skew-t error distributions", Journal of Computational Physics, 301, 377-393. Efendiev, Y., Leung, W. T., Cheung, S. W., Guha, N., Hoang, V. H. and Mallick, B. (2017), "Bayesian multiscale finite element methods: modeling missing subgrid information probabilistically", International Journal for Multiscale Computational Engineering, 2(1), 175-197. Bani K Mallick Bayesian Inverse problems and UQ 58 / 59
  • 71. References Mondal, A., Mallick, B., Efendiev, Y. and Datta-Gupta, A. (2014), "Bayesian uncertainty quantification for subsurface inversion using multiscale hierarchical model", Technometrics, 56(3), 381-392. Chakraborty, A., Bingham, D., Dhavala, S., Kuranz, C., Drake, P., Grosskopf, M., Rutter, E., Holloway, J., McClarren, R. and Mallick, B. (2017), "Emulation of numerical models with over-specified basis functions", Technometrics, 59(2), 153-164. Chakraborty, A., Mallick, B., McClarren, R., Kuranz, C. and Drake, P. (2013), "Spline-based emulators for radiative shock experiments with measurement error", Journal of the American Statistical Association, 108, 411-428. Bani K Mallick Bayesian Inverse problems and UQ 59 / 59