2018 MUMS Fall Course - Mathematical surrogate and reduced-order models - Ralph Smith, November 6, 2018
1. Uncertainty Propagation
Setting:
• We assume that we have determined distributions for parameters
• e.g., Bayesian inference, prior experiments, expert opinion
Goal: Construct statistics for quantities of interest
• e.g., Expected viral load in HIV patient with
appropriate uncertainty intervals
• Note: Often involves moderate to high-
dimensional integration
\dot{T}_1 = \lambda_1 - d_1 T_1 - (1 - \varepsilon) k_1 V T_1

\dot{T}_2 = \lambda_2 - d_2 T_2 - (1 - f\varepsilon) k_2 V T_2

\dot{T}_1^* = (1 - \varepsilon) k_1 V T_1 - \delta T_1^* - m_1 E T_1^*

\dot{T}_2^* = (1 - f\varepsilon) k_2 V T_2 - \delta T_2^* - m_2 E T_2^*

\dot{V} = N_T \delta (T_1^* + T_2^*) - c V - [(1 - \varepsilon)\rho_1 k_1 T_1 + (1 - f\varepsilon)\rho_2 k_2 T_2] V

\dot{E} = \lambda_E + \frac{b_E (T_1^* + T_2^*)}{T_1^* + T_2^* + K_b} E - \frac{d_E (T_1^* + T_2^*)}{T_1^* + T_2^* + K_d} E - \delta_E E

E[V(t)] = \int_{\mathbb{R}^6} V(t, q)\,\rho(q)\,dq
Questions:
• How do we effectively propagate input uncertainties?
• Efficient quadrature techniques
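As a sketch of this workflow, the snippet below approximates an expectation of the form E[V(t)] = \int V(t, q) \rho(q) dq by Monte Carlo sampling. The closed-form stand-in for V(t, q), the normal parameter densities, and all numerical values are illustrative assumptions, not the HIV model itself; in practice each sample q^m would require an ODE solve.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the viral load V(t, q): a closed-form decay
# model q1 * exp(-q2 * t).  The actual V(t, q) would come from solving
# the ODE system above; the propagation machinery is identical.
def V(t, q):
    return q[..., 0] * np.exp(-q[..., 1] * t)

# Assumed independent normal parameter distributions (illustrative values)
mu = np.array([1000.0, 0.5])       # means of q1, q2
sigma = np.array([100.0, 0.05])    # standard deviations of q1, q2

t = 2.0
M = 200_000
q = rng.normal(mu, sigma, size=(M, 2))

# Monte Carlo approximation of E[V(t)]
EV_mc = V(t, q).mean()

# For this stand-in, the expectation is available in closed form:
# E[q1 exp(-q2 t)] = mu1 * exp(-mu2 t + sigma2^2 t^2 / 2)
EV_exact = mu[0] * np.exp(-mu[1] * t + 0.5 * sigma[1]**2 * t**2)
print(EV_mc, EV_exact)
```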
2. Uncertainty Propagation Techniques
Techniques and Issues:
• Analytic expressions for linearly parameterized problems
• Perturbation methods for nonlinearly parameterized problems
• Direct sampling methods
o Often require surrogate models
• Computation of moments using stochastic spectral methods
o Stochastic Galerkin – aka polynomial chaos
o Discrete projection
o Stochastic collocation
3. Forward Uncertainty Propagation: Linear Models
Linear Models: Analytic mean and variance relations
Example: Linear stress-strain relation
\Upsilon_i = E e_i + E_2 e_i^3 + \varepsilon_i, \quad i = 1, \ldots, n

Let \bar{E}, \bar{E}_2 and var(E), var(E_2) denote the parameter means and variances. Then

Model Statistics:

E[E e_i + E_2 e_i^3] = \bar{E} e_i + \bar{E}_2 e_i^3

var[E e_i + E_2 e_i^3] = e_i^2\, var(E) + e_i^6\, var(E_2) + 2 e_i^4\, cov(E, E_2)

Response Statistics: Assume measurement errors are uncorrelated with the model response.

E[\Upsilon_i] = \bar{E} e_i + \bar{E}_2 e_i^3

var[\Upsilon_i] = e_i^2\, var(E) + e_i^6\, var(E_2) + 2 e_i^4\, cov(E, E_2) + var(\varepsilon_i)
Problem: Models are almost always nonlinearly parameterized
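The linear-model mean and variance relations above are easy to check by sampling. In this sketch, the joint parameter statistics for (E, E_2), the strain value e_i, and the noise variance are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative (assumed) parameter statistics for E and E2
mean = np.array([10.0, 2.0])                  # Ebar, E2bar
cov = np.array([[0.25, 0.02],
                [0.02, 0.01]])                # var(E), cov(E, E2), var(E2)
var_eps = 0.09                                # var(eps_i)

e = 1.5                                       # a single strain value e_i
M = 500_000
E, E2 = rng.multivariate_normal(mean, cov, size=M).T
eps = rng.normal(0.0, np.sqrt(var_eps), size=M)

Y = E * e + E2 * e**3 + eps                   # sampled responses Upsilon_i

# Analytic relations from the slide
mean_pred = mean[0] * e + mean[1] * e**3
var_pred = (e**2 * cov[0, 0] + e**6 * cov[1, 1]
            + 2 * e**4 * cov[0, 1] + var_eps)

print(Y.mean(), mean_pred)                    # sample vs analytic mean
print(Y.var(ddof=1), var_pred)                # sample vs analytic variance
```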
4. Forward Uncertainty Propagation: Perturbation Methods
Strategy: Consider

Q = \bar{q} + \Delta Q = [\bar{q}_1 + \Delta Q_1, \ldots, \bar{q}_p + \Delta Q_p]

Then

f(Q) = f(\bar{q}) + \sum_{i=1}^p \frac{\partial f}{\partial Q_i}\Big|_{\bar{q}} \Delta Q_i + \text{H.O.T.} \approx \bar{y} + \sum_{i=1}^p s_i \Delta Q_i

Take Q = [Q_1, \ldots, Q_p] with joint pdf \rho_Q(q), and

E(Q_i) = \bar{q}_i

var(Q_i) = \int_{\mathbb{R}^p} (q_i - \bar{q}_i)^2 \rho_Q(q)\,dq = \int_{\mathbb{R}^p} (\Delta q_i)^2 \rho_Q(q)\,dq

cov(Q_i, Q_j) = \int_{\mathbb{R}^p} (q_i - \bar{q}_i)(q_j - \bar{q}_j) \rho_Q(q)\,dq = \int_{\mathbb{R}^p} \Delta q_i\, \Delta q_j\, \rho_Q(q)\,dq
5. Forward Uncertainty Propagation: Perturbation Methods
It follows that

E[f(Q)] = \bar{y} \int_{\mathbb{R}^p} \rho_Q(q)\,dq + \sum_{i=1}^p s_i \int_{\mathbb{R}^p} (q_i - \bar{q}_i)\,\rho_Q(q)\,dq = \bar{y}

and

var[f(Q)] = E[(f(Q) - \bar{y})^2]
          = \int_{\mathbb{R}^p} \Big(\sum_{i=1}^p s_i\, \Delta q_i\Big)^2 \rho_Q(q)\,dq
          = \sum_{i=1}^p s_i^2 \int_{\mathbb{R}^p} (\Delta q_i)^2 \rho_Q(q)\,dq + \sum_{i=1}^p \sum_{j=1, j \neq i}^p s_i s_j \int_{\mathbb{R}^p} \Delta q_i\, \Delta q_j\, \rho_Q(q)\,dq
          = \sum_{i=1}^p s_i^2\, var(Q_i) + \sum_{i=1}^p \sum_{j=1, j \neq i}^p s_i s_j\, cov(Q_i, Q_j)
          = S^T V S
Notes:
• S and V are the local sensitivity vector and covariance matrix.
• This is often termed the “sandwich relation”.
• Suppose Q_i \sim N(\bar{q}_i, \sigma_i^2) are mutually independent. Then f(Q) \sim N(\bar{y}, S^T V S).
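A minimal numerical sketch of the sandwich relation, using an assumed nonlinear test function and parameter covariance, with central-difference approximations of the sensitivities s_i:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative nonlinear response standing in for the model f(Q);
# the function and all numbers below are assumptions.
def f(q):
    return q[..., 0] * np.sin(q[..., 1]) + q[..., 1]**2

qbar = np.array([2.0, 1.0])
V = np.array([[1e-4, 0.0],
              [0.0, 4e-4]])     # small parameter covariance matrix

# Local sensitivities s_i = df/dq_i at qbar via central differences
h = 1e-6
S = np.array([(f(qbar + h * np.eye(2)[i]) - f(qbar - h * np.eye(2)[i]))
              / (2 * h) for i in range(2)])

# Sandwich relation: var[f(Q)] ~ S^T V S
var_pert = S @ V @ S

# Monte Carlo check of the linearized variance
q = rng.multivariate_normal(qbar, V, size=100_000)
var_mc = f(q).var(ddof=1)
print(var_pert, var_mc)
```

Because the perturbations are small here, the linearization is accurate; for larger parameter variances the higher-order terms dropped in the expansion become visible as a discrepancy.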
6. Forward Uncertainty Propagation: Sampling Methods
Strategy: Randomly sample from the parameter and measurement error distributions and propagate through the model to quantify response uncertainty.

Advantages:
• Applicable to nonlinear models.
• Parameters can be correlated and non-Gaussian.
• Straightforward to apply, and the convergence rate is independent of the number of parameters.
• Can directly incorporate both parameter and measurement uncertainties.
• No additional cost for DRAM if interpolating.

Disadvantages:
• Very slow convergence rate: O(1/\sqrt{M}), where M is the number of samples.
• 100-fold more evaluations required to gain an additional place of accuracy.
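The O(1/\sqrt{M}) rate can be seen numerically. This toy sketch (assumed distribution and quantity of interest) estimates E[Q^2] = 1 for Q \sim N(0, 1) at two sample sizes and compares root-mean-square errors over repeated experiments:

```python
import numpy as np

rng = np.random.default_rng(3)

# RMSE of the Monte Carlo estimate of E[Q^2] = 1, Q ~ N(0,1),
# averaged over repeated independent experiments.
def rmse(M, repeats=200):
    ests = rng.standard_normal((repeats, M)) ** 2
    return np.sqrt(((ests.mean(axis=1) - 1.0) ** 2).mean())

r100, r10000 = rmse(100), rmse(10_000)
print(r100, r10000, r100 / r10000)   # ratio near 10: a 100x sample increase
                                     # buys one extra digit of accuracy
```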
7. Perturbation and Sampling for Forward Propagation
Example: Consider

m \frac{d^2 z}{dt^2} + c \frac{dz}{dt} + k z = f_0 \cos(\omega_F t)

z(0) = z_0, \quad \frac{dz}{dt}(0) = z_1

with Q = [m, c, k]. This has the amplitude

z_0(Q) = \frac{f_0}{\sqrt{m^2 (\omega_0^2 - \omega_F^2)^2 + c^2 \omega_F^2}}, \quad \omega_0 = \sqrt{k/m}

Consider the response

y = f(\omega_F, Q) = \frac{z_0(Q)}{f_0} = \frac{1}{\sqrt{(k - m \omega_F^2)^2 + (c \omega_F)^2}}

Take q = [m, c, k] \sim N(\bar{q}, V), where

\bar{q} = [2.7, 0.24, 8.5], \quad V = \begin{bmatrix} 0.002^2 & 0 & 0 \\ 0 & 0.065^2 & 0 \\ 0 & 0 & 0.001^2 \end{bmatrix}
8. Perturbation and Sampling for Forward Propagation
Notes:
• Natural frequency: \bar{\omega}_0 = 1.7743 Hz
• Analytic sensitivity relations: s^T = \Big[\frac{\partial f}{\partial m}, \frac{\partial f}{\partial c}, \frac{\partial f}{\partial k}\Big]

Compare:
• Perturbation result
• Sample M = 10,000 and construct the sample mean and variance

\bar{y}_s(\omega_F) = \frac{1}{M} \sum_{m=1}^{M} f(\omega_F, q^m)

\sigma_s^2(\omega_F) = \frac{1}{M - 1} \sum_{m=1}^{M} \big[f(\omega_F, q^m) - \bar{y}_s(\omega_F)\big]^2
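This comparison can be reproduced in a few lines. The sketch below uses central-difference sensitivities in place of the analytic relations and evaluates both approaches at a single assumed drive frequency \omega_F = 1.4:

```python
import numpy as np

rng = np.random.default_rng(4)

# Normalized amplitude response from the slides
def f(wF, q):
    m, c, k = q[..., 0], q[..., 1], q[..., 2]
    return 1.0 / np.sqrt((k - m * wF**2)**2 + (c * wF)**2)

qbar = np.array([2.7, 0.24, 8.5])
V = np.diag([0.002**2, 0.065**2, 0.001**2])
wF = 1.4                                   # one frequency, off resonance

# Perturbation (sandwich) result with finite-difference sensitivities
h = 1e-6
S = np.array([(f(wF, qbar + h * np.eye(3)[i])
               - f(wF, qbar - h * np.eye(3)[i])) / (2 * h)
              for i in range(3)])
mean_pert, var_pert = f(wF, qbar), S @ V @ S

# Sampling result with M = 10,000
q = rng.multivariate_normal(qbar, V, size=10_000)
y = f(wF, q)
mean_samp, var_samp = y.mean(), y.var(ddof=1)
print(mean_pert, mean_samp)
print(var_pert, var_samp)
```

Sweeping wF over [1.4, 2.6] reproduces the curves on the next slide; near resonance the response becomes more nonlinear in q and the two estimates separate.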
9. Perturbation and Sampling for Forward Propagation
Results:

[Figure: sampling and perturbation means (left) and standard deviations \sigma_s, \sigma_p (right) plotted against \omega_F \in [1.4, 2.6].]
11. Review of Confidence and Credible Intervals for Parameters
Data: \upsilon = [\upsilon_1, \cdots, \upsilon_n] of iid random observations

Confidence Interval (Frequentist): A 100 \times (1 - \alpha)\% confidence interval for a fixed, unknown parameter q_0 is a random interval [L_c(\upsilon), U_c(\upsilon)] having probability at least 1 - \alpha of covering q_0 under the joint distribution of \upsilon.

Credible Interval (Bayesian): A 100 \times (1 - \alpha)\% credible interval is an interval having probability at least 1 - \alpha of containing q.
• Strategy: Sample from the parameter density \rho_Q(q)

[Figure: 90% confidence intervals (left) and a 90% credible interval (right).]
12. Confidence and Prediction Intervals for Responses
Linear Model:

\upsilon = X q_0 + \varepsilon, \quad \varepsilon_i \sim N(0, \sigma_0^2)

Consider x_0 in the domain but not among the data used to estimate q and \sigma^2.

Two Cases:
(i) New prediction \Upsilon_{x_0}
(ii) Mean response \mu_{x_0} = E(\Upsilon_{x_0})

Case (ii) First: Consider the unbiased estimator

\hat{\Upsilon}_{x_0} = x_0^T \hat{q}

Example: Linear regression

\upsilon_i = q_1 + \sum_{j=2}^p x_{ij} q_j + \varepsilon_i, \quad i = 1, \cdots, n

x_0 = [1, x_{02}, \cdots, x_{0p}]^T
13. Confidence and Prediction Intervals for Responses
From the properties

var\Big(\sum_{i=1}^n a_i X_i\Big) = \sum_{i=1}^n a_i^2\, var(X_i) \quad (for uncorrelated X_i)

var(\hat{q}) = \sigma_0^2 (X^T X)^{-1}

it follows that

var(\hat{\Upsilon}_{x_0}) = \sigma_0^2 \big[x_0^T (X^T X)^{-1} x_0\big]

Estimator:

\hat{\sigma}^2(\hat{\Upsilon}_{x_0}) = \hat{\sigma}^2 \big[x_0^T (X^T X)^{-1} x_0\big]

Note: \varepsilon_i \sim N(0, \sigma_0^2) implies that

\frac{\hat{\Upsilon}_{x_0} - \mu_{x_0}}{\sigma_0 \sqrt{x_0^T (X^T X)^{-1} x_0}} \sim N(0, 1)
14. Confidence and Prediction Intervals for Responses
Using Properties 7.10 and 7.11, it follows that

T = \frac{\hat{\Upsilon}_{x_0} - \mu_{x_0}}{\hat{\sigma} \sqrt{x_0^T (X^T X)^{-1} x_0}} \sim t(n - p)

Confidence Interval:

\hat{\Upsilon}_{x_0} \pm t_{n-p, 1-\alpha/2} \cdot \hat{\sigma} \sqrt{x_0^T (X^T X)^{-1} x_0}

Prediction Interval: Assume that the estimators \hat{q} and \hat{\sigma}^2 have been computed using previous data, so they are independent of \Upsilon_{x_0}. Then \Upsilon_{x_0} - \hat{\Upsilon}_{x_0} is normally distributed with

E(\Upsilon_{x_0} - \hat{\Upsilon}_{x_0}) = x_0^T q_0 - x_0^T E(\hat{q}) = 0

var(\Upsilon_{x_0} - \hat{\Upsilon}_{x_0}) = var(\Upsilon_{x_0}) + var(\hat{\Upsilon}_{x_0})
                                          = \sigma_0^2 + \sigma_0^2\, x_0^T (X^T X)^{-1} x_0
                                          = \sigma_0^2 \big[1 + x_0^T (X^T X)^{-1} x_0\big]
15. Prediction Intervals for Responses
Prediction Interval:

\hat{\Upsilon}_{x_0} \pm t_{n-p, 1-\alpha/2} \cdot \hat{\sigma} \sqrt{1 + x_0^T (X^T X)^{-1} x_0}

Data: \upsilon = [\upsilon_1, \cdots, \upsilon_n] of iid random observations

Prediction Interval (Definition): A 100 \times (1 - \alpha)\% prediction interval for a future observable \upsilon_{n+1} is a random interval [L_c(\upsilon), U_c(\upsilon)] having probability at least 1 - \alpha of containing \upsilon_{n+1} under the joint distribution of (\upsilon, \upsilon_{n+1}).
16. Uncertainty Propagation in Models
Example: Linear model

\upsilon_i = q_1 + q_2 x_i^2 + \varepsilon_i

with q_0 = [0.6, 1.2] and \sigma_0 = 3.
[Figure: frequentist confidence and prediction intervals (left) and Bayesian credible and prediction intervals (right), for x \in [0, 5].]

Note: Confidence and credible intervals grow for extrapolatory predictions.
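The frequentist intervals for this example can be sketched directly from the formulas above. The sample size, the x grid, and the prediction point x_0 = 4 below are assumptions made for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Synthetic data from the slide's model: v_i = q1 + q2 * x_i^2 + eps_i
q0, sigma0 = np.array([0.6, 1.2]), 3.0
x = np.linspace(0, 5, 20)
X = np.column_stack([np.ones_like(x), x**2])        # design matrix
v = X @ q0 + rng.normal(0, sigma0, size=x.size)

# OLS estimates of q and sigma^2
n, p = X.shape
qhat, *_ = np.linalg.lstsq(X, v, rcond=None)
sigma2hat = ((v - X @ qhat) ** 2).sum() / (n - p)

# 95% confidence and prediction half-widths at a new point x0 = 4
x0 = np.array([1.0, 4.0**2])
tcrit = stats.t.ppf(0.975, n - p)
g = x0 @ np.linalg.inv(X.T @ X) @ x0                # x0^T (X^T X)^{-1} x0
ci_half = tcrit * np.sqrt(sigma2hat * g)
pi_half = tcrit * np.sqrt(sigma2hat * (1 + g))
print(x0 @ qhat, ci_half, pi_half)
```

The prediction interval is always the wider of the two, since it carries the extra \sigma_0^2 term for the new observation's noise.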
18. Uncertainty Propagation in Models
Example: HIV model

[Figure: MCMC parameter chains over sample indices 2500-15000 (top) and uncertainty intervals for the response E versus time (bottom, with a zoomed view).]
19. Use of Prediction Intervals: Nuclear Power Plant Design
Subchannel Code (COBRA-TF): numerous closure relations, ~70 parameters

e.g., Dittus-Boelter relation

Nu = 0.023\, Re^{0.8}\, Pr^{0.4}

• Nu: Nusselt number
• Re: Reynolds number
• Pr: Prandtl number

Industry Standard: Employ conservative, uniform bounds; i.e., [0, 0.046], [0, 1.6], [0, 0.8]

Bayesian Analysis: Employ the conservative bounds as priors

Note: Substantial reduction in parameter uncertainty: \sigma^2 \approx 0.0035, \sigma^2 \approx 0.06, \sigma^2 \approx 0.03
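As a worked example of the correlation, the snippet below evaluates Nu for one assumed operating point; the Reynolds and Prandtl numbers are illustrative, not values from the COBRA-TF study:

```python
# Dittus-Boelter correlation for the Nusselt number (heating form)
def nusselt(Re, Pr):
    return 0.023 * Re**0.8 * Pr**0.4

# Illustrative turbulent-flow values (assumed Re and Pr)
Nu = nusselt(1.0e5, 3.0)
print(Nu)   # roughly 357
```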
20. Use of Prediction Intervals: Nuclear Power Plant Design
Strategy: Propagate parameter uncertainties through COBRA-TF to
determine uncertainty in maximum fuel temperature
Notes:
• Temperature uncertainty reduced
from 40 degrees to 5 degrees
• Can run plant 20 degrees hotter,
which significantly improves efficiency
Ramification: Savings of 10 billion dollars per year for US power plants
Issues:
• We considered only one of many closure relations
• The Nuclear Regulatory Commission takes years to change requirements and codes
Good News: We are now working with Westinghouse to reduce uncertainties.