Jittered Sampling: Bounds and Problems
Stefan Steinerberger
joint with Florian Pausinger (Belfast) and Manas Rachh (Yale)
QMC: the standard Dogma
Star discrepancy.
$$D_N^*(X) = \sup_{R \subset [0,1]^d} \left| \frac{\#\{i : x_i \in R\}}{N} - |R| \right|$$
This is a good quantity to minimize because
Theorem (Koksma-Hlawka)
$$\left| \int_{[0,1]^d} f(x)\,dx - \frac{1}{N}\sum_{n=1}^{N} f(x_n) \right| \;\lesssim\; D_N^* \cdot \operatorname{var}(f).$$
In particular: error only depends on the oscillation of f .
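For intuition, the star discrepancy of a small 2D point set can be computed by brute force. A minimal sketch (my illustration, not part of the talk): the supremum over anchored boxes $[0,a] \times [0,b]$ is attained, up to open/closed boundary effects, with $a$ and $b$ on the grid spanned by the point coordinates together with 1.

```python
import numpy as np

def star_discrepancy_2d(points):
    """Approximate star discrepancy D*_N of points in [0,1]^2.

    Scans anchored boxes [0,a]x[0,b] whose corners lie on the grid
    spanned by the point coordinates (plus 1.0); this captures the
    supremum up to open/closed boundary effects.
    """
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    xs = np.append(np.unique(pts[:, 0]), 1.0)
    ys = np.append(np.unique(pts[:, 1]), 1.0)
    worst = 0.0
    for a in xs:
        for b in ys:
            inside = np.count_nonzero((pts[:, 0] <= a) & (pts[:, 1] <= b))
            worst = max(worst, abs(inside / n - a * b))
    return worst

rng = np.random.default_rng(0)
print(star_discrepancy_2d(rng.random((64, 2))))  # random points in [0,1]^2
```

For 64 i.i.d. uniform points this typically lands around $c/\sqrt{N}$, in line with the Monte Carlo rate discussed below.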
QMC: the standard Dogma
Star discrepancy.
$$D_N^*(X) = \sup_{R \subset [0,1]^d} \left| \frac{\#\{i : x_i \in R\}}{N} - |R| \right|$$
Two competing conjectures (emotionally charged subject)
$$D_N^* \gtrsim \frac{(\log N)^{d-1}}{N} \qquad \text{or} \qquad D_N^* \gtrsim \frac{(\log N)^{d/2}}{N}.$$
There are many clever constructions of point sets that achieve
$$D_N^* \lesssim \frac{(\log N)^{d-1}}{N}.$$
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop, Jittered Sampling: Bounds & Problems - Stefan Steinerberger, Dec 14, 2017
QMC: the standard Dogma
$$D_N^* \gtrsim \frac{(\log N)^{d-1}}{N} \qquad \text{or} \qquad D_N^* \gtrsim \frac{(\log N)^{d/2}}{N}.$$
How would one actually try to prove this? Open for 80+ years,
that sounds bad.
Small ball conjecture seems spiritually related.
Interlude: the small ball conjecture
[Figure: a Haar function $h_R$ on a rectangle $R$: $+1$ on two diagonally opposite quadrants, $-1$ on the other two.]
Haar functions $h_R$ on rectangles $R$.
Interlude: the small ball conjecture
All dyadic rectangles of area $2^{-2}$.
Interlude: the small ball conjecture
Small ball conjecture, Talagrand (1994)
For all choices of signs $\varepsilon_R \in \{-1, +1\}$,
$$\Big\| \sum_{|R| = 2^{-n}} \varepsilon_R h_R \Big\|_{L^\infty} \gtrsim n^{d/2}.$$
1. Talagrand cared about the behavior of the Brownian sheet.
2. The lower bound $\gtrsim n^{(d-1)/2}$ is easy.
3. The case d = 2 is the only one that has been settled: three
proofs due to M. Talagrand, V. Temlyakov (via Riesz
products) and a beautiful one by Bilyk & Feldheim.
4. Only partial results in $d \ge 3$ (Bilyk, Lacey, etc.)
Interlude: the small ball conjecture
Small ball conjecture, Talagrand (1994)
For all choices of signs $\varepsilon_R \in \{-1, +1\}$,
$$\Big\| \sum_{|R| = 2^{-n}} \varepsilon_R h_R \Big\|_{L^\infty} \gtrsim n^{d/2}.$$
A recent surprise
Theorem (Noah Kravitz, arXiv:1712.01206)
For any choice of signs $\varepsilon_R$ and any integer $0 \le k \le n+1$,
$$\left| \left\{ x \in [0,1)^2 : \sum_{|R| = 2^{-n}} \varepsilon_R h_R = n + 1 - 2k \right\} \right| = \frac{1}{2^{n+1}} \binom{n+1}{k}.$$
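Kravitz's identity is concrete enough to check by brute force. The sketch below (my illustration, with an arbitrary checkerboard sign convention for $h_R$ and the choice $n = 2$) evaluates the Haar sum on the $2^{-(n+1)}$-grid on which it is piecewise constant and tallies the level sets; the theorem says the measures follow a binomial pattern no matter which signs are chosen.

```python
import numpy as np
from math import comb

def haar_sum(n, signs, res):
    """Sum of signs[r] * h_R over all 2D dyadic rectangles with
    |R| = 2^-n, evaluated on a res-by-res grid of cell midpoints.
    Convention: h_R is +1/-1 on the four quadrants of R, checkerboard style."""
    xs = (np.arange(res) + 0.5) / res
    X, Y = np.meshgrid(xs, xs, indexing="ij")
    total = np.zeros((res, res))
    idx = 0
    for j in range(n + 1):                      # rectangle shape 2^-j x 2^-(n-j)
        ix = np.floor(X * 2**j).astype(int)     # which rectangle contains the point
        iy = np.floor(Y * 2**(n - j)).astype(int)
        sx = 1 - 2 * (np.floor(X * 2**(j + 1)).astype(int) % 2)   # quadrant sign in x
        sy = 1 - 2 * (np.floor(Y * 2**(n - j + 1)).astype(int) % 2)  # quadrant sign in y
        for a in range(2**j):
            for b in range(2**(n - j)):
                total += np.where((ix == a) & (iy == b), signs[idx] * sx * sy, 0)
                idx += 1
    return total

n = 2
rng = np.random.default_rng(0)
signs = rng.choice([-1, 1], size=(n + 1) * 2**n)   # one sign per rectangle
vals = haar_sum(n, signs, res=2**(n + 1))
for k in range(n + 2):
    print(k, np.mean(vals == n + 1 - 2 * k), comb(n + 1, k) / 2**(n + 1))
```

Each point lies in exactly $n+1$ of these rectangles (one per shape), so the sum takes values $n+1-2k$; the printed empirical measures match the binomial weights exactly.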
Problem with the Standard Dogma
Star discrepancy.
$$D_N^*(X) = \sup_{R \subset [0,1]^d} \left| \frac{\#\{i : x_i \in R\}}{N} - |R| \right|$$
The constructions achieving
$$D_N^* \lesssim \frac{(\log N)^{d-1}}{N}$$
start being effective around $N \approx d^d$ (actually a bit larger even).
More or less totally useless in high dimensions.
Monte Carlo strikes back
Star discrepancy.
$$D_N^*(X) = \sup_{R \subset [0,1]^d} \left| \frac{\#\{i : x_i \in R\}}{N} - |R| \right|$$
We want error bounds in both $N$ and $d$!
(Heinrich, Novak, Wasilkowski, Wozniakowski, 2002)
There are points with
$$D_N^*(X) \lesssim \sqrt{\frac{d}{N}}.$$
This is still the best result. (Aistleitner 2011: constant c = 10).
How do you get these points? Monte Carlo
Jittered Sampling
If we already agree to distribute points randomly, we might just as
well distribute them randomly in a clever way.
[Figure: 25 jittered points in $[0,1]^2$, one uniform point in each cell of a $5 \times 5$ grid.]
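In code, jittered sampling over the standard cubical partition is a few lines. A minimal sketch (my illustration, not the talk's notation): split $[0,1]^d$ into $m^d$ congruent subcubes and place one uniform point in each.

```python
import numpy as np

def jittered_sample(m, d, rng=None):
    """N = m^d jittered points in [0,1]^d: one uniform point in each
    axis-parallel subcube of side length 1/m."""
    rng = rng if rng is not None else np.random.default_rng()
    grids = np.meshgrid(*[np.arange(m)] * d, indexing="ij")
    corners = np.stack([g.ravel() for g in grids], axis=1) / m  # lower corners of cells
    return corners + rng.random(corners.shape) / m              # uniform jitter per cell

pts = jittered_sample(5, 2, np.random.default_rng(0))  # 25 points, one per cell
```

Every cell of the $5 \times 5$ grid receives exactly one point; this stratification is what drives the variance reduction discussed later in the talk.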
Bellhouse, 1981
Cook, Porter & Carpenter, 1984
A Recent Application in Compressed Sensing (Nov 2015)
Theorem (Beck, 1987)
$$\mathbb{E}\, D_N^*(\text{jittered sampling}) \;\le\; C_d \, \frac{(\log N)^{1/2}}{N^{\frac{1}{2} + \frac{1}{2d}}}$$
Theorem (Beck, 1987)
$$\mathbb{E}\, D_N^*(\text{jittered sampling}) \;\le\; C_d \, \frac{(\log N)^{1/2}}{N^{\frac{1}{2} + \frac{1}{2d}}}$$
- a very general result for many different discrepancies
- $L^2$-based discrepancies (Chen & Travaglini, 2009)
- Problem: same old constant $C_d$ (might be huge; the way the proof proceeds it will be MASSIVE)
Theorem (Pausinger and S., 2015)
For $N$ sufficiently large (depending on $d$),
$$\frac{1}{10} \cdot \frac{d}{N^{\frac{1}{2} + \frac{1}{2d}}} \;\le\; \mathbb{E}\, D_N^*(P) \;\le\; \frac{\sqrt{d}\,(\log N)^{1/2}}{N^{\frac{1}{2} + \frac{1}{2d}}}.$$
- 'sufficiently large' is bad (talk about this later)
- lower bound can probably be improved
- upper bound not by much
How the proof works
[Figure: the jittered point set, one point per cell of the grid.]
How the proof works
Maximize the discrepancy over a $\sqrt{N}$-dimensional set in $[0, N^{-1/2}]$:
$$D_N \sim \frac{\sqrt{\sqrt{N}}}{\sqrt{N}} \cdot \frac{1}{\sqrt{N}} = \frac{1}{N^{3/4}}.$$
- lose a logarithm
- union bound on the other cubes
[Figure: boundary cells labeled 'Large' and 'small'.]
The large contribution comes from codimension-1 sets.
In $d$ dimensions, we therefore expect the main contribution to the discrepancy to behave like
$$D_N \sim \frac{\sqrt{N^{\frac{d-1}{d}}}}{N^{\frac{d-1}{d}}} \cdot \frac{1}{N^{\frac{1}{d}}} = \frac{1}{N^{\frac{d-1}{2d}}} \cdot \frac{1}{N^{\frac{1}{d}}} = \frac{1}{N^{\frac{d+1}{2d}}}.$$
Of course, there is also a log. Adding up this quantity $d$ times (because there are $d$ fat slices of codimension 1) gives us an upper bound of
$$D_N \lesssim \frac{d\,\sqrt{\log N}}{N^{\frac{d+1}{2d}}}.$$
Want to improve this a bit: standard Bernstein inequalities aren’t
enough.
Sharp Dvoretzky–Kiefer–Wolfowitz inequality (Massart, 1990)
If $z_1, z_2, \ldots, z_k$ are independently and uniformly distributed random variables in $[0,1]$, then
$$\mathbb{P}\left( \sup_{0 \le z \le 1} \left| \frac{\#\{1 \le \ell \le k : 0 \le z_\ell \le z\}}{k} - z \right| > \varepsilon \right) \le 2 e^{-2k\varepsilon^2}.$$
limit → Brownian Bridge → Kolmogorov–Smirnov distribution
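Massart's bound is easy to probe by simulation. The sketch below (my illustration, with arbitrarily chosen parameters $k$, $\varepsilon$) estimates the left-hand side for uniform samples and compares it with $2e^{-2k\varepsilon^2}$; the supremum of the empirical-CDF deviation is attained at the jump points, so it can be computed exactly from the sorted sample.

```python
import numpy as np

def sup_deviation(z):
    """sup over z in [0,1] of |empirical CDF - z|, computed exactly:
    the sup is attained at (or just before) the jumps of the empirical CDF."""
    z = np.sort(z)
    n = len(z)
    upper = np.arange(1, n + 1) / n - z   # deviation just after each jump
    lower = z - np.arange(0, n) / n       # deviation just before each jump
    return max(upper.max(), lower.max())

rng = np.random.default_rng(1)
k, eps, trials = 200, 0.1, 2000
hits = sum(sup_deviation(rng.random(k)) > eps for _ in range(trials))
print(hits / trials, "<=", 2 * np.exp(-2 * k * eps**2))
```

With these numbers the bound $2e^{-4} \approx 0.037$ is already close to the simulated tail, reflecting that Massart's constant 2 is sharp in the Kolmogorov–Smirnov limit.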
Refining estimates
This yields a refined Bernstein inequality for very quickly decaying
expectations.
Rumors!
Figure: Benjamin Doerr (Ecole Polytechnique, Paris)
Benjamin Doerr probably removed a $\sqrt{\log d}$ (?). Sadly, still not effective for small $N$ (?).
What partition gives the best jittered sampling?
You want to decompose $[0,1]^2$ into 4 sets such that the associated jittered sampling construction is as effective as possible. How?
[Figure: one candidate partition of $[0,1]^2$ into 4 parts, with one sample point in each.]
Is this good? Is this bad? Will it split into 4 parts of the same volume?
We don’t actually know.
Jittered sampling always improves: variance reduction
Decompose [0, 1]d into sets of equal measure
$$[0,1]^d = \bigcup_{i=1}^{N} \Omega_i \quad \text{such that} \quad |\Omega_i| = \frac{1}{N} \ \text{for all } 1 \le i \le N,$$
and measure using the $L^2$ discrepancy
$$L_2(A) := \left( \int_{[0,1]^d} \left| \frac{\#\,(A \cap [0,x])}{\#A} - |[0,x]| \right|^2 dx \right)^{1/2}.$$
Observation (Pausinger and S., 2015)
$$\mathbb{E}\, L_2(\text{Jittered Sampling}_\Omega)^2 \;\le\; \mathbb{E}\, L_2(\text{Purely random}_N)^2.$$
Main Idea: Variance Reduction
(What happens in $L^3$?)
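The observation can be checked numerically, since the squared $L^2$ star discrepancy of a finite point set has a closed form (Warnock's formula). The sketch below (my illustration with arbitrary parameters, not the talk's computation) averages it over purely random and jittered draws of $N = 16$ points in $[0,1]^2$, using the cubical partition.

```python
import numpy as np

def l2_star_disc_sq(pts):
    """Warnock's closed form for the squared L2 star discrepancy."""
    pts = np.asarray(pts, dtype=float)
    n, d = pts.shape
    term1 = 3.0 ** (-d)
    term2 = (2.0 / n) * np.prod((1 - pts**2) / 2, axis=1).sum()
    maxs = np.maximum(pts[:, None, :], pts[None, :, :])   # pairwise coordinate maxima
    term3 = np.prod(1 - maxs, axis=2).sum() / n**2
    return term1 - term2 + term3

rng = np.random.default_rng(2)
m, d, trials = 4, 2, 400                                  # N = 16 points in [0,1]^2
corners = np.stack(np.meshgrid(*[np.arange(m)] * d, indexing="ij"),
                   axis=-1).reshape(-1, d) / m
N = m**d
jittered = np.mean([l2_star_disc_sq(corners + rng.random((N, d)) / m)
                    for _ in range(trials)])
purely_random = np.mean([l2_star_disc_sq(rng.random((N, d)))
                         for _ in range(trials)])
print(jittered, "<", purely_random)
```

For purely random points the expectation is $(2^{-d} - 3^{-d})/N \approx 0.0087$ here; the jittered average comes out strictly smaller, as the observation predicts.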
How to select 2 points: expected squared $L^2$ discrepancy
[Figure: candidate two-set partitions of $[0,1]^2$ with expected squared $L^2$ discrepancies: MC 0.0694, then 0.0638, 0.0555, 0.05, 0.0471, 0.0470.]
Theorem (Florian Pausinger, Manas Rachh, S.)
Among all splittings of a domain given by a function $y = f(x)$ with symmetry around $x = y$, the following subdivision is optimal.
[Figure: the optimal subdivision; value 0.04617.]
The Most Nonlinear Integral Equation I’ve Ever Seen
Theorem (Florian Pausinger, Manas Rachh, S.)
Any optimal monotonically decreasing function $g(x)$ whose graph is symmetric about $y = x$ satisfies, for $0 \le x \le g^{-1}(0)$,
$$\begin{aligned}
&\big(1 - 2p - 4xg(x)\big)\big(1 - g(x)\big) + (4p-1)\,x\,\big(1 - g(x)^2\big) - 4\int_{g(x)}^{g^{-1}(0)} (1-y)\,g(y)\,dy \\
&\quad + g'(x)\Big[\big(1 - 2p - 4xg(x)\big)(1 - x) + (4p-1)\,g(x)\,\big(1 - x^2\big) - 4\int_{x}^{g^{-1}(0)} (1-y)\,g(y)\,dy\Big] = 0.
\end{aligned}$$
Question. How to do 3 points in $[0,1]^2$? Simple rules?
Many thanks!
VIT336 – Recommender System - Unit 3.pdfVIT336 – Recommender System - Unit 3.pdf
VIT336 – Recommender System - Unit 3.pdf
 
AI Uses and Misuses: Academic and Workplace Applications
AI Uses and Misuses: Academic and Workplace ApplicationsAI Uses and Misuses: Academic and Workplace Applications
AI Uses and Misuses: Academic and Workplace Applications
 

QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop, Jittered Sampling: Bounds & Problems - Stefan Steinberger, Dec 14, 2017

  • 10. Interlude: the small ball conjecture (continued). Theorem (Noah Kravitz, arXiv:1712.01206): For any choice of signs $\varepsilon_R$ and any integer $0 \le k \le n+1$,
$$\left|\left\{ x \in [0,1)^2 : \sum_{|R|=2^{-n}} \varepsilon_R h_R(x) = n+1-2k \right\}\right| = \frac{1}{2^{n+1}} \binom{n+1}{k}.$$
  • 11. Problem with the Standard Dogma. Star discrepancy: $D_N^*(X) = \sup_{R \subset [0,1]^d} \left| \frac{\#\{i : x_i \in R\}}{N} - |R| \right|$. The constructions achieving $D_N^* \lesssim (\log N)^{d-1}/N$ only start being effective around $N \sim d^d$ (actually a bit larger, even). More or less totally useless in high dimensions.
  • 12. Monte Carlo strikes back. Star discrepancy: $D_N^*(X) = \sup_{R \subset [0,1]^d} \left| \frac{\#\{i : x_i \in R\}}{N} - |R| \right|$. We want error bounds in $N$ and $d$! (Heinrich, Novak, Wasilkowski, Wozniakowski, 2002): there exist point sets with $D_N^*(X) \lesssim \sqrt{d}/\sqrt{N}$. This is still the best result (Aistleitner 2011: constant $c = 10$). How do you get these points? Monte Carlo.
  • 13. Jittered Sampling. If we already agree to distribute points randomly, we might just as well distribute them randomly in a clever way. [Figure: one random point in each cell of a 5 × 5 grid.]
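The construction just described is easy to sketch in code. A minimal version in plain Python (one uniform point per cell of an $m \times \cdots \times m$ grid, so $N = m^d$ points; function and variable names are mine, not from the talk):

```python
import random

def jittered_sample(m, d, rng=random):
    """One uniform random point in each of the m^d axis-parallel
    subcubes of side 1/m that partition [0,1]^d."""
    # enumerate the m^d cells by their integer corner (k_1, ..., k_d)
    cells = [[]]
    for _ in range(d):
        cells = [c + [k] for c in cells for k in range(m)]
    # jitter: place one uniform point inside each cell
    return [tuple((k + rng.random()) / m for k in corner) for corner in cells]

# N = m^d points, e.g. the 5 x 5 grid pictured on the slide:
pts = jittered_sample(5, 2)
assert len(pts) == 25
```

By construction every cell of the grid contains exactly one point, which is the whole point of the stratification.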
  • 15. Cook, Porter & Carpenter, 1984
  • 16. Cook, Porter & Carpenter, 1984
  • 17. A Recent Application in Compressed Sensing (Nov 2015)
  • 18. Theorem (Beck, 1987): $\mathbb{E}\, D_N^*(\text{jittered sampling}) \le C_d \frac{(\log N)^{1/2}}{N^{\frac12 + \frac{1}{2d}}}$. [Figure: one random point in each cell of a 5 × 5 grid.]
  • 19. Theorem (Beck, 1987): $\mathbb{E}\, D_N^*(\text{jittered sampling}) \le C_d \frac{(\log N)^{1/2}}{N^{\frac12 + \frac{1}{2d}}}$. This is a very general result for many different discrepancies; there are also $L^2$-based versions (Chen & Travaglini, 2009). Problem: the same old constant $C_d$ (might be huge; the way the proof proceeds, it will be MASSIVE).
  • 20. Theorem (Pausinger and S., 2015): For $N$ sufficiently large (depending on $d$),
$$\frac{1}{10} \frac{d}{N^{\frac12 + \frac{1}{2d}}} \;\le\; \mathbb{E}\, D_N^*(P) \;\le\; \sqrt{d}\, \frac{(\log N)^{1/2}}{N^{\frac12 + \frac{1}{2d}}}.$$
'Sufficiently large' is bad (talk about this later); the lower bound can probably be improved, the upper bound not by much.
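The gap between Monte Carlo ($\sim N^{-1/2}$) and jittered sampling ($\sim N^{-3/4}$ in $d=2$) is visible already at small $N$. In $d = 2$ the star discrepancy can be computed by checking only anchored boxes whose upper-right corners lie on the grid of point coordinates (comparing both closed and open counts against the box area, which is where the sup over boxes is attained). A small experiment, with all names my own:

```python
import random

def star_discrepancy_2d(pts):
    """D*_N in d = 2: check boxes with corners on the grid of point
    coordinates (plus 1), comparing closed and open counts to the area."""
    n = len(pts)
    xs = sorted({p[0] for p in pts} | {1.0})
    ys = sorted({p[1] for p in pts} | {1.0})
    best = 0.0
    for x in xs:
        for y in ys:
            closed = sum(1 for (a, b) in pts if a <= x and b <= y)
            opened = sum(1 for (a, b) in pts if a < x and b < y)
            area = x * y
            best = max(best, abs(closed / n - area), abs(opened / n - area))
    return best

def jittered(m, rng):
    return [((i + rng.random()) / m, (j + rng.random()) / m)
            for i in range(m) for j in range(m)]

rng = random.Random(1)
m, trials = 7, 20  # N = 49 points per trial
mc = sum(star_discrepancy_2d([(rng.random(), rng.random()) for _ in range(m * m)])
         for _ in range(trials)) / trials
jit = sum(star_discrepancy_2d(jittered(m, rng)) for _ in range(trials)) / trials
print(f"mean D*_49:  Monte Carlo {mc:.3f}  vs  jittered {jit:.3f}")
```

The jittered mean comes out well below the Monte Carlo mean, consistent with the theorem.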
  • 21. How the proof works. [Figure: jittered point set.]
  • 22. How the proof works. [Figure: jittered point set.]
  • 23. How the proof works. [Figure] Maximize the discrepancy over a $\sqrt{N}$-dimensional set in $[0, N^{-1/2}]$:
$$D_N \sim \frac{\sqrt{\sqrt{N}}}{\sqrt{N}} \cdot \frac{1}{\sqrt{N}} = \frac{1}{N^{3/4}}.$$
We lose a logarithm; a union bound takes care of the other cubes.
  • 25. In $d$ dimensions, we therefore expect the main contribution to the discrepancy to behave like
$$D_N \sim \frac{\sqrt{N^{\frac{d-1}{d}}}}{N^{\frac{d-1}{d}}} \cdot \frac{1}{N^{\frac1d}} = \frac{1}{N^{\frac{d-1}{2d}}} \cdot \frac{1}{N^{\frac1d}} = \frac{1}{N^{\frac{d+1}{2d}}}.$$
Of course, there is also a log. Adding up this quantity $d$ times (because there are $d$ fat slices of codimension 1) gives us an upper bound of $D_N \lesssim d \sqrt{\log N}\, N^{-\frac{d+1}{2d}}$. We want to improve this a bit: standard Bernstein inequalities aren't enough.
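The exponent bookkeeping above is easy to misread in slide form; a short check with exact rationals (my own sanity check, not from the talk) confirms that $\frac{d-1}{2d} + \frac{1}{d} = \frac{d+1}{2d} = \frac12 + \frac{1}{2d}$ for every $d$:

```python
from fractions import Fraction

for d in range(2, 12):
    # exponent of 1/N contributed by the two factors on the slide
    lhs = Fraction(d - 1, 2 * d) + Fraction(1, d)
    assert lhs == Fraction(d + 1, 2 * d) == Fraction(1, 2) + Fraction(1, 2 * d)
print("exponent identity verified for d = 2..11")
```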
  • 26. Sharp Dvoretzky–Kiefer–Wolfowitz inequality (Massart, 1990): If $z_1, z_2, \dots, z_k$ are independently and uniformly distributed random variables in $[0,1]$, then
$$\mathbb{P}\left( \sup_{0 \le z \le 1} \left| \frac{\#\{1 \le \ell \le k : 0 \le z_\ell \le z\}}{k} - z \right| > \varepsilon \right) \le 2 e^{-2k\varepsilon^2}.$$
In the limit: Brownian bridge → Kolmogorov–Smirnov distribution.
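The quantity inside the probability is the Kolmogorov–Smirnov statistic of the sample; for a sorted sample the sup is attained at the jumps of the empirical CDF. A quick simulation (helper names are mine) comparing the empirical tail against Massart's bound $2e^{-2k\varepsilon^2}$:

```python
import math
import random

def ks_statistic(zs):
    """sup_z |F_k(z) - z| for the empirical CDF F_k of a sample in [0,1];
    for sorted z_(1) <= ... <= z_(k) the sup occurs at the jump points."""
    zs = sorted(zs)
    k = len(zs)
    return max(max(i / k - z, z - (i - 1) / k) for i, z in enumerate(zs, 1))

rng = random.Random(0)
k, eps, trials = 100, 0.15, 2000
hits = sum(ks_statistic([rng.random() for _ in range(k)]) > eps
           for _ in range(trials))
print(f"empirical P(sup > {eps}) = {hits / trials:.4f}   "
      f"Massart bound = {2 * math.exp(-2 * k * eps ** 2):.4f}")
```

The bound is known to be sharp in the constant, so the two printed numbers should be of the same order.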
  • 27. Refining estimates. This yields a refined Bernstein inequality for very quickly decaying expectations.
  • 28. Rumors! [Figure: Benjamin Doerr (École Polytechnique, Paris).] Benjamin Doerr has probably removed a $\sqrt{\log d}$ (?). Sadly, still not effective for small $N$ (?).
  • 29. What partition gives the best jittered sampling? You want to decompose $[0,1]^2$ into 4 sets such that the associated jittered sampling construction is as effective as possible. How? [Figure: a candidate partition into 4 cells, one point per cell.] Is this good? Is this bad? Will it be into 4 parts of the same volume? We don't actually know.
  • 30. Jittered sampling always improves: variance reduction. Decompose $[0,1]^d$ into sets of equal measure, $[0,1]^d = \bigcup_{i=1}^N \Omega_i$ with $|\Omega_i| = 1/N$ for all $1 \le i \le N$, and measure quality using the $L^2$ discrepancy
$$L_2(A) := \left( \int_{[0,1]^d} \left| \frac{\#(A \cap [0,x])}{\#A} - |[0,x]| \right|^2 dx \right)^{1/2}.$$
Observation (Pausinger and S., 2015): $\mathbb{E}\, L_2(\text{Jittered Sampling}_\Omega)^2 \le \mathbb{E}\, L_2(\text{Purely random}_N)^2$.
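The squared $L^2$ discrepancy admits a closed form (Warnock's formula), which makes the variance-reduction effect easy to check numerically. A sketch in which the standard grid partition stands in for a general $\Omega$; names are mine:

```python
import random

def l2_discrepancy_sq(pts):
    """Warnock's closed form for the squared L2 star discrepancy
    of a point set in [0,1]^d."""
    n, d = len(pts), len(pts[0])
    term1 = 0.0
    for p in pts:
        for q in pts:
            prod = 1.0
            for a, b in zip(p, q):
                prod *= 1.0 - max(a, b)
            term1 += prod
    term2 = 0.0
    for p in pts:
        prod = 1.0
        for a in p:
            prod *= 1.0 - a * a
        term2 += prod
    return term1 / n ** 2 - term2 * 2.0 ** (1 - d) / n + 3.0 ** (-d)

rng = random.Random(0)
m, trials = 4, 300  # N = 16 points in [0,1]^2
N = m * m
rand_mean = sum(l2_discrepancy_sq([(rng.random(), rng.random()) for _ in range(N)])
                for _ in range(trials)) / trials
jit_mean = sum(l2_discrepancy_sq([((i + rng.random()) / m, (j + rng.random()) / m)
                                  for i in range(m) for j in range(m)])
               for _ in range(trials)) / trials
print(f"E L2^2: purely random ~ {rand_mean:.5f},  jittered ~ {jit_mean:.5f}")
```

The jittered mean lands below the purely random one, matching the observation on the slide (for purely random points, $\mathbb{E}\, L_2^2 = (2^{-d} - 3^{-d})/N$, about 0.0087 here).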
  • 31. Main Idea: Variance Reduction. (What happens in $L^3$?)
  • 32. How to select 2 points: expected squared $L^2$ discrepancy. [Figure: candidate partitions of $[0,1]^2$ with values MC 0.0694; 0.0638; 0.0555; 0.05; 0.0470; 0.0471.]
  • 33. Theorem (Florian Pausinger, Manas Rachh, S.): Among all splittings of a domain given by a function $y = f(x)$ with symmetry around $x = y$, the following subdivision is optimal. [Figure: the optimal subdivision, value 0.04617.]
  • 34. The Most Nonlinear Integral Equation I've Ever Seen. Theorem (Florian Pausinger, Manas Rachh, S.): Any optimal monotonically decreasing function $g(x)$ whose graph is symmetric about $y = x$ satisfies, for $0 \le x \le g^{-1}(0)$,
$$(1 - 2p - 4xg(x))(1 - g(x)) + (4p - 1)x(1 - g(x)^2) - 4 \int_{g(x)}^{g^{-1}(0)} (1 - y)\, g'(y)\, dy$$
$$+\; g'(x) \left[ (1 - 2p - 4xg(x))(1 - x) + (4p - 1)g(x)(1 - x^2) - 4 \int_{x}^{g^{-1}(0)} (1 - y)\, g(y)\, dy \right] = 0.$$
Question: How to do 3 points in $[0,1]^2$? Simple rules?