Integration with Kernel Methods, Transported Meshfree Methods

P.G. LeFloch (1), J.M. Mercier (2)
(1) CNRS, (2) MPG-Partners
19 October 2019
Local integration with kernel methods

Monte Carlo estimation - consider the following error estimate ($\mu$ a probability measure, $Y = (y_1, \ldots, y_N) \in \mathbb{R}^{N \times D}$):

$$\Big| \int_{\mathbb{R}^D} \varphi(x)\, d\mu - \frac{1}{N} \sum_{n=1}^{N} \varphi(y_n) \Big| \le E\big(Y, \mathcal{H}_{K_\mu}\big)\, \|\varphi\|_{\mathcal{H}_{K_\mu}}$$

1. Example 1: $Y$ i.i.d. $\to$ $E(Y, \mathcal{H}_{K_\mu}) \sim \frac{1}{\sqrt{N}}$ and $\mathcal{H}_{K_\mu} \sim L^2(\mathbb{R}^D, |x|^2 d\mu)$ (statistical setting).
2. Example 2: $Y$ a SOBOL sequence, $\mu = dx_\Omega$, $\Omega = [0,1]^D$. Then $\mathcal{H}_K = BV(\Omega)$ (bounded variation) and $E(Y, \mathcal{H}_K) \ge \frac{\ln(N)^{D-1}}{N}$ (Koksma-Hlawka sharp estimate conjecture).
3. Other examples: quantizers, wavelets, deep feed-forward neural networks, ...
4. A general method. Let $K(x, y)$ be an admissible kernel; then
$$E^2(Y, \mathcal{H}_{K_\mu}) = \int_{\mathbb{R}^{2D}} K(x, y)\, d\mu_x\, d\mu_y + \frac{1}{N^2} \sum_{n,m=1}^{N} K(y_n, y_m) - \frac{2}{N} \sum_{n=1}^{N} \int_{\mathbb{R}^D} K(x, y_n)\, d\mu_x$$
5. We can compute sharp discrepancy sequences and the optimal discrepancy error as (see the sketch after this list)
$$\overline{Y} = \arg\inf_{Y \in \mathbb{R}^{D \times N}} E(Y, \mathcal{H}_{K_\mu}), \qquad E_{\mathcal{H}_{K_\mu}}(N, D) = E(\overline{Y}, \mathcal{H}_{K_\mu})$$
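A minimal numerical sketch of items 4 and 5, not the authors' implementation: it approximates the two $\mu$-integrals of the discrepancy formula by a large i.i.d. sample from $\mu$ (here $\mu$ uniform on $[0,1]^D$) and minimizes $E(Y, \mathcal{H}_{K})$ over $Y$ with a generic quasi-Newton optimizer. The Gaussian kernel, bandwidth, and sample sizes are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
D, N, M = 2, 64, 1024          # dimension, quadrature points, MC sample for the mu-integrals
h = 0.25                        # kernel bandwidth (illustrative choice)

def gauss_kernel(A, B):
    """Gaussian kernel matrix K(a_i, b_j) = exp(-|a_i - b_j|^2 / (2 h^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * h * h))

X = rng.random((M, D))          # i.i.d. sample from mu = uniform([0,1]^D)
c0 = gauss_kernel(X, X).mean()  # MC estimate of the double integral of K d(mu x mu)

def discrepancy2(Y_flat):
    """Squared kernel discrepancy E^2(Y, H_K), following the general formula on this slide."""
    Y = Y_flat.reshape(N, D)
    term_yy = gauss_kernel(Y, Y).mean()   # (1/N^2) sum_{n,m} K(y_n, y_m)
    term_xy = gauss_kernel(X, Y).mean()   # MC estimate of (1/N) sum_n int K(x, y_n) dmu
    return c0 + term_yy - 2.0 * term_xy

Y0 = rng.random((N, D))                   # random start; Y is left unconstrained here
print("E(random Y)    =", np.sqrt(max(discrepancy2(Y0.ravel()), 0.0)))
res = minimize(discrepancy2, Y0.ravel(), method="L-BFGS-B")
print("E(optimized Y) =", np.sqrt(max(res.fun, 0.0)))  # approximates E_{H_K}(N, D)
```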
Our local kernels: lattice-based and transported kernels

Without loss of generality, consider $\mu = dx_{[0,1]^D}$. We use two kinds of kernels:

1. Lattice-based kernel: let $L$ be a lattice and $L^\ast$ its dual lattice. Consider any discrete function satisfying $\phi(\alpha^\ast) \in \ell^1(L^\ast)$, $\phi(\alpha^\ast) \ge 0$, $\phi(0) = 1$, and set
$$K_{per}(x, y) = \frac{1}{|L|} \sum_{\alpha^\ast \in L^\ast} \phi(\alpha^\ast)\, \exp\big( 2i\pi \langle x - y, \alpha^\ast \rangle \big)$$
[Figure: periodic kernel profiles - Matern, Multiquadric, Gaussian, Truncated]
2. Transported kernel: let $S : \Omega \to \mathbb{R}^D$ be a transport map and set $K_{tra}(x, y) = K(S(x), S(y))$. A sketch of both constructions follows this list.
[Figure: transported kernel profiles - Matern, Gaussian, Multiquadric, Truncated]
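A hedged sketch of both constructions, with illustrative choices not taken from the talk: $L = \mathbb{Z}^D$ (so $L^\ast = \mathbb{Z}^D$ and $|L| = 1$), Matern-type Fourier coefficients truncated at $|\alpha_d| \le 20$ and normalized so that $\phi(0) = 1$, and, for the transported kernel, the map $S$ given coordinate-wise by the inverse Gaussian CDF.

```python
import numpy as np
from scipy.stats import norm

tau, M = 5.0, 20                       # Matern-type scale and Fourier truncation (illustrative)

def phi1(alpha):
    """1-D Fourier coefficient with Matern-type decay: 2 / (1 + 4 pi^2 alpha^2 / tau^2)."""
    return 2.0 / (1.0 + 4.0 * np.pi ** 2 * alpha ** 2 / tau ** 2)

alphas = np.arange(-M, M + 1)
w = phi1(alphas) / phi1(0)             # normalise so that phi(0) = 1, as required on this slide

def K_per(x, y):
    """Lattice-based kernel on [0,1]^D for L = Z^D: product over dimensions of a
    truncated 1-D Fourier series sum_alpha phi(alpha) cos(2 pi alpha (x_d - y_d))."""
    t = np.asarray(x) - np.asarray(y)                  # shape (D,)
    per_dim = (w * np.cos(2.0 * np.pi * alphas[None, :] * t[:, None])).sum(axis=1)
    return per_dim.prod()

def K_gauss(u, v):
    return np.exp(-np.sum((np.asarray(u) - np.asarray(v)) ** 2) / 2.0)

def K_tra(x, y, base_kernel=K_gauss):
    """Transported kernel K_tra(x, y) = K(S(x), S(y)), with S the Gaussian quantile map."""
    return base_kernel(norm.ppf(x), norm.ppf(y))

x, y = np.array([0.2, 0.7]), np.array([0.4, 0.9])
print("K_per(x, y) =", K_per(x, y))
print("K_tra(x, y) =", K_tra(x, y))
```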
An example: Monte-Carlo integration with the Matern kernel

1. Kernel, random, and computed sequences $Y$ (N = 256, D = 2).
[Figure: Matern kernel surface; random points in $[0,1]^2$; computed points for the lattice Matern kernel]
2. Optimal discrepancy error → Koksma-Hlawka type estimate:
$$E_{\mathcal{H}_K}(N, D) \sim \frac{\sum_{n > N} \phi(\alpha^{\ast,n})}{N} \sim \frac{\ln(N)^{D-1}}{N}, \qquad \phi(\alpha) = \prod_{d=1}^{D} \frac{2}{1 + 4\pi^2 \alpha_d^2 / \tau_D^2}$$
3. $E(Y, \mathcal{H}_K)$ for random $Y$ vs. computed $Y$ vs. the theoretical $E_{\mathcal{H}_K}(N, D)$ (see the sketch after these tables):

   Random Y:       D=1     D=16    D=128
   N=16            0.228   0.304   0.319
   N=128           0.117   0.111   0.115
   N=512           0.035   0.054   0.059

   Computed Y:     D=1     D=16    D=128
   N=16            0.062   0.211   0.223
   N=128           0.008   0.069   0.077
   N=512           0.002   0.034   0.049

   Theoretical:    D=1     D=16    D=128
   N=16            0.062   0.288   0.323
   N=128           0.008   0.077   0.105
   N=512           0.002   0.034   0.043
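To echo the random-vs-computed comparison above, here is a hedged sketch using the fact that, for a lattice-based kernel with $\phi(0) = 1$ under $\mu = dx_{[0,1]^D}$, both $\mu$-integrals in the general discrepancy formula equal 1, so $E^2(Y, \mathcal{H}_K) = \frac{1}{N^2}\sum_{n,m} K_{per}(y_n, y_m) - 1$. Instead of the optimized points of the talk, it uses a classical 2-D Fibonacci lattice as a cheap stand-in for a low-discrepancy sequence; the truncation and the Matern-type coefficients are illustrative, as in the previous sketch.

```python
import numpy as np

tau, M, D = 5.0, 20, 2                      # illustrative truncation and dimension
alphas = np.arange(-M, M + 1)
w = 2.0 / (1.0 + 4.0 * np.pi ** 2 * alphas ** 2 / tau ** 2)
w = w / w[M]                                # normalise so phi(0) = 1  (index M is alpha = 0)

def gram_per(Y):
    """Gram matrix of the truncated lattice kernel, built dimension by dimension."""
    G = np.ones((len(Y), len(Y)))
    for d in range(D):
        t = Y[:, d][:, None] - Y[:, d][None, :]               # pairwise differences
        G *= (w * np.cos(2.0 * np.pi * alphas * t[..., None])).sum(-1)
    return G

def discrepancy(Y):
    """E(Y, H_K): under mu = dx_[0,1]^D and phi(0) = 1, E^2 = mean(K(y_n, y_m)) - 1."""
    return np.sqrt(max(gram_per(Y).mean() - 1.0, 0.0))

rng = np.random.default_rng(0)
N = 233                                                        # a Fibonacci number
Y_rand = rng.random((N, D))
i = np.arange(N)
Y_fib = np.stack([i / N, (i * 144 % N) / N], axis=1)           # Fibonacci lattice, z = (1, 144)
print("E(random Y)    =", discrepancy(Y_rand))
print("E(Fibonacci Y) =", discrepancy(Y_fib))
```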
Application: Transported Meshfree Methods (TMM)

If $\mu(t, x)$ is a solution to the non-linear hyperbolic-parabolic Fokker-Planck equation
$$\partial_t \mu - L\mu = 0, \qquad L\mu = \nabla \cdot (b\mu) + \nabla^2 \cdot (A\mu), \qquad A := \frac{1}{2} \sigma \sigma^T$$

1. FORWARD: compute $\mu(t) \sim \frac{1}{N} \big( \delta_{y_1(t)} + \ldots + \delta_{y_N(t)} \big)$ as best discrepancy sequences (see the sketch after this list):
$$\Big| \int_{\mathbb{R}^D} \varphi(x)\, d\mu(t, x) - \frac{1}{N} \sum_{n=1}^{N} \varphi(y_n(t)) \Big| \le E\big(Y(t), \mathcal{H}_{K_{\mu(t,\cdot)}}\big)\, \|\varphi\|_{\mathcal{H}_{K_{\mu(t,\cdot)}}}$$
2. CHECK the optimal rate: $E\big(Y(t), \mathcal{H}_{K_{\mu(t,\cdot)}}\big) \sim E_{\mathcal{H}_{K_{\mu(t)}}}(N, D)$.
3. BACKWARD: interpret $t \mapsto y_n(t)$, $n = 1, \ldots, N$, as a moving, transported PDE grid, and solve the Kolmogorov equation on it. ERROR ESTIMATION:
$$\Big| \int_{\mathbb{R}^D} P(t, \cdot)\, d\mu(t, \cdot) - \frac{1}{N} \sum_{n=1}^{N} P(t, y_n(t)) \Big| \le E_{\mathcal{H}_{K_{\mu(t)}}}(N, D)\, \|P(t, \cdot)\|_{\mathcal{H}_{K_{\mu(t,\cdot)}}}$$
4. HYPERBOLIC CASE ($\sigma \equiv 0$): LAGRANGIAN MESHFREE METHODS.
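A minimal sketch of the FORWARD step and of the quadrature it feeds, under assumptions that differ from the talk: the moving grid $y_n(t)$ is generated here by a plain Euler-Maruyama scheme on the underlying SDE $dY = b(Y)\,dt + \sigma(Y)\,dW$ (i.e. i.i.d. particles), whereas the method computes optimized best-discrepancy sequences. The drift, volatility, and test function below are placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)
D, N, n_steps, T = 2, 512, 100, 1.0
dt = T / n_steps

def b(y):                      # placeholder drift (Ornstein-Uhlenbeck pull toward 0)
    return -y

def sigma(y):                  # placeholder volatility matrix, here the constant 0.3 * I
    return 0.3 * np.eye(D)

# FORWARD: transport the grid Y(t) = (y_1(t), ..., y_N(t)) with Euler-Maruyama, so that
# mu(t) is approximated by the empirical measure (1/N) sum_n delta_{y_n(t)}.
Y = rng.normal(size=(N, D))    # sample of the initial condition mu(0)
for _ in range(n_steps):
    dW = rng.normal(scale=np.sqrt(dt), size=(N, D))
    Y = Y + b(Y) * dt + dW @ sigma(Y).T

# Quadrature on the transported grid: int P(T, .) dmu(T, .) ~ (1/N) sum_n P(T, y_n(T)).
def P(y):                      # placeholder test function
    return np.maximum(y[:, 0], 0.0)

print("meshfree estimate of the integral:", P(Y).mean())
```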
Illustration: the 2D SABR process, widely used in finance

SABR process:
$$d \begin{pmatrix} F_t \\ \alpha_t \end{pmatrix} = \rho \begin{pmatrix} \alpha_t F_t^\beta & 0 \\ 0 & \nu \alpha_t \end{pmatrix} \begin{pmatrix} dW_t^1 \\ dW_t^2 \end{pmatrix}, \qquad 0 \le \beta \le 1, \quad \nu \ge 0, \quad \rho \in \mathbb{R}^{2 \times 2}.$$
The Fokker-Planck equation associated to SABR is
$$\partial_t \mu + L^\ast \mu = 0, \qquad L^\ast \mu = \rho \begin{pmatrix} \frac{x_2^2}{2} x_1^{2\beta} & 0 \\ 0 & \frac{\nu^2}{2} x_2^2 \end{pmatrix} \rho^T \cdot \nabla^2 \mu.$$
[Embedded animation: SABR200]
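A hedged sketch of the SABR dynamics written above, simulated with Euler-Maruyama. The parameter values, the choice of $\rho$ as a Cholesky-type correlation factor, and the crude treatment of the boundary at $F = 0$ and $\alpha = 0$ are illustrative assumptions, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(2)
beta, nu, corr = 0.7, 0.4, -0.3            # illustrative SABR parameters
rho = np.array([[1.0, 0.0],                # an illustrative 2x2 correlation-type factor
                [corr, np.sqrt(1.0 - corr ** 2)]])

N, n_steps, T = 10_000, 200, 1.0
dt = T / n_steps
F = np.full(N, 1.0)                        # initial forward F_0
a = np.full(N, 0.2)                        # initial volatility alpha_0

for _ in range(n_steps):
    dW = rng.normal(scale=np.sqrt(dt), size=(N, 2))
    # diffusion of the slide: rho @ diag(alpha * F^beta, nu * alpha) @ dW, applied row-wise
    vols = np.stack([a * np.maximum(F, 0.0) ** beta, nu * a], axis=1)
    dX = (vols * dW) @ rho.T
    F = np.maximum(F + dX[:, 0], 0.0)      # crude absorption of F at 0
    a = np.maximum(a + dX[:, 1], 1e-8)     # keep alpha positive

print("E[F_T] ~", F.mean(), "  E[alpha_T] ~", a.mean())
```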
Academic tests, business cases, curse of dimensionality

1. Academic tests: finance, non-linear hyperbolic systems
   1. Revisiting the method of characteristics via a convex hull algorithm
   2. Numerical results using CoDeFi
   3. A new method for solving Kolmogorov equations in mathematical finance
2. Business cases: finance
   1. Hedging Strategies for Net Interest Income and Economic Values of Equity (http://dx.doi.org/10.2139/ssrn.3454813, with S. Miryusupov).
   2. Computing metrics for a big portfolio of Autocalls depending on several underlyings (unpublished).
3. CURSE of dimensionality in finance: price and manage a complex option written on several underlyings. Result? We can solve at any order of accuracy:
$$\Big| \int_{\mathbb{R}^D} P(t, \cdot)\, d\mu(t, \cdot) - \frac{1}{N} \sum_{n=1}^{N} P(t, y_n(t)) \Big| \le \frac{\|P(t, \cdot)\|_{\mathcal{H}_{K_{\mu(t,\cdot)}}}}{N^\alpha}$$
...where $\alpha \ge 1/2$ is ANY number! Choose it according to your desired electricity bill! BUT beware of low-regularity problems if the dimension D is too big (e.g. American-type options or Autocalls). A back-of-the-envelope illustration of the rate follows.
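A back-of-the-envelope reading of the bound above (the cost model is a naive assumption, not from the talk): to reach a target error $\varepsilon$ one needs roughly $N \approx \varepsilon^{-1/\alpha}$ points, so raising $\alpha$ is what keeps the "electricity bill" finite.

```python
# Number of points N needed so that ||P|| / N^alpha <= eps (taking ||P|| = 1 for illustration).
eps = 1e-3
for alpha in (0.5, 1.0, 2.0):
    N = eps ** (-1.0 / alpha)
    print(f"alpha = {alpha:>3}:  N ~ {N:,.0f}")
# alpha = 0.5 -> N ~ 1,000,000 ; alpha = 1.0 -> N ~ 1,000 ; alpha = 2.0 -> N ~ 32
```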
Summary and Conclusions

We presented in this talk:

1. New, sharp estimates for Monte-Carlo methods...
2. ...that can be used in a wide variety of contexts to perform a sharp error analysis.
3. A new method for the numerical simulation of PDEs: Transported Meshfree Methods...
4. ...that can be used in a wide variety of applications (hyperbolic / parabolic equations, artificial intelligence, etc.)...
5. ...for which the error analysis applies: we can guarantee a worst-case error estimate, and we can check that this error matches an optimal convergence rate.