Sampling Methods for Statistical Inference
2020. 10. 19
Jinhwan Suk
Department of Mathematical Sciences, KAIST
GROOT SEMINAR
GROOT AI
Obstacles in latent modeling
• Latent variable model
• We need the posterior
Latent Variable Model

[Diagram: a latent variable z, drawn from a prior, generates the data 𝒟 through a likelihood]
e.g. RBM, VAE, GAN, HMM, Particle Filter, …
Examples pictured: Hidden Markov model, VAE
Latent Variable Model

Given: prior $p_\theta(z)$, likelihood $p_\theta(\mathcal{D} \mid z)$

Inference: $\hat{\theta} = \operatorname{argmax}_\theta\, p_\theta(\mathcal{D})$

Compute: $p_\theta(\mathcal{D}) = \int p_\theta(z)\, p_\theta(\mathcal{D} \mid z)\, dz$ (intractable!)

EM Algorithm

$$
\begin{aligned}
\log p(\mathcal{D}) &= \int \log p(\mathcal{D})\; p(z \mid \mathcal{D})\, dz \\
&= \int \log \frac{p(\mathcal{D}, z)}{p(z \mid \mathcal{D})}\; p(z \mid \mathcal{D})\, dz \\
&= \int \log p(\mathcal{D}, z)\; p(z \mid \mathcal{D})\, dz - \int \log p(z \mid \mathcal{D})\; p(z \mid \mathcal{D})\, dz \\
&= \mathbb{E}_{p(z \mid \mathcal{D})}[\log p(\mathcal{D}, z)] + H \;\ge\; \mathbb{E}_{p(z \mid \mathcal{D})}[\log p(\mathcal{D}, z)],
\end{aligned}
$$

where $H$ is the entropy of the posterior $p(z \mid \mathcal{D})$.

E-Step: $p(z \mid \mathcal{D}) = \dfrac{p(\mathcal{D}, z)}{p(\mathcal{D})}$ (intractable!)
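To make the E-step and M-step concrete, here is a minimal NumPy sketch of EM for a two-component 1-D Gaussian mixture, one of the few models where the posterior $p(z \mid \mathcal{D})$ is available in closed form. The mixture, the synthetic data, and names like `resp` are illustrative assumptions of this sketch, not material from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1-D data from a two-component mixture (illustrative).
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1, 700)])

# Initial parameters theta = (mixing weights, means, std devs).
pi_ = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
sigma = np.array([1.0, 1.0])

for _ in range(50):
    # E-step: responsibilities p(z = k | x_i) under the current theta.
    dens = pi_ * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) \
           / (sigma * np.sqrt(2 * np.pi))
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: maximize E_{p(z|D)}[log p(D, z)] over theta (closed form here).
    nk = resp.sum(axis=0)
    pi_ = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print(pi_, mu, sigma)  # should move toward (0.3, 0.7), (-2, 3), (1, 1)
```

When the posterior is intractable, as on this slide, the E-step itself must be approximated, which is exactly where the sampling methods below come in.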
Latent Variable Model

Posterior inference: $\mathbb{E}[z \mid \mathcal{D}]$, $\mathbb{E}[f(z) \mid \mathcal{D}]$

"Bayesian inference is all about posterior inference."

Direct computation is impossible. Approximation! But… how?

Let the target distribution be denoted by $p(x)$.
Two ways of approximation

1. Reduce the distance between the two distributions → Variational Inference (an optimization problem)
2. Draw samples from the target → Monte Carlo Method (a sampling method)
Monte Carlo Method

The Law of Large Numbers:
$$\frac{X_1 + X_2 + \cdots + X_n}{n} \to \mathbb{E}[X_i], \qquad X_1, X_2, \ldots, X_n \ \text{i.i.d.}$$

But… yes, how do we sample from $p(x)$?

Rejection Sampling
$q(x)$: proposal distribution (easy to compute, easy to sample), with a constant $M$ such that $M q(x) \ge \tilde{p}(x)$ for all $x$.
Draw $x \sim q(x)$ and $u \sim \mathrm{Uniform}(0, 1)$. If $u > \dfrac{\tilde{p}(x)}{M q(x)}$, we reject the sample; otherwise we accept it.
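A minimal sketch of this accept/reject loop, assuming an illustrative unnormalized target $\tilde{p}$, a Gaussian proposal $q$, and a bound $M$ checked numerically for this particular pair (none of these choices come from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

def p_tilde(x):
    # Unnormalized two-mode target (illustrative).
    return np.exp(-0.5 * (x - 2) ** 2) + 0.5 * np.exp(-0.5 * (x + 2) ** 2)

def q_pdf(x):
    # Proposal N(0, 3^2): easy to compute, easy to sample.
    return np.exp(-0.5 * (x / 3) ** 2) / (3 * np.sqrt(2 * np.pi))

M = 12.0  # checked numerically: M * q(x) >= p_tilde(x) for this pair

def rejection_sample(n):
    samples = []
    while len(samples) < n:
        x = rng.normal(0, 3)            # x ~ q
        u = rng.uniform()               # u ~ Uniform(0, 1)
        if u <= p_tilde(x) / (M * q_pdf(x)):
            samples.append(x)           # accept; otherwise reject and retry
    return np.array(samples)

xs = rejection_sample(10_000)
print(xs.mean())  # Monte Carlo estimate of E_p[x] via the LLN
```

On average a fraction $Z/M$ of proposals is accepted, so a loose bound $M$ wastes most draws, which is one reason the method degrades in high dimensions.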
Monte Carlo Method

Importance Sampling
$$\mu = \mathbb{E}_{p(x)}[f(x)] = \int f(x)\, \frac{\tilde{p}(x)}{Z}\, dx = \frac{1}{Z} \int f(x)\, \frac{\tilde{p}(x)}{q(x)}\, q(x)\, dx$$

$$\mathbb{E}_{p(x)}[f(x)] \approx \frac{1}{nZ} \sum_i f(x_i)\, \frac{\tilde{p}(x_i)}{q(x_i)} = \frac{1}{nZ} \sum_i w(x_i)\, f(x_i), \qquad x_i \sim q(x)$$

$$Z = \int \tilde{p}(x)\, dx = \int \frac{\tilde{p}(x)}{q(x)}\, q(x)\, dx \approx \frac{1}{n} \sum_i w(x_i)$$

$$\mathbb{E}_{p(x)}[f(x)] \approx \sum_i \tilde{w}(x_i)\, f(x_i), \qquad \tilde{w}(x_i) = \frac{w(x_i)}{\sum_j w(x_j)}$$

i.i.d. sampling is very vulnerable in high-dimensional spaces.
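A minimal self-normalized importance-sampling sketch of the estimator above; the target, proposal, and test function $f$ are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def p_tilde(x):
    # Unnormalized target (illustrative): N(2, 1) up to a constant.
    return np.exp(-0.5 * (x - 2) ** 2)

n = 100_000
x = rng.normal(0, 3, n)                                   # x_i ~ q = N(0, 3^2)
q = np.exp(-0.5 * (x / 3) ** 2) / (3 * np.sqrt(2 * np.pi))
w = p_tilde(x) / q                                        # weights w(x_i)
w_tilde = w / w.sum()                                     # normalized weights

f = lambda t: t                                           # estimate E_p[x]
print(np.sum(w_tilde * f(x)))                             # should be close to 2
```

In high dimensions the normalized weights tend to concentrate on a handful of samples (weight degeneracy), which is the vulnerability the slide refers to and the motivation for MCMC.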
Markov Chain Monte Carlo
• Gibbs Sampling
• Metropolis-Hastings Algorithm
Basic idea of MCMC

Markov chain: $x^{(1)}, x^{(2)}, \ldots$ is a sequence of random variables. It forms a Markov chain if
$$p(x^{(t+1)} \mid x^{(1)}, x^{(2)}, \ldots, x^{(t)}) = p(x^{(t+1)} \mid x^{(t)}).$$

A Markov chain can be specified by
1. an initial distribution $p_1(x) = p(x^{(1)})$
2. a transition probability $T(x', x) = p(x^{(t+1)} = x' \mid x^{(t)} = x)$

Ergodicity: $\lim_{n \to \infty} T^n p_1 = \pi$, regardless of the initial distribution $p_1$.

Recipe:
1. Build a Markov chain having $p(x)$ as an invariant distribution.
2. Sample $(x^{(t)})_{t \ge 1}$ from the chain.
3. Compute
$$\mathbb{E}_{p(x)}[f(x)] \approx \mathbb{E}_{T^n p_1(x)}[f(x)] \approx \frac{1}{n} \sum_t f(x^{(t)}).$$
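The Metropolis-Hastings algorithm on the next slide is the classic way to build such a chain. Below is a minimal random-walk MH sketch; the log target and step size are illustrative assumptions. With a symmetric proposal the Hastings correction cancels, so the acceptance ratio reduces to $\tilde{p}(x') / \tilde{p}(x)$.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p_tilde(x):
    # Log unnormalized target (illustrative): N(2, 1) up to a constant.
    return -0.5 * (x - 2) ** 2

def metropolis_hastings(n_steps, step=1.0, x0=0.0):
    x, chain = x0, []
    for _ in range(n_steps):
        x_prop = x + step * rng.normal()        # symmetric random-walk proposal
        log_alpha = log_p_tilde(x_prop) - log_p_tilde(x)
        if np.log(rng.uniform()) < log_alpha:   # accept with prob min(1, ratio)
            x = x_prop
        chain.append(x)                         # on reject, keep the current x
    return np.array(chain)

chain = metropolis_hastings(50_000)
print(chain[1000:].mean())  # (1/n) sum f(x^(t)) with f(x) = x, after burn-in
```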
MCMC Algorithms

Gibbs Sampling: $x_i^{(s+1)} \sim p(x_i \mid \boldsymbol{x}_{-i})$
Metropolis-Hastings algorithm, Hamiltonian MCMC,
Sequential MCMC, Stochastic gradient MCMC, Particle MCMC
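A minimal Gibbs-sampling sketch for a bivariate Gaussian target, one of the textbook cases where each full conditional $p(x_i \mid \boldsymbol{x}_{-i})$ is available in closed form (the zero-mean, unit-variance target and its correlation $\rho = 0.8$ are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8  # correlation of the illustrative target N(0, [[1, rho], [rho, 1]])

def gibbs(n_steps):
    x1, x2, chain = 0.0, 0.0, []
    for _ in range(n_steps):
        # Full conditionals of the bivariate Gaussian:
        x1 = rng.normal(rho * x2, np.sqrt(1 - rho ** 2))  # x1 | x2
        x2 = rng.normal(rho * x1, np.sqrt(1 - rho ** 2))  # x2 | x1
        chain.append((x1, x2))
    return np.array(chain)

chain = gibbs(20_000)
print(np.corrcoef(chain[1000:].T)[0, 1])  # should be close to rho
```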
Thank you

Sampling Methods for Statistical Inference
2020. 10. 19
Jinhwan Suk
Department of Mathematical Sciences, KAIST
GROOT SEMINAR