Sampling Methods for Statistical Inference
2020. 10. 19
Jinhwan Suk
Department of Mathematical Science, KAIST
GROOT
SEMINAR
Obstacles in latent modeling
• Latent variable model
• We need the posterior
Latent Variable Model

z ⟶ 𝒟
prior p(z), likelihood p(𝒟 | z), data 𝒟
e.g. RBM, VAE, GAN, HMM, particle filter, …
Examples: Hidden Markov model, VAE
Latent Variable Model

Given: prior p_θ(z), likelihood p_θ(𝒟 | z)
Inference: θ̂ = argmax_θ p_θ(𝒟)
Compute: p_θ(𝒟) = ∫ p_θ(z) p_θ(𝒟 | z) dz   (Intractable!)

EM Algorithm
log p(𝒟) = ∫ log p(𝒟) p(z | 𝒟) dz
         = ∫ log [ p(𝒟, z) / p(z | 𝒟) ] p(z | 𝒟) dz
         = ∫ log p(𝒟, z) p(z | 𝒟) dz − ∫ log p(z | 𝒟) p(z | 𝒟) dz
         = 𝔼_{p(z|𝒟)}[log p(𝒟, z)] + H ≥ 𝔼_{p(z|𝒟)}[log p(𝒟, z)]
E-Step: p(z | 𝒟) = p(𝒟, z) / p(𝒟)   (Intractable!)
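When the posterior p(z | 𝒟) happens to be tractable, the E-step expectation 𝔼_{p(z|𝒟)}[log p(𝒟, z)] can be computed in closed form and EM runs directly. A minimal sketch for a two-component 1-D Gaussian mixture (the synthetic data and the fixed unit variances are assumptions made for brevity, not part of the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data from a two-component mixture (unit variances assumed known).
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1, 700)])

pi, mu = 0.5, np.array([-1.0, 1.0])   # initial mixing weight and means
for _ in range(100):
    # E-step: responsibilities r_k(x) = p(z = k | x) via Bayes' rule.
    w = np.stack([pi * np.exp(-0.5 * (x - mu[0])**2),
                  (1 - pi) * np.exp(-0.5 * (x - mu[1])**2)])
    r = w / w.sum(axis=0)
    # M-step: maximize E_{p(z|x)}[log p(x, z)] over the parameters.
    pi = r[0].mean()
    mu = (r * x).sum(axis=1) / r.sum(axis=1)

print(pi, mu)   # mixing weight near 0.3, means near (-2, 3)
```

In the general latent variable models above this posterior has no closed form, which is exactly why sampling methods are needed.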
Latent Variable Model

Posterior inference: 𝔼[z | 𝒟], 𝔼[f(z) | 𝒟]
"Bayesian inference is all about posterior inference."
Direct computation is impossible, so we must approximate. But how?
Let the target distribution be denoted by p(x).
Two ways of approximation

1. Reduce the distance between the two distributions → Variational Inference (an optimization problem)
2. Draw samples from the target → Monte Carlo Method (a sampling method)
Monte Carlo Method

The Law of Large Numbers: for X₁, X₂, …, Xₙ i.i.d.,
(X₁ + X₂ + ⋯ + Xₙ) / n → 𝔼[Xᵢ]
But… yes, how do we sample from p(x)?

Rejection Sampling
q(x): proposal distribution, easy to compute and easy to sample.
Choose M so that M q(x) ≥ p̃(x).
Draw x ~ q(x) and u ~ Uniform(0, 1). If u > p̃(x) / (M q(x)), we reject the sample; otherwise we accept it.
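The accept/reject rule above fits in a few lines. As an illustration (the bimodal target p̃, the Gaussian proposal, and the bound M = 10 are choices made here, not taken from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)

def p_tilde(x):   # unnormalized target: a bimodal density
    return np.exp(-0.5 * (x + 2)**2) + np.exp(-0.5 * (x - 2)**2)

def q_pdf(x):     # proposal density N(0, 3^2): easy to sample and evaluate
    return np.exp(-0.5 * (x / 3)**2) / (3 * np.sqrt(2 * np.pi))

M = 10.0          # chosen so that M * q(x) >= p_tilde(x) for all x

x = rng.normal(0, 3, 100_000)                 # x ~ q
u = rng.uniform(size=x.shape)                 # u ~ Uniform(0, 1)
samples = x[u <= p_tilde(x) / (M * q_pdf(x))]  # reject when u > p_tilde/(M q)

print(samples.mean())   # the target is symmetric about 0, so the mean is near 0
```

Note that only the unnormalized p̃ is needed; the acceptance rate is Z/M, so a loose bound M wastes proposals.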
Monte Carlo Method

Importance Sampling
μ = 𝔼_{p(x)}[f(x)] = ∫ f(x) p̃(x)/Z dx = (1/Z) ∫ f(x) (p̃(x)/q(x)) q(x) dx
𝔼_{p(x)}[f(x)] ≈ (1/(nZ)) Σᵢ f(xᵢ) p̃(xᵢ)/q(xᵢ) = (1/(nZ)) Σᵢ w(xᵢ) f(xᵢ), with xᵢ ~ q(x)
Z = ∫ p̃(x) dx = ∫ (p̃(x)/q(x)) q(x) dx ≈ (1/n) Σᵢ w(xᵢ)
⇒ 𝔼_{p(x)}[f(x)] ≈ Σᵢ w̃(xᵢ) f(xᵢ), where w̃(xᵢ) = w(xᵢ) / Σⱼ w(xⱼ)

i.i.d. sampling is very vulnerable in high-dimensional spaces.
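The self-normalized estimator Σᵢ w̃(xᵢ) f(xᵢ) above can be sketched directly. The target and proposal below are illustrative assumptions: a mixture of N(−2, 1) and N(2, 1), for which 𝔼[x²] = 1 + 2² = 5.

```python
import numpy as np

rng = np.random.default_rng(2)

def p_tilde(x):   # unnormalized target: mixture of N(-2, 1) and N(2, 1)
    return np.exp(-0.5 * (x + 2)**2) + np.exp(-0.5 * (x - 2)**2)

def q_pdf(x):     # proposal density N(0, 3^2)
    return np.exp(-0.5 * (x / 3)**2) / (3 * np.sqrt(2 * np.pi))

x = rng.normal(0, 3, 200_000)      # x_i ~ q
w = p_tilde(x) / q_pdf(x)          # unnormalized weights w(x_i)
w_norm = w / w.sum()               # self-normalized weights, Z cancels out

est = (w_norm * x**2).sum()        # estimate of E_p[x^2]; true value is 5
print(est)
```

Self-normalization lets us work with p̃ alone, at the cost of a small bias that vanishes as n grows; when q poorly covers p, a few huge weights dominate the sum, which is the high-dimensional failure mode noted above.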
Markov Chain Monte Carlo
• Gibbs Sampling
• Metropolis-Hastings Algorithm
Basic idea of MCMC

Markov chain: x⁽¹⁾, x⁽²⁾, … is a sequence of random variables. It forms a Markov chain if
p(x⁽ᵗ⁺¹⁾ | x⁽¹⁾, x⁽²⁾, …, x⁽ᵗ⁾) = p(x⁽ᵗ⁺¹⁾ | x⁽ᵗ⁾).

A Markov chain can be specified by
1. Initial distribution: p₁(x) = p(x⁽¹⁾)
2. Transition probability: T(x′, x) = p(x⁽ᵗ⁺¹⁾ = x′ | x⁽ᵗ⁾ = x)

Ergodicity: lim_{n→∞} Tⁿ p₁ = π, regardless of the initial distribution p₁.

1. Build a Markov chain having p(x) as an invariant distribution
2. Sample (x⁽ᵗ⁾)_{t≥1} from the chain
3. Compute 𝔼_{p(x)}[f(x)] ≈ 𝔼_{Tⁿp₁}[f(x)] ≈ (1/n) Σₜ f(x⁽ᵗ⁾)
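The three steps above, instantiated with a Metropolis-Hastings transition. The random-walk proposal and the bimodal target are illustrative assumptions; with a symmetric proposal the acceptance probability reduces to min(1, p̃(x′)/p̃(x)), so the normalizer Z never needs to be known.

```python
import numpy as np

rng = np.random.default_rng(3)

def p_tilde(x):   # unnormalized target; the normalizer Z cancels in the ratio
    return np.exp(-0.5 * (x + 2)**2) + np.exp(-0.5 * (x - 2)**2)

x, chain = 0.0, []
for t in range(100_000):
    x_prop = x + rng.normal(0, 1)   # symmetric random-walk proposal
    # Accept with probability min(1, p_tilde(x') / p_tilde(x)).
    if rng.uniform() < p_tilde(x_prop) / p_tilde(x):
        x = x_prop
    chain.append(x)

chain = np.array(chain[1000:])      # discard burn-in before averaging
print(chain.mean())                 # symmetric target, so the mean is near 0
```

The samples are correlated, not i.i.d., but by ergodicity the time average (1/n) Σₜ f(x⁽ᵗ⁾) still converges to 𝔼_{p(x)}[f(x)].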
MCMC Algorithms

Gibbs Sampling: xᵢ⁽ˢ⁺¹⁾ ~ p(xᵢ | 𝒙₋ᵢ)
Metropolis-Hastings algorithm, Hamiltonian MCMC, Sequential MCMC, Stochastic gradient MCMC, Particle MCMC
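The Gibbs update xᵢ⁽ˢ⁺¹⁾ ~ p(xᵢ | 𝒙₋ᵢ) is easiest to see where the full conditionals are in closed form. A minimal sketch for a bivariate Gaussian target with correlation ρ = 0.9 (the target and ρ are assumed here for illustration): each conditional of N(0, [[1, ρ], [ρ, 1]]) is N(ρ·x_other, 1 − ρ²).

```python
import numpy as np

rng = np.random.default_rng(4)
rho = 0.9   # correlation of the assumed target N(0, [[1, rho], [rho, 1]])

x1, x2, samples = 0.0, 0.0, []
for s in range(50_000):
    # Full conditionals of the bivariate Gaussian: x1 | x2 ~ N(rho*x2, 1 - rho^2).
    x1 = rng.normal(rho * x2, np.sqrt(1 - rho**2))
    x2 = rng.normal(rho * x1, np.sqrt(1 - rho**2))
    samples.append((x1, x2))

samples = np.array(samples[1000:])   # discard burn-in
print(np.corrcoef(samples.T)[0, 1])  # estimated correlation, near 0.9
```

Every conditional draw is accepted, but high correlation between coordinates makes the chain move in small zig-zag steps, so mixing slows as ρ → 1; this is one motivation for the joint-update methods (Hamiltonian MCMC and others) listed above.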
Thank you

2024 Q1 Tableau User Group Leader Quarterly Call
 
AI Imagen for data-storytelling Infographics.pdf
AI Imagen for data-storytelling Infographics.pdfAI Imagen for data-storytelling Infographics.pdf
AI Imagen for data-storytelling Infographics.pdf
 
MALL CUSTOMER SEGMENTATION USING K-MEANS CLUSTERING.pptx
MALL CUSTOMER SEGMENTATION USING K-MEANS CLUSTERING.pptxMALL CUSTOMER SEGMENTATION USING K-MEANS CLUSTERING.pptx
MALL CUSTOMER SEGMENTATION USING K-MEANS CLUSTERING.pptx
 
basics of data science with application areas.pdf
basics of data science with application areas.pdfbasics of data science with application areas.pdf
basics of data science with application areas.pdf
 
2024 Q2 Orange County (CA) Tableau User Group Meeting
2024 Q2 Orange County (CA) Tableau User Group Meeting2024 Q2 Orange County (CA) Tableau User Group Meeting
2024 Q2 Orange County (CA) Tableau User Group Meeting
 

Sampling Method: MCMC

  • 1. Sampling Methods for Statistical Inference 2020. 10. 19 Jinhwan Suk Department of Mathematical Science, KAIST GROOT SEMINAR GROOT AI
  • 2. Obstacles in latent modeling • Latent variable model • We need the posterior
  • 3–7. Latent Variable Model: 𝑧 → 𝒟 (a prior over the latent 𝑧, a likelihood of the data 𝒟) e.g. RBM, VAE, GAN, HMM, particle filter, … Examples: hidden Markov model, VAE
  • 8–18. Latent Variable Model. Given data 𝒟, a prior 𝑝𝜃(𝑧), and a likelihood 𝑝𝜃(𝒟 | 𝑧). Inference: 𝜃̂ = argmax𝜃 𝑝𝜃(𝒟). This requires computing 𝑝𝜃(𝒟) = ∫ 𝑝𝜃(𝑧) 𝑝𝜃(𝒟 | 𝑧) 𝑑𝑧, which is intractable! EM algorithm: log 𝑝(𝒟) = ∫ log 𝑝(𝒟) 𝑝(𝑧 | 𝒟) 𝑑𝑧 = ∫ log [𝑝(𝒟, 𝑧) / 𝑝(𝑧 | 𝒟)] 𝑝(𝑧 | 𝒟) 𝑑𝑧 = ∫ log 𝑝(𝒟, 𝑧) 𝑝(𝑧 | 𝒟) 𝑑𝑧 − ∫ log 𝑝(𝑧 | 𝒟) 𝑝(𝑧 | 𝒟) 𝑑𝑧 = 𝔼𝑝(𝑧|𝒟)[log 𝑝(𝒟, 𝑧)] + 𝐻 ≥ 𝔼𝑝(𝑧|𝒟)[log 𝑝(𝒟, 𝑧)] (E-step). But the posterior 𝑝(𝑧 | 𝒟) = 𝑝(𝒟, 𝑧) / 𝑝(𝒟) is itself intractable!
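As a concrete illustration of these E- and M-steps, here is a minimal sketch of EM for a two-component 1-D Gaussian mixture. The model, data, and all parameter values are hypothetical choices for illustration (they are not from the slides), and the component variances are fixed at 1 for brevity: the E-step computes the responsibilities 𝑝(𝑧 | 𝒟) and the M-step maximizes 𝔼𝑝(𝑧|𝒟)[log 𝑝(𝒟, 𝑧)] in closed form.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from two Gaussians (hypothetical example)
data = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 1.0, 300)])

# Initial parameters theta = (pi, mu); variances fixed at 1 for brevity
pi, mu = 0.5, np.array([-1.0, 1.0])

def normal_pdf(x, m):
    return np.exp(-0.5 * (x - m) ** 2) / np.sqrt(2 * np.pi)

for _ in range(50):
    # E-step: responsibilities p(z = k | x) for every data point
    w = np.stack([pi * normal_pdf(data, mu[0]),
                  (1 - pi) * normal_pdf(data, mu[1])])
    r = w / w.sum(axis=0)
    # M-step: maximize the expected complete-data log-likelihood in closed form
    pi = r[0].mean()
    mu = (r * data).sum(axis=1) / r.sum(axis=1)

print(pi, mu)  # pi near 0.4, mu near [-2, 3]
```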
  • 19–24. Latent Variable Model. Posterior inference: we need quantities such as 𝔼[𝑧 | 𝒟] and 𝔼[𝑓(𝑧) | 𝒟]. “Bayesian inference is all about posterior inference.” Direct computation is impossible, so we must approximate. But how? Denote the target distribution by 𝑝(𝑥).
  • 25–32. Two ways of approximation: 1. Reduce the distance between the two distributions: variational inference, an optimization problem. 2. Draw samples: the Monte Carlo method, a sampling method.
  • 34–40. Monte Carlo Method. The law of large numbers: for 𝑋1, 𝑋2, …, 𝑋𝑛 i.i.d., (𝑋1 + 𝑋2 + ⋯ + 𝑋𝑛)/𝑛 → 𝔼[𝑋𝑖]. But how do we sample from 𝑝(𝑥)? Rejection sampling: let 𝑞(𝑥) be a proposal distribution that is easy to evaluate and easy to sample from, and choose 𝑀 so that 𝑀𝑞(𝑥) ≥ 𝑝̃(𝑥), where 𝑝̃ is the unnormalized target. Draw 𝑥 ∼ 𝑞(𝑥) and 𝑢 ∼ Uniform(0, 1); if 𝑢 > 𝑝̃(𝑥)/𝑀𝑞(𝑥) we reject the sample, otherwise we accept it.
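The accept/reject rule above can be sketched in a few lines of NumPy. The target 𝑝̃(𝑥) = exp(−𝑥⁴), the standard-normal proposal, and the bound 𝑀 = 3 are illustrative choices, not from the slides (𝑀 = 3 satisfies 𝑀𝑞(𝑥) ≥ 𝑝̃(𝑥) everywhere for this pair):

```python
import numpy as np

rng = np.random.default_rng(0)

def p_tilde(x):
    """Unnormalized target density (hypothetical example)."""
    return np.exp(-x ** 4)

def q_pdf(x):
    """Proposal: standard normal, easy to evaluate and easy to sample."""
    return np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)

M = 3.0  # chosen so that M * q(x) >= p_tilde(x) for all x

x = rng.normal(size=100_000)                     # draw from the proposal
u = rng.uniform(size=x.size)                     # u ~ Uniform(0, 1)
accepted = x[u <= p_tilde(x) / (M * q_pdf(x))]   # keep iff u <= p~/(M q)

print(accepted.size, accepted.mean())  # symmetric target, so mean near 0
```

The acceptance rate equals 𝑍/𝑀, so a tighter bound 𝑀 wastes fewer proposals.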
  • 42–44. Monte Carlo Method. Importance sampling: 𝜇 = 𝔼𝑝(𝑥)[𝑓(𝑥)] = ∫ 𝑓(𝑥) 𝑝̃(𝑥)/𝑍 𝑑𝑥 = (1/𝑍) ∫ 𝑓(𝑥) [𝑝̃(𝑥)/𝑞(𝑥)] 𝑞(𝑥) 𝑑𝑥. With weights 𝑤(𝑥) = 𝑝̃(𝑥)/𝑞(𝑥), 𝔼𝑝(𝑥)[𝑓(𝑥)] ≈ (1/𝑛𝑍) Σ 𝑤(𝑥𝑖) 𝑓(𝑥𝑖), and 𝑍 = ∫ 𝑝̃(𝑥) 𝑑𝑥 = ∫ [𝑝̃(𝑥)/𝑞(𝑥)] 𝑞(𝑥) 𝑑𝑥 ≈ (1/𝑛) Σ 𝑤(𝑥𝑖). Hence 𝔼𝑝(𝑥)[𝑓(𝑥)] ≈ Σ 𝑤̃(𝑥𝑖) 𝑓(𝑥𝑖) with self-normalized weights 𝑤̃(𝑥𝑖) = 𝑤(𝑥𝑖) / Σ𝑗 𝑤(𝑥𝑗). But i.i.d. sampling is very vulnerable in high-dimensional spaces.
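A minimal self-normalized importance-sampling sketch of the estimator above. The unnormalized target proportional to 𝑁(2, 1), the proposal 𝑁(0, 3²), and 𝑓(𝑥) = 𝑥 (true answer 2) are hypothetical choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalized target p~(x) proportional to N(2, 1); proposal q = N(0, 3^2)
p_tilde = lambda x: np.exp(-0.5 * (x - 2.0) ** 2)
q_pdf = lambda x: np.exp(-0.5 * (x / 3.0) ** 2) / (3.0 * np.sqrt(2 * np.pi))

x = rng.normal(0.0, 3.0, 200_000)   # draw from the proposal
w = p_tilde(x) / q_pdf(x)           # unnormalized importance weights w(x_i)
w_norm = w / w.sum()                # self-normalized weights; Z cancels

est = np.sum(w_norm * x)            # E_p[f(x)] with f(x) = x; true value is 2
print(est)
```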
  • 45. Markov Chain Monte Carlo • Gibbs Sampling • Metropolis-Hastings Algorithm Sampling Methods for Statistical Inference
  • 46. Basic idea of MCMC. Markov chain: a sequence of random variables 𝑥(1), 𝑥(2), … forms a Markov chain if 𝑝(𝑥(𝑡+1) | 𝑥(1), 𝑥(2), …, 𝑥(𝑡)) = 𝑝(𝑥(𝑡+1) | 𝑥(𝑡)). A Markov chain is specified by 1. an initial distribution 𝑝1(𝑥) = 𝑝(𝑥(1)) and 2. a transition probability 𝑇(𝑥′, 𝑥) = 𝑝(𝑥(𝑡+1) = 𝑥′ | 𝑥(𝑡) = 𝑥). Ergodicity: lim𝑛→∞ 𝑇𝑛𝑝1 = 𝜋, regardless of the initial distribution 𝑝1. Recipe: 1. build a Markov chain having 𝑝(𝑥) as an invariant distribution, 2. sample (𝑥(𝑡))𝑡≥1 from the chain, 3. compute 𝔼𝑝(𝑥)[𝑓(𝑥)] ≈ 𝔼𝑇𝑛𝑝1(𝑥)[𝑓(𝑥)] ≈ (1/𝑛) Σ 𝑓(𝑥(𝑡)).
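The three-step recipe can be sketched with the simplest such chain: a random-walk Metropolis sampler, a special case of Metropolis-Hastings with a symmetric proposal, so the acceptance ratio reduces to 𝑝̃(𝑥′)/𝑝̃(𝑥). The standard-normal target, step size, and burn-in length are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def p_tilde(x):
    """Unnormalized target: standard normal up to a constant."""
    return np.exp(-0.5 * x ** 2)

x, chain = 0.0, []
for t in range(50_000):
    x_prop = x + rng.normal(0.0, 1.0)          # symmetric random-walk proposal
    if rng.uniform() < min(1.0, p_tilde(x_prop) / p_tilde(x)):
        x = x_prop                             # accept the move
    chain.append(x)                            # otherwise keep the current state

samples = np.array(chain[5_000:])              # discard burn-in
print(samples.mean(), samples.var())           # should approach 0 and 1
```

Note that normalizing constants cancel in the ratio, which is exactly why MCMC sidesteps the intractable 𝑝𝜃(𝒟).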
  • 47. MCMC Algorithms. Gibbs sampling: 𝑥𝑖(𝑠+1) ∼ 𝑝(𝑥𝑖 | 𝒙−𝑖). Others: the Metropolis-Hastings algorithm, Hamiltonian MCMC, sequential MCMC, stochastic-gradient MCMC, particle MCMC.
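The Gibbs update 𝑥𝑖(𝑠+1) ∼ 𝑝(𝑥𝑖 | 𝒙−𝑖) can be sketched on a hypothetical bivariate normal target with correlation 𝜌 = 0.8, where each full conditional is a 1-D Gaussian that can be sampled exactly (𝑥1 | 𝑥2 ∼ 𝑁(𝜌𝑥2, 1 − 𝜌²) and symmetrically):

```python
import numpy as np

rng = np.random.default_rng(0)

rho = 0.8
s = np.sqrt(1 - rho ** 2)   # conditional standard deviation

x1, x2 = 0.0, 0.0
draws = []
for t in range(50_000):
    x1 = rng.normal(rho * x2, s)   # x1 ~ p(x1 | x2)
    x2 = rng.normal(rho * x1, s)   # x2 ~ p(x2 | x1)
    draws.append((x1, x2))

samples = np.array(draws[5_000:])          # discard burn-in
print(samples.mean(axis=0))                # near (0, 0)
print(np.corrcoef(samples.T)[0, 1])        # near rho = 0.8
```

Each sweep updates one coordinate at a time from its full conditional, so every proposal is accepted; this is what makes Gibbs attractive when the conditionals have closed form.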
  • 48. Thank you Sampling Methods for Statistical Inference 2020. 10. 19 Jinhwan Suk Department of Mathematical Science, KAIST GROOT SEMINAR