13.
Linear Factor Models
ํ•˜์ธ์ค€, ์ด์ •ํ˜„
Generative Model
์ง€๋„ํ•™์Šต ๋น„์ง€๋„ํ•™์Šต
Generative Model
์ •๊ทœ๋ถ„ํฌ ํฌ์•„์†ก๋ถ„ํฌ ๊ฐ๋งˆ๋ถ„ํฌ
Generative Model
Generative adversarial network (GAN)
Linear Factor Model
Latent variable h
x is generated by applying a linear transformation to h and adding a bias and noise
The noise is usually Gaussian and diagonal (independent across dimensions)
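The generative recipe on this slide can be sketched directly in numpy. The dimensions, loadings `W`, bias `b`, and diagonal noise variances `psi` below are made-up toy values, not anything from the chapter; the point is only that ancestral sampling from h produces x with covariance W Wᵀ + diag(psi):

```python
import numpy as np

rng = np.random.default_rng(0)

d_h, d_x = 3, 5                      # hypothetical latent / observed dimensions
W = rng.normal(size=(d_x, d_h))      # factor loadings (linear transformation)
b = rng.normal(size=d_x)             # bias
psi = np.array([0.1, 0.2, 0.1, 0.3, 0.2])  # diagonal (independent) noise variances

def sample_x(n):
    """Ancestral sampling: h ~ N(0, I), then x = W h + b + eps."""
    h = rng.normal(size=(n, d_h))
    eps = rng.normal(size=(n, d_x)) * np.sqrt(psi)
    return h @ W.T + b + eps

X = sample_x(10000)
C_model = W @ W.T + np.diag(psi)     # covariance implied by the model
C_emp = np.cov(X, rowvar=False)      # empirical covariance of the samples
print(np.max(np.abs(C_model - C_emp)))  # small for large n
```

The empirical covariance of the sampled x matches the model-implied covariance, which is exactly the structure the linear factor models in this chapter exploit.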
Linear Factor Model
pPCA
ICA
SFA
probabilistic PCA
โ€œ์ด๋ฒˆ์— ๋ฐฐ์šธ PPCA๋ž€ ๋ง์ด์ฃ ..โ€
-PCA์žฅ์ธ
Probabilistic PCA
Eigen Decomposition / Singular Value Decomposition
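The two decompositions named here give the same PCA solution. A toy numpy check (random data, an assumed example) shows that the eigendecomposition of the sample covariance and the SVD of the centered data matrix produce the same spectrum and the same axes up to sign:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
Xc = X - X.mean(axis=0)

# Route 1: eigendecomposition of the sample covariance matrix.
C = Xc.T @ Xc / (len(Xc) - 1)
evals, evecs = np.linalg.eigh(C)            # eigh returns ascending order
evals, evecs = evals[::-1], evecs[:, ::-1]  # sort descending

# Route 2: singular value decomposition of the centered data itself.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
svd_evals = S**2 / (len(Xc) - 1)            # squared singular values / (n-1)

print(np.allclose(evals, svd_evals))             # same spectrum
print(np.allclose(np.abs(evecs), np.abs(Vt.T)))  # same axes up to sign
```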
Probabilistic PCA
Independent Component Analysis (ICA)
Slow Feature Analysis
What is the most important information?
Whatever changes the most slowly (smoothly)!
— the slowness principle
Slow Feature Analysis
λ is a hyperparameter that sets the strength of the term,
t is time,
f is the feature extractor,
L is a loss function measuring distance (MSE is commonly used).
To apply the slowness principle, the following term is added to the original model's cost function.
Slow Feature Analysis
๊ฒฐ๊ตญ ์ด๋Ÿฌํ•œ ์ตœ์ ํ™” ๋ฌธ์ œ๋ฅผ ํ’€๊ฒŒ ๋˜๋Š”๊ฒƒ!
Slow Feature Analysis
์ปดํ“จํ„ฐ์—๊ฒŒ ์ž์—ฐ๊ฒฝ๊ด€์„ ๋‹ด์€ ๋™์˜์ƒ์œผ๋กœ ํ›ˆ๋ จํ•œ ๊ฒฐ๊ณผ์™€
์ฅ์˜ ๋‡Œ์—์„œ ๋‰ด๋Ÿฐ๋“ค์ด ๋Œ€ํ‘œํ•˜๋Š” ํŠน์ง•๋“ค์ด ๊ณตํ†ต์ ์ด ๋†’๊ฒŒ ๋‚˜ํƒ€๋‚˜๋Š”๊ฒƒ์œผ๋กœ ๋ณด์•„
์ƒ๋ฌผํ•™์ ์œผ๋กœ๋„ ์–ด๋Š์ •๋„ ๊ทธ๋Ÿด๋“ฏํ•œ ๋ชจํ˜•์ธ ๊ฒƒ์œผ๋กœ ๋ณด์ž„.
ํ•˜์ง€๋งŒ ์ตœ๊ณ ์ˆ˜์ค€์˜ ์‘์šฉ์—์„œ ํ™œ์šฉ๋˜์ง€ ๋ชปํ•˜๋Š”๋ฐ,
์„ฑ๊ณผ๋ฅผ ์ œํ•œํ•˜๋Š” ์š”์ธ์ด ์–ด๋–ค๊ฒƒ์ธ์ง€ ์•„์ง ์•Œ๋ ค์ง€์ง€ ์•Š์Œ
Chapter 13 Linear Factor Models

Editor's Notes

  1. We have entered Part 3, and what follows is very different from everything before. The methods covered so far mostly solve supervised learning problems: given enough labeled examples, a model learns from them. In the real world, however, we sometimes have to handle missing values or work with unlabeled data. A frequently cited limitation of current deep learning is its need for large amounts of training data, and unsupervised and semi-supervised learning are needed, to some extent, for such goals. We also study several ways of statistically approximating computations that are intractable or too difficult, in order to reduce excessive computational burden. Most AI experts reportedly predict that future AI technology will be led by unsupervised, not supervised, learning.
  2. ์ด ์ฑ…์—์„œ ๋‹ค๋ฃจ๋Š” ๋น„์ง€๋„ ํ•™์Šต์€ ์› ๋ฐ์ดํ„ฐ๊ฐ€ ๊ฐ€์ง€๊ณ  ์žˆ๋Š” ํ™•๋ฅ ๋ถ„ํฌ๋ฅผ ์ถ”์ •ํ•˜๋„๋ก ํ•˜๊ณ , ์ธ๊ณต์‹ ๊ฒฝ๋ง์ด ๊ทธ ๋ถ„ํฌ๋ฅผ ๋งŒ๋“ค์–ด ๋‚ผ ์ˆ˜ ์žˆ๋„๋ก ํ•œ๋‹ค๋Š” ์ ์—์„œ ๋‹จ์ˆœํ•œ ๊ตฐ์ง‘ํ™” ๊ธฐ๋ฐ˜์˜ ๋น„์ง€๋„ํ•™์Šต๊ณผ ์ฐจ์ด๊ฐ€ ์žˆ์Šต๋‹ˆ๋‹ค. ์ผ๋‹จ, GM์„ ์ดํ•ดํ•˜๊ธฐ ์œ„ํ•ด์„œ๋Š” ํ™•๋ฅ ๋ถ„ํฌ์˜ ๊ฐœ๋…์„ ํ™•์‹คํžˆ ์•Œ๊ณ  ๋„˜์–ด๊ฐ€์•ผ ํ•˜๋Š”๋ฐ, ๊ทธ ์ด์œ ๋Š” ์šฐ๋ฆฌ๊ฐ€ GM์—์„œ ๋‹ค๋ฃจ๊ณ ์ž ํ•˜๋Š” ๋ชจ๋“  ๋ฐ์ดํ„ฐ๋Š” ํ™•๋ฅ ๋ถ„ํฌ๋ฅผ ๊ฐ€์ง€๊ณ  ์žˆ๋Š” ๋žœ๋ค๋ณ€์ˆ˜(Random Variable)์ด๊ธฐ ๋•Œ๋ฌธ์ž…๋‹ˆ๋‹ค. ๊ฐ€๋ น 2์ฐจ ๋ฐฉ์ •์‹์—์„œ ๋ฏธ์ง€ ์ˆ˜ X๋ฅผ ๋ณ€์ˆ˜๋ผ ๋ถ€๋ฅด๊ณ , ์ด๋ฅผ ๋Œ€์ž…ํ•ด ๋ฐฉ์ •์‹์„ ํ’€๋ฉด ๋ฏธ์ง€์ˆ˜ X๋Š” ํŠน์ •ํ•œ ์ˆ˜๊ฐ€ ๋ฉ๋‹ˆ๋‹ค. ๊ทธ๋Ÿฌ๋‚˜, ๋žœ๋ค๋ณ€์ˆ˜๋Š” ์ธก์ •ํ•  ๋•Œ๋งˆ๋‹ค ๋‹ค๋ฅธ ๊ฐ’์ด ๋‚˜์˜ต๋‹ˆ๋‹ค. ํ•˜์ง€๋งŒ, ํŠน์ •ํ•œ ํ™•๋ฅ ๋ถ„ํฌ๋ฅผ ๋”ฐ๋ฅด๋Š” ์ˆซ์ž๋ฅผ ์ƒ์„ฑํ•˜๋ฏ€๋กœ, ๋žœ๋ค๋ณ€์ˆ˜์— ๋Œ€ํ•œ ํ™•๋ฅ ๋ถ„ํฌ๋ฅผ ์•ˆ๋‹ค๋Š” ์ด์•ผ๊ธฐ๋Š” ๋žœ๋ค๋ณ€์ˆ˜ ์ฆ‰ ๋ฐ์ดํ„ฐ์— ๋Œ€ํ•œ ์ „๋ถ€๋ฅผ ์ดํ•ดํ•˜๊ณ  ์žˆ๋‹ค๋Š” ๊ฒƒ๊ณผ ๊ฐ™์Šต๋‹ˆ๋‹ค.ย  ์˜ˆ๋ฅผ ๋“ค์–ด, ํ™•๋ฅ ๋ถ„ํฌ๋ฅผ ์•Œ๋ฉด ๊ทธ ๋ฐ์ดํ„ฐ์˜ ์˜ˆ์ธก ๊ธฐ๋Œ“๊ฐ’, ๋ฐ์ดํ„ฐ์˜ ๋ถ„์‚ฐ์„ ์ฆ‰๊ฐ ์•Œ์•„๋‚ผ ์ˆ˜ ์žˆ์–ด ๋ฐ์ดํ„ฐ์˜ ํ†ต๊ณ„์  ํŠน์„ฑ์„ ๋ฐ”๋กœ ๋ถ„์„ํ•  ์ˆ˜ ์žˆ์œผ๋ฉฐ, ์ฃผ์–ด์ง„ ํ™•๋ฅ ๋ถ„ํฌ๋ฅผ ๋”ฐ๋ฅด๋„๋ก ๋ฐ์ดํ„ฐ๋ฅผ ์ž„์˜ ์ƒ์„ฑํ•˜๋ฉด ๊ทธ ๋ฐ์ดํ„ฐ๋Š” ํ™•๋ฅ ๋ถ„ํฌ๋ฅผ ๊ตฌํ•  ๋•Œ ์‚ฌ์šฉํ•œ ์› ๋ฐ์ดํ„ฐ์™€ ์œ ์‚ฌํ•œ ๊ฐ’์„ ๊ฐ€์ง‘๋‹ˆ๋‹ค. ์ฆ‰, ๋น„์ง€๋„ํ•™์Šต์ด ๊ฐ€๋Šฅํ•œ ๋จธ์‹ ๋Ÿฌ๋‹ ์•Œ๊ณ ๋ฆฌ์ฆ˜์œผ๋กœ ๋ฐ์ดํ„ฐ์— ๋Œ€ํ•œ ํ™•๋ฅ ๋ถ„ํฌ๋ฅผ ๋ชจ๋ธ๋ง ํ•  ์ˆ˜ ์žˆ๊ฒŒ ๋˜๋ฉด, ์› ๋ฐ์ดํ„ฐ์™€ ํ™•๋ฅ ๋ถ„ํฌ๋ฅผ ์ •ํ™•ํžˆ ๊ณต์œ ํ•˜๋Š” ๋ฌดํ•œํžˆ ๋งŽ์€ ์ƒˆ๋กœ์šด ๋ฐ์ดํ„ฐ๋ฅผ ์ƒˆ๋กœ ์ƒ์„ฑํ•  ์ˆ˜ ์žˆ์Œ์„ ์˜๋ฏธํ•ฉ๋‹ˆ๋‹ค.
  3. GAN์€ 2014๋…„ NIPS์—์„œ Ian Goodfellow๊ฐ€ ๋ฐœํ‘œํ•œ ํšŒ๊ท€์ƒ์„ฑ ๋ชจ๋ธ๋กœ์„œ ๋ถ„๋ฅ˜๋ฅผ ๋‹ด๋‹นํ•˜๋Š” ๋ชจ๋ธ(ํŒ๋ณ„์ž D)๊ณผ ํšŒ๊ท€์ƒ์„ฑ์„ ๋‹ด๋‹นํ•˜๋Š” ๋‘ ๊ฐœ์˜ ๋ชจ๋ธ(์ƒ์„ฑ์ž G)๋กœ ๊ตฌ์„ฑ๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค. ๋‘ ๋ชจ๋ธ์€ GAN์ด๋ž€ ์ด๋ฆ„์—์„œ ์‰ฝ๊ฒŒ ์•Œ ์ˆ˜ ์žˆ๋“ฏ์ด, ์ƒ์„ฑ์ž G์™€ ํŒ๋ณ„์ž D๊ฐ€ ์„œ๋กœ์˜ ์„ฑ๋Šฅ์„ ๊ฐœ์„ ํ•ด ์ ๋Œ€์ ์œผ๋กœ ๊ฒฝ์Ÿํ•ด ๋‚˜๊ฐ€๋Š” ๋ชจ๋ธ์ž…๋‹ˆ๋‹ค. ์‰ฝ๊ฒŒ ๋งํ•ด ๊ฒฝ์ฐฐ๊ณผ ์ง€ํ ์œ„์กฐ๋ฒ”์˜ ๋Œ€๋ฆฝ๊ณผ ๊ฐ™์€ ๋ฐฉ์‹์œผ๋กœ ์ดํ•ดํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค. ์ง€ํ ์œ„์กฐ๋ฒ”(์ƒ์„ฑ์ž G)์€ ๊ฒฝ์ฐฐ(๋ถ„๋ฅ˜์ž D)์„ ์ตœ๋Œ€ํ•œ ์—ด์‹ฌํžˆ ์†์ด๋ ค๊ณ  ํ•˜๊ณ , ๋‹ค๋ฅธ ํ•œํŽธ์—์„œ๋Š” ๊ฒฝ์ฐฐ์€ ์ด๋ ‡๊ฒŒ ์œ„์กฐ๋œ ์ง€ํ์™€ ์ง„์งœ ์ง€ํ๋ฅผ ๋‘๊ณ  ๋ถ„๋ฅ˜ํ•˜๊ธฐ ์œ„ํ•ด ๋…ธ๋ ฅํ•ฉ๋‹ˆ๋‹ค. ์ด๋Ÿฌํ•œ ๊ฒฝ์Ÿ์ด ์ง€์†์ ์œผ๋กœ ํ•™์Šต๋˜๋ฉด ๊ฒฐ๊ณผ์ ์œผ๋กœ๋Š” ์ง„์งœ ์ง€ํ์™€ ์œ„์กฐ์ง€ํ๋ฅผ ๊ตฌ๋ณ„ํ•  ์ˆ˜ ์—†์„ ์ •๋„์˜ ์ƒํƒœ๊ฐ€ ๋˜๋ฉฐ, ์ง„์งœ์™€ ๊ฑฐ์˜ ์ฐจ์ด๊ฐ€ ์—†๋Š” ๊ฐ€์งœ ์ง€ํ๋ฅผ ๋งŒ๋“ค์–ด ๋‚ผ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค. ์ˆ˜ํ•™์ ์œผ๋กœ ์ƒ์„ฑ์ž G๋Š” ์•ž์—์„œ ๋งํ•œ ์› ๋ฐ์ดํ„ฐ์˜ ํ™•๋ฅ ๋ถ„ํฌ๋ฅผ ์•Œ์•„๋‚ด๋ ค๊ณ  ๋…ธ๋ ฅํ•˜๋ฉฐ, ํ•™์Šต์ด ์ข…๋ฃŒ๋œ ํ›„์—๋Š” ์› ๋ฐ์ดํ„ฐ์˜ ํ™•๋ฅ ๋ถ„ํฌ๋ฅผ ๋”ฐ๋ฅด๋Š” ์ƒˆ๋กœ์šด ๋ฐ์ดํ„ฐ๋ฅผ ๋งŒ๋“ค์–ด ๋‚ด๊ฒŒ ๋ฉ๋‹ˆ๋‹ค. ์œ„์˜ (a)~(d) ๊ทธ๋ž˜ํ”„์—์„œ ์› ๋ฐ์ดํ„ฐ์˜ ํ™•๋ฅ ๋ถ„ํฌ๊ฐ€ ํ•™์Šต์ด ๊ฑฐ๋“ญ ์ง„ํ–‰๋จ์— ๋”ฐ๋ผ GAN์ด ๋งŒ๋“ค์–ด ๋‚ด๋Š” ํ™•๋ฅ ๋ถ„ํฌ์™€ ๊ฑฐ์˜ ๋™์ผํ•ด ์ง์„ ๋ณผ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค. ์ด๋ ‡๊ฒŒ ๋˜๋ฉด ํŒŒ๋ž€์ƒ‰ ์ ์„ ์ธ ๋ถ„๋ฅ˜์ž D๋Š” ๋” ์ด์ƒ ๋ถ„๋ฅ˜๋ฅผ ํ•ด๋„ ์˜๋ฏธ๊ฐ€ ์—†๋Š” 0.5๋ผ๋Š” ํ™•๋ฅ  ๊ฐ’์„ ๋ฑ‰์–ด๋‚ด๊ฒŒ ๋˜์ฃ . ์ด๊ฒƒ์€ ๋™์ „์„ ๋˜์ ธ์„œ ์•ž๋ฉด์„ ์ง„์‹ค, ๋’ท๋ฉด์„ ๊ฑฐ์ง“์ด๋ผ๊ณ  ํ–ˆ์„ ๋•Œ, ์ง„์‹ค์„ ๋งž์ถœ ํ™•๋ฅ ์ด 0.5๊ฐ€ ๋˜๋Š” ๊ฒƒ์ฒ˜๋Ÿผ GAN์— ์˜ํ•ด ๋งŒ๋“ค์–ด์ง„ ๋ฐ์ดํ„ฐ๊ฐ€ ์ง„์งœ ์ธ์ง€ ๊ฐ€์งœ์ธ์ง€ ๋งž์ถœ ํ™•๋ฅ ์ด 0.5๊ฐ€ ๋˜๋ฉด์„œ ๋ถ„๋ฅ˜์ž๊ฐ€ ์˜๋ฏธ ์—†๊ฒŒ ๋˜๋Š” ๊ฒ๋‹ˆ๋‹ค. ๊ฒฐ๋ก ์ ์œผ๋กœ ์ƒ์„ฑ์ž G๊ฐ€ ์‹ค์ œ ๋ฐ์ดํ„ฐ์™€ ๊ฑฐ์˜ ์œ ์‚ฌํ•œ ๋ฐ์ดํ„ฐ๋ฅผ ๋งŒ๋“ค์–ด ๋‚ผ ์ˆ˜ ์žˆ๋Š” ์ƒํ™ฉ์ด ๋˜์—ˆ์Œ์„ ์˜๋ฏธํ•˜์ฃ .
  4. ๊ธฐ์กด์—๋Š” x์™€ y๋“ค์„ ๊ด€์ฐฐํ•˜๊ณ  ์ƒˆ๋กœ์šด x๊ฐ€ ๋“ค์–ด์™”์„๋•Œ ์ƒˆ๋กœ์šด y๋ฅผ ์˜ˆ์ธก. Latent variable์€ ์ง์ ‘ ๊ด€์ฐฐํ•˜๋Š”๊ฒƒ์ด ์•„๋‹ˆ๋ผ ๋ฐ์ดํ„ฐ๋ฅผ ํ†ตํ•ด์„œ ์œ ์ถ”ํ•˜๋Š”๊ฒƒ! ์ด์ „๊นŒ์ง€ ๋ฐฐ์šด๋‚ด์šฉ์€ ๋ฐ์ดํ„ฐ๊ฐ€ ์žˆ๊ณ , ์ด๋ฏธ์ง€์—์„œ ์–ด๋–ค ์ •๋ณด๋ฅผ ์ถ”์ถœํ•˜๊ฑฐ๋‚˜, ๋ฌธ์žฅ์—์„œ ์ •๋ณด๋ฅผ ์ถ”์ถœํ•˜๊ฑฐ๋‚˜ ์ด๋Ÿฐ๊ฒƒ์ด์—ˆ๋Š”๋ฐ, ์ด์ œ๋Š” ์ด ๋ฐ์ดํ„ฐ๊ฐ€ ์–ด๋–ป๊ฒŒ ์ƒ์„ฑ๋˜์—ˆ๋Š”์ง€์— ๋Œ€ํ•ด ์งˆ๋ฌธ์„ ํ•˜๋Š”๊ฒƒ. โ€œ์ด ๋ฌธ์žฅ์€ ์–ด๋–ป๊ฒŒ ์ƒ์„ฑ๋œ ๊ฒƒ์ผ๊นŒ?โ€ -์— ๋Œ€ํ•ด ์ž ์žฌ๋ณ€์ˆ˜๊ฐ€ ์ž‘์šฉํ•œ๋‹ค๊ณ  ๊ฐ€์ •ํ•˜๋Š”๊ฒƒ(์–ธ์ œ๋‚˜ ๊ทธ๋Ÿฐ๊ฒƒ์€ ์•„๋‹ˆ์ง€๋งŒ)
  5. The concept of latent variables is used across a vast range of fields. As a simple example, suppose an economics study wants to measure happiness. Asking people to score it directly would mix in a great deal of noise and other confounding factors. So recent studies instead survey the factors that affect happiness and infer the degree of happiness from the measured data. In this case happiness can be viewed as a latent variable: it cannot be measured directly, but it can be inferred from measured data.
  6. ์ด๋Ÿฐ ์ž ์žฌ๋ณ€์ˆ˜๋“ค์„ ํ™œ์šฉํ•˜๋Š” ์ˆ˜๋งŽ์€ ํ™•๋ฅ ์ ์ƒ์„ฑ๋ชจํ˜•๋“ค์ด ์žˆ์œผ๋ฉฐ linear Factor Model์€ ๊ฐ€์žฅ ๊ธฐ์ดˆ์ ์ธ ์˜ˆ. ๋ชจ๋“  ๋ฐ์ดํ„ฐ x ๊ฐ€ ์ž ์žฌ๋ณ€์ˆ˜ h์˜ ์„ ํ˜•๋ณ€ํ™˜์œผ๋กœ ์ด๋ฃจ์–ด์กŒ๋‹ค๋Š” ๊ฐ€์ •์„ ๊ธฐ๋ฐ˜์œผ๋กœ ํ•œ๋‹ค. ๊ฒฐ๊ตญ ์ด ์ž ์žฌ๋ณ€์ˆ˜๋“ค์„ ์–ด๋–ป๊ฒŒ ์ฐพ์•„๋‚ผ๊ฒƒ์ธ๊ฐ€๊ฐ€ ๊ทผ๋ณธ์ ์ธ ๋ชฉ์ 
  7. From here on we study deep learning's unsupervised models in earnest. There are many linear factor models, but in this session we look at these three. The difference between them, briefly, is how each assumes the noise and the prior distribution of the latent variables; different assumptions lead to different results.
  8. ๋”ฐ๋ผ์„œ ์‹œ๊ทธ๋งˆ๊ฐ€ 0์ด๋˜๋ฉด ์ผ๋ฐ˜์ ์ธ PCA๊ฐ€ ๋œ๋‹ค.
  9. ๋”ฐ๋ผ์„œ ์‹œ๊ทธ๋งˆ๊ฐ€ 0์ด๋˜๋ฉด ์ผ๋ฐ˜์ ์ธ PCA๊ฐ€ ๋œ๋‹ค.
  10. The EM algorithm alternates between an expectation (E) step, which computes the expected log likelihood using the current parameter estimates, and a maximization (M) step, which finds the parameter estimates that maximize this expectation. The parameter values computed in the maximization step are then used as the estimates in the next expectation step.
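The alternation described above, specialized to probabilistic PCA, can be sketched in numpy. The toy data and initialization below are assumptions; the updates follow the standard pPCA EM derivation (posterior moments of h in the E step, closed-form W and σ² in the M step):

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy pPCA data: x = W h + noise, noise std 0.1 (assumed setup).
n, d, q = 2000, 4, 2
W_true = rng.normal(size=(d, q))
X = rng.normal(size=(n, q)) @ W_true.T + 0.1 * rng.normal(size=(n, d))
X = X - X.mean(axis=0)

W = rng.normal(size=(d, q))   # random initial parameters
sigma2 = 1.0
for _ in range(200):
    # E step: posterior moments of h given x under the current parameters.
    M = W.T @ W + sigma2 * np.eye(q)
    Minv = np.linalg.inv(M)
    Eh = X @ W @ Minv                    # posterior means E[h | x]
    S = n * sigma2 * Minv + Eh.T @ Eh    # sum over n of E[h h^T | x]

    # M step: parameters maximizing the expected complete-data log likelihood.
    W = X.T @ Eh @ np.linalg.inv(S)
    sigma2 = (np.sum(X**2) - np.sum(W * (X.T @ Eh))) / (n * d)

print(sigma2)  # converges toward the true noise variance 0.01
```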
  11. ๋Œ€๋ถ€๋ถ„์˜ ๊ฒฝ์šฐ ์ฐจ์› ์ถ•์†Œ๋ฅผ ํ™œ์šฉํ•˜์—ฌ ์ค„์ด๊ณ ์ž ํ•˜๋Š” ๋ชฉํ‘œ์ฐจ์› K๊ฐ€ ๋ฐ์ดํ„ฐ์˜ ๊ฐฏ์ˆ˜ N๋ณด๋‹ค ์ž‘๊ธฐ๋•Œ๋ฌธ์— ํ›จ์”ฌ ์ ์€ ๊ณ„์‚ฐ์œผ๋กœ ๊ตฌํ•˜๋Š”๊ฒŒ ๊ฐ€๋Šฅ
  12. A noise term following a normal distribution with variance σ² is introduced so the model can be rewritten in a form that admits the EM algorithm, and the EM algorithm is then used to find an approximate answer while reducing the computation as much as possible.
  13. Multi-dimensional data can be used as-is, but projecting it onto some axes can change its properties. Matching the number of projection axes to the original dimensionality keeps the projected data's dimension the same as before, while using fewer axes gives a compression effect. Projection Pursuit is a method for choosing the axes so that the projected values have meaningful properties. In Projection Pursuit, "meaningful" means the projected values form a distribution as far from Gaussian as possible; that is, the axes are chosen in the direction of highest non-gaussianity. Principal Component Analysis (PCA) selects some of the mutually orthogonal unit eigenvectors, in order of eigenvalue size, to compress the data into fewer dimensions than it originally had. Independent Component Analysis (ICA) goes further: it removes not only the data's correlations but also higher-order dependencies, transforming the relationships between dimensions so they become independent. ICA is traditionally associated with the cocktail-party problem: at a party we hear many people, music, and objects at once, but to hold a conversation we must correctly recognize our partner's voice. Each independent sound corresponds to a source in ICA.
Recovering the original sources from only the mixed signals (mixtures), without knowing the sources or how they were mixed, is called Blind Source Separation (BSS). // The observed mixed signals are separated back into the individual independent signals.
  14. We assume from the start that the original signals are perfectly independent (which fits situations like a cocktail party very well) and separate the observed mixed data back into the independent signals.
  15. ICA is easiest to understand in relation to PCA. As shown above, PCA finds axes along which the factors are uncorrelated, but uncorrelated does not guarantee statistically independent (that is, if X and Y are independent then cov(X, Y) = 0, but the converse does not hold in general; it does hold when X and Y are Gaussian). ICA, as its name says, is an algorithm whose goal is to find statistically independent components.
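The "uncorrelated does not imply independent" point can be demonstrated with the classic example X uniform on [−1, 1] and Y = X², a made-up illustration rather than anything from the slides: cov(X, Y) vanishes by symmetry, yet Y is a deterministic function of X:

```python
import numpy as np

rng = np.random.default_rng(6)

x = rng.uniform(-1, 1, 100000)
y = x ** 2

cov = np.cov(x, y)[0, 1]   # essentially zero: odd moments of x cancel
print(cov)

# Yet y is completely determined by x: it tracks |x| almost perfectly,
# so x and y are clearly dependent despite being uncorrelated.
print(np.corrcoef(np.abs(x), y)[0, 1])
```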
  16. One field where ICA is applied is EEG (electroencephalogram) signals, the potentials measured by electrodes on the surface of the head. The signals produced by brain activity are measured mixed with signals from other sources, i.e., noise, so applying ICA can recover the brain waves of interest.
  17. When a zebra passes through a video, the value of each pixel flips between white and black very quickly, but the feature expressing "a zebra is present in the frame" persists for a relatively long time. That is, some high-level features change relatively slowly.
  18. This time we do not assume a prior distribution, but we do start from the prior principle that "what changes most slowly is the important information."
  19. We find the features whose successive differences are smallest.
  20. We have entered Part 3, and what follows differs greatly from everything before. The methods learned so far mostly solved supervised problems: given enough labeled examples, a model learned from them. In the real world, however, we sometimes have to handle missing values or work with unlabeled data. A frequently cited limitation of current deep learning is its need for large amounts of training data, and unsupervised and semi-supervised learning are needed, to some extent, for such goals. We also study ways of statistically approximating computations that are intractable or too difficult, in order to reduce the computational burden. Now we ask how the data itself was generated: "How was this sentence generated?" We assume that latent variables are at work behind it (though not always).