An Efficient Explorative Sampling
Considering the Generative Boundaries
of Deep Generative Neural Networks
Giyoung Jeon1*, Haedong Jeong1* and Jaesik Choi2
Statistical Artificial Intelligence Laboratory of
1Ulsan National Institute of Science and Technology (UNIST) and
2Korea Advanced Institute of Science and Technology (KAIST)
*Equal contribution
Motivation
• Generative Adversarial Networks (GANs) generate high-quality, diverse
images in many domains.
[Figure: GAN pipeline — latent vector 𝑍 ∼ 𝑁(0, 𝐼) → Generator 𝐺(𝑍), progressively grown from 4×4 to 1024×1024 → Discriminator 𝐷(𝐺(𝑍)) → discriminative value (T/F); samples from the LSUN and CelebA datasets]
T. Karras, et al., "Progressive growing of GANs for improved quality, stability, and variation", ICLR, 2018.
Motivation
• The generative process of GANs is not yet well understood.
• We aim to provide example-based explanations of the generative process.
[Figure: latent vector 𝑍 passes through the generator from 4×4 up to 1024×1024; the ℓ-th layer representation is ℎℓ ∈ ℝ^(16×16×512); output samples from the LSUN and CelebA datasets]
Previous work: analyzing the inside of deep neural networks
Google Deep Dream (Mordvintsev et al., 2015)
GAN dissection (Bau et al., 2019)
D. Bau et al., "Network Dissection: Quantifying Interpretability of Deep Visual Representations." CVPR, 2017.
D. Bau et al., "GAN Dissection: Visualizing and Understanding Generative Adversarial Networks”, ICLR, 2019.
A. Mordvintsev et al., "Inceptionism: Going deeper into neural networks”, 2015.
K. Dvijotham et al., "A Dual Approach to Scalable Verification of Deep Networks." UAI, 2018.
[Figure panels: Network dissection (Bau et al., 2017) — unit-level interpretations (Unit 1: lamp, Unit 4: car); Lagrangian relaxed decision boundary (Dvijotham et al., 2018) — input perturbations 𝑥0 ∈ 𝑆 propagated via interval bound propagation, refined using cutting planes from the dual 𝝀; naïve bounds would fail to certify robustness against the decision boundary 𝒄^𝑻𝒙_𝑲 + 𝒅]
Definitions
• Generator 𝐺(𝑍): the image generated from latent vector 𝑍
• Hidden nodes ℎℓ: the neural representation at the ℓ-th layer
• Partial generation 𝑔_{j:i}: ℝ^|h_i| → ℝ^|h_j|: the generative function from layer 𝑖 to layer 𝑗
[Figure: 𝑍 → 𝑔_{4:1} → ℎ4 ∈ ℝ^(16×16×512) (4-th layer) → 𝑔_{L:5} → 𝐺(𝑍)]
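The partial generations compose: applying 𝑔_{j:i} to the output of 𝑔_{i−1:1} recovers the full generator. A minimal numpy sketch with a toy fully connected ReLU generator (the layer sizes, weights, and `g_partial` helper are illustrative stand-ins, not PGGAN):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy generator: a stack of ReLU layers (random weights as stand-ins).
SIZES = [8, 16, 16, 4]  # |Z| = 8, output dimension = 4
WEIGHTS = [rng.standard_normal((SIZES[k], SIZES[k + 1]))
           for k in range(len(SIZES) - 1)]

def g_partial(h, i, j):
    """Partial generation g_{j:i}: map layer-i input to layer-j activations."""
    for k in range(i - 1, j):  # layers are 1-indexed as in the slides
        h = np.maximum(h @ WEIGHTS[k], 0.0)  # linear map + ReLU
    return h

z = rng.standard_normal(SIZES[0])
h2 = g_partial(z, 1, 2)          # g_{2:1}(z): first two layers
out_split = g_partial(h2, 3, 3)  # g_{3:3}(h2): remaining layer
out_full = g_partial(z, 1, 3)    # G(z) = g_{3:1}(z)
assert np.allclose(out_split, out_full)  # composition recovers full generation
```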
Generative Boundary
– The value of ℎℓ is determined by linear hyperplanes in the space of the previous layer, ℎℓ−1.
– Stacking layers toward the input makes the boundary highly non-linear and non-convex.
• We only want to examine feasible regions, constructed from the input through the target layer.
– In GANs, the generator is trained to fool the discriminator.
[Figure: 𝑍 → 𝑔_{ℓ:1} → ℎℓ → 𝑔_{L:ℓ+1}; in the (non-linear) space of the previous layer, the 𝑖-th boundary is the set where 𝑔ℓ^𝑖(ℎℓ−1) = 0, with the node value ℎℓ^𝑖 = 𝑔ℓ^𝑖(ℎℓ−1) positive on one side and negative on the other]
The 𝑖-th generative boundary at layer ℓ, pulled back to the latent space:
𝐵ℓ^𝑖 = { 𝑧 ∈ ℝ^|𝑍| | 𝑔_{ℓ:1}^𝑖(𝑧) = 0 }
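For ReLU-style units, which side of the 𝑖-th boundary a point falls on is just the sign of the 𝑖-th pre-activation. A minimal numpy sketch (the single toy layer, weights, and helper names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.standard_normal((8, 16))  # toy layer l: h_{l-1} in R^8 -> h_l in R^16

def boundary_side(h_prev, i):
    """Which side of the generative boundary g_l^i(h_{l-1}) = 0 the point
    h_{l-1} lies on: +1 (node active) or -1 (node inactive)."""
    return 1 if h_prev @ W[:, i] > 0 else -1

def boundary_pattern(h_prev):
    """Sign pattern over all nodes of the layer; two latents inside the same
    generative region S_l share the same pattern."""
    return tuple(int(s) for s in np.sign(h_prev @ W))

h = rng.standard_normal(8)
assert boundary_side(h, 0) in (-1, 1)
```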
Generative Region
– In the ℓ-th layer, the space 𝑺ℓ enclosed by a set of generative boundaries.
– In the input space, the equivalence class of latent vectors 𝑍 w.r.t. 𝑺ℓ.
– In the image space, the equivalence class of images w.r.t. 𝑺ℓ.
[Figure: 𝑍 → 𝑔_{ℓ:1} → ℎℓ → 𝑔_{L:ℓ+1}; the space of the previous layer is partitioned by the boundaries into regions of sign patterns (e.g., + − − + − + + − − + vs. + + − + − + + − − +); latents in regions A and B with different patterns generate different images]
Problem Definition: Explorative sampling in a generative region
• Given: A GAN model (𝐺), a target layer (ℓ), and an input query (𝑧0)
• Goal: find the equivalence class of images generated from the same
generative region (𝑺ℓ) as the query.
[Figure: query 𝑧0 → 𝑔_{ℓ:1} → ℎℓ → 𝑔_{L:ℓ+1}; the query's sign pattern (+ − − + − + + − − +) identifies its generative region in the space of the previous layer]
Challenges of Sampling in a Generative Region
• The dimensionality of the latent space and the large number of hyperplanes
are hard to handle in practice (e.g., the 4th layer of PGGAN: ℝ^512 → ℝ^8192).
• The generative region is typically non-convex at higher layers due to
non-linear activations.
Small 𝜖-based sampling
• Every sample lies inside the region
• But blind (unexplored) sub-regions remain
Large 𝜖-based spherical sampling
• Covers the whole region
• But may produce out-of-region samples
[Figure: the two sampling strategies illustrated in latent space]
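This trade-off can be sketched numerically with a toy boundary layer (the `pattern` and `eps_ball_samples` helpers and all sizes are hypothetical): a small 𝜖 rarely leaves the query's region but explores little, while a large 𝜖 leaks out.

```python
import numpy as np

rng = np.random.default_rng(2)
W = rng.standard_normal((8, 16))  # toy layer whose columns define boundaries

def pattern(z):
    """Boolean activation pattern identifying z's generative region."""
    return z @ W >= 0

def eps_ball_samples(z0, eps, n):
    """Naive epsilon-ball sampling around the query z0 (uniform in the ball)."""
    d = rng.standard_normal((n, z0.size))
    d /= np.linalg.norm(d, axis=1, keepdims=True)
    r = eps * rng.random(n) ** (1.0 / z0.size)
    return z0 + d * r[:, None]

z0 = rng.standard_normal(8)
p0 = pattern(z0)
small = eps_ball_samples(z0, 1e-4, 200)
large = eps_ball_samples(z0, 5.0, 200)
frac_small = np.mean([np.array_equal(pattern(z), p0) for z in small])
frac_large = np.mean([np.array_equal(pattern(z), p0) for z in large])
# Typically almost all small-eps samples stay in-region; large-eps ones leak.
assert frac_small >= frac_large
```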
Reduction to the Robot Planning Problem
Robot planning problem:
• Searching a path in a non-convex space
• High degrees of freedom of the robot joints
Exploring a generative region (𝑍 ∈ ℝ^512, 𝑆ℓ):
• Searching samples in a non-convex space
• High-dimensional exploration space
We reduce our sampling problem to a robot-planning problem.
Generative Boundary constrained
Rapidly-exploring Random Tree (RRT)
• Given the generative boundaries as constraints,
RRT provides a way to search over the generative region.
• This explorative sampling guarantees that every accepted sample lies inside the region.
LaValle, Steven M. “Rapidly-exploring random trees: A new tool for path planning”. Technical Report. Computer Science Department, Iowa State University. 1998.
[Figure: RRT algorithm; illustrative example; example in a non-convex region]
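A minimal sketch of the boundary-constrained RRT loop, assuming a toy region-membership test via activation sign patterns (the step size, sampling radius, and helper names are illustrative, not the paper's exact procedure):

```python
import numpy as np

rng = np.random.default_rng(3)
W = rng.standard_normal((8, 16))  # toy boundaries

def same_region(z, z0):
    """True iff z shares the query's activation pattern (same region)."""
    return np.array_equal(z @ W >= 0, z0 @ W >= 0)

def gb_rrt(z0, n_iter=500, step=0.2, radius=3.0):
    """Generative-boundary-constrained RRT: steer the nearest tree node toward
    a random target, keeping only steps that stay in z0's generative region."""
    tree = [z0]
    for _ in range(n_iter):
        target = z0 + radius * rng.standard_normal(z0.size)
        nearest = min(tree, key=lambda v: np.linalg.norm(v - target))
        direction = target - nearest
        new = nearest + step * direction / (np.linalg.norm(direction) + 1e-12)
        if same_region(new, z0):  # reject steps that cross a boundary
            tree.append(new)
    return np.array(tree)

z0 = rng.standard_normal(8)
tree = gb_rrt(z0)
assert all(same_region(v, z0) for v in tree)  # acceptance is guaranteed
```

By construction, every node ever added to the tree passes the region check, which is the acceptance guarantee stated above.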
Smallest Supporting Generative Boundary Set
• Using all the boundaries makes the constraints too tight and
computationally expensive.
• We observe that not all boundaries affect the output equally.
[Figure: latent vector 𝑧0 → 𝑔_{ℓ:1} → ℎℓ → 𝑔_{L:ℓ+1}; applying a binary mask, ℎℓ ⊙ 𝑚, disregards the values of relaxed boundaries while keeping the supporting ones]
Smallest Supporting Generative Boundary Set
• We apply Bernoulli mask optimization to relax boundaries while
maintaining the output.
[Figure: outputs using the entire boundary set vs. 10% vs. 5% of the boundaries]
Chang, Chun-Hao, et al. "Explaining image classifiers by adaptive dropout and generative in-filling." International Conference on Learning Representations (ICLR). 2018.
𝜃* = argmin_𝜃 ℒ(𝑧0, ℓ, 𝜃)
   = argmin_𝜃 ‖𝑔_{L:ℓ+1}(𝑔_{ℓ:1}(𝑧0) ⊙ 𝑚) − 𝐺(𝑧0)‖ + 𝜆‖𝜃‖₁, where 𝑚 ∼ Ber(𝜃)
(first term: masked image reconstruction error; second term: mask ℓ1 regularizer)
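The objective can be sketched with a toy two-stage generator: the Monte Carlo loss below follows the formula above, while the random search over 𝜃 is only a stand-in for the actual gradient-based mask optimization (all sizes, constants, and helper names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
W1 = rng.standard_normal((8, 16))   # toy g_{l:1}: latent -> layer l
W2 = rng.standard_normal((16, 4))   # toy g_{L:l+1}: layer l -> output
LAM = 0.05                          # l1 regularization weight (illustrative)

def G(z):
    """Full toy generation G(z) = g_{L:l+1}(g_{l:1}(z))."""
    return np.maximum(z @ W1, 0.0) @ W2

def loss(z0, theta, n_mc=32):
    """Monte Carlo estimate of
    ||g_{L:l+1}(g_{l:1}(z0) * m) - G(z0)|| + LAM * ||theta||_1,
    with masks m ~ Bernoulli(theta)."""
    h = np.maximum(z0 @ W1, 0.0)
    target = G(z0)
    errs = [np.linalg.norm((h * (rng.random(theta.size) < theta)) @ W2 - target)
            for _ in range(n_mc)]
    return np.mean(errs) + LAM * np.abs(theta).sum()

z0 = rng.standard_normal(8)
# Naive random search (stand-in for the paper's optimization of theta):
best = np.ones(16)
for _ in range(200):
    cand = np.clip(best + 0.1 * rng.standard_normal(16), 0.0, 1.0)
    if loss(z0, cand) < loss(z0, best):
        best = cand
# With theta = 1 the mask is always all-ones: zero reconstruction error,
# full l1 penalty.
assert abs(loss(z0, np.ones(16)) - LAM * 16) < 1e-9
```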
Proposed Algorithm
[Figure: pipeline in latent space — 𝑧0 → 𝑔_{ℓ:1} → ℎℓ → 𝑔_{L:ℓ+1}; Bernoulli mask optimization selects the smallest supporting generative region, then RRT on the generative boundary performs exploration and generation; examples on the LSUN and CelebA datasets]
Explorative Generative Boundary Aware Sampling
[Figure legend: accepted clusters 1–3 and rejected samples in latent space]
Experiment : DCGAN-MNIST
𝜖-based sampling
Our method
Query
Experiment : PGGAN-LSUN-church
𝜖-based sampling
Our method
Query
Experiment : PGGAN-LSUN-church
[Figure: query, 𝜖-based sampling, and our method; two examples]
Experiment : PGGAN-celebA
𝜖-based sampling
Our method
Query
Experiment : PGGAN-celebA
[Figure: query, 𝜖-based sampling, and our method; two examples]
Experiment : Varying the portion of the active mask
[Figure: query; results using <5%, <10%, and >10% of the active mask]
Conclusion
• We propose a new interpretable method to analyze the internals of
deep generative neural networks.
• Our explorative sampling method demonstrates better performance than
existing methods (e.g., 𝜖-based sampling) when investigating decision regions.
• Our algorithm can be extended to other types of deep neural
network models (e.g., classification models).
Thank you!
Explainable Artificial Intelligent Center of Korea
https://xai.kaist.ac.kr
This work was supported by the Institute for Information & communications Technology Planning & Evaluation
(IITP) grant funded by the Ministry of Science and ICT (MSIT), Korea (No. 2017-0-01779, XAI)

More Related Content

What's hot

Flow based generative models
Flow based generative modelsFlow based generative models
Flow based generative models수철 박
 
Introduction to batch normalization
Introduction to batch normalizationIntroduction to batch normalization
Introduction to batch normalizationJamie (Taka) Wang
 
A brief introduction to recent segmentation methods
A brief introduction to recent segmentation methodsA brief introduction to recent segmentation methods
A brief introduction to recent segmentation methodsShunta Saito
 
Applied Deep Learning 11/03 Convolutional Neural Networks
Applied Deep Learning 11/03 Convolutional Neural NetworksApplied Deep Learning 11/03 Convolutional Neural Networks
Applied Deep Learning 11/03 Convolutional Neural NetworksMark Chang
 
Continual Learning with Deep Architectures - Tutorial ICML 2021
Continual Learning with Deep Architectures - Tutorial ICML 2021Continual Learning with Deep Architectures - Tutorial ICML 2021
Continual Learning with Deep Architectures - Tutorial ICML 2021Vincenzo Lomonaco
 
Convolutional neural network
Convolutional neural networkConvolutional neural network
Convolutional neural networkFerdous ahmed
 
Denoising autoencoder by Harish.R
Denoising autoencoder by Harish.RDenoising autoencoder by Harish.R
Denoising autoencoder by Harish.RHARISH R
 
Deep learning lecture - part 1 (basics, CNN)
Deep learning lecture - part 1 (basics, CNN)Deep learning lecture - part 1 (basics, CNN)
Deep learning lecture - part 1 (basics, CNN)SungminYou
 
Recent Progress on Object Detection_20170331
Recent Progress on Object Detection_20170331Recent Progress on Object Detection_20170331
Recent Progress on Object Detection_20170331Jihong Kang
 
Artificial Neural Network
Artificial Neural NetworkArtificial Neural Network
Artificial Neural NetworkPrakash K
 
PR-317: MLP-Mixer: An all-MLP Architecture for Vision
PR-317: MLP-Mixer: An all-MLP Architecture for VisionPR-317: MLP-Mixer: An all-MLP Architecture for Vision
PR-317: MLP-Mixer: An all-MLP Architecture for VisionJinwon Lee
 
Semantic segmentation with Convolutional Neural Network Approaches
Semantic segmentation with Convolutional Neural Network ApproachesSemantic segmentation with Convolutional Neural Network Approaches
Semantic segmentation with Convolutional Neural Network ApproachesFellowship at Vodafone FutureLab
 
Transfer learning-presentation
Transfer learning-presentationTransfer learning-presentation
Transfer learning-presentationBushra Jbawi
 
Lecture 2 - Bit vs Qubits.pptx
Lecture 2 - Bit vs Qubits.pptxLecture 2 - Bit vs Qubits.pptx
Lecture 2 - Bit vs Qubits.pptxNatKell
 

What's hot (20)

Lecture 05 cmos logic gates
Lecture 05   cmos logic gatesLecture 05   cmos logic gates
Lecture 05 cmos logic gates
 
Flow based generative models
Flow based generative modelsFlow based generative models
Flow based generative models
 
06. graph mining
06. graph mining06. graph mining
06. graph mining
 
Introduction to batch normalization
Introduction to batch normalizationIntroduction to batch normalization
Introduction to batch normalization
 
MobileNet V3
MobileNet V3MobileNet V3
MobileNet V3
 
A brief introduction to recent segmentation methods
A brief introduction to recent segmentation methodsA brief introduction to recent segmentation methods
A brief introduction to recent segmentation methods
 
Applied Deep Learning 11/03 Convolutional Neural Networks
Applied Deep Learning 11/03 Convolutional Neural NetworksApplied Deep Learning 11/03 Convolutional Neural Networks
Applied Deep Learning 11/03 Convolutional Neural Networks
 
Continual Learning with Deep Architectures - Tutorial ICML 2021
Continual Learning with Deep Architectures - Tutorial ICML 2021Continual Learning with Deep Architectures - Tutorial ICML 2021
Continual Learning with Deep Architectures - Tutorial ICML 2021
 
Convolutional neural network
Convolutional neural networkConvolutional neural network
Convolutional neural network
 
Denoising autoencoder by Harish.R
Denoising autoencoder by Harish.RDenoising autoencoder by Harish.R
Denoising autoencoder by Harish.R
 
Deep learning lecture - part 1 (basics, CNN)
Deep learning lecture - part 1 (basics, CNN)Deep learning lecture - part 1 (basics, CNN)
Deep learning lecture - part 1 (basics, CNN)
 
Recent Progress on Object Detection_20170331
Recent Progress on Object Detection_20170331Recent Progress on Object Detection_20170331
Recent Progress on Object Detection_20170331
 
R-CNN
R-CNNR-CNN
R-CNN
 
Artificial Neural Network
Artificial Neural NetworkArtificial Neural Network
Artificial Neural Network
 
Fractional knapsack problem
Fractional knapsack problemFractional knapsack problem
Fractional knapsack problem
 
PR-317: MLP-Mixer: An all-MLP Architecture for Vision
PR-317: MLP-Mixer: An all-MLP Architecture for VisionPR-317: MLP-Mixer: An all-MLP Architecture for Vision
PR-317: MLP-Mixer: An all-MLP Architecture for Vision
 
Semantic segmentation with Convolutional Neural Network Approaches
Semantic segmentation with Convolutional Neural Network ApproachesSemantic segmentation with Convolutional Neural Network Approaches
Semantic segmentation with Convolutional Neural Network Approaches
 
Linear algebra
Linear algebraLinear algebra
Linear algebra
 
Transfer learning-presentation
Transfer learning-presentationTransfer learning-presentation
Transfer learning-presentation
 
Lecture 2 - Bit vs Qubits.pptx
Lecture 2 - Bit vs Qubits.pptxLecture 2 - Bit vs Qubits.pptx
Lecture 2 - Bit vs Qubits.pptx
 

Similar to Efficient Explorative Sampling of Deep Generative Models

Deep learning from a novice perspective
Deep learning from a novice perspectiveDeep learning from a novice perspective
Deep learning from a novice perspectiveAnirban Santara
 
Introduction to deep learning
Introduction to deep learningIntroduction to deep learning
Introduction to deep learningJunaid Bhat
 
Convolutional Neural Networks (CNN)
Convolutional Neural Networks (CNN)Convolutional Neural Networks (CNN)
Convolutional Neural Networks (CNN)Gaurav Mittal
 
Neural Art (English Version)
Neural Art (English Version)Neural Art (English Version)
Neural Art (English Version)Mark Chang
 
Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks, arXiv e-...
Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks, arXiv e-...Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks, arXiv e-...
Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks, arXiv e-...ssuser2624f71
 
Super resolution in deep learning era - Jaejun Yoo
Super resolution in deep learning era - Jaejun YooSuper resolution in deep learning era - Jaejun Yoo
Super resolution in deep learning era - Jaejun YooJaeJun Yoo
 
D1L5 Visualization (D1L2 Insight@DCU Machine Learning Workshop 2017)
D1L5 Visualization (D1L2 Insight@DCU Machine Learning Workshop 2017)D1L5 Visualization (D1L2 Insight@DCU Machine Learning Workshop 2017)
D1L5 Visualization (D1L2 Insight@DCU Machine Learning Workshop 2017)Universitat Politècnica de Catalunya
 
Nonlinear dimension reduction
Nonlinear dimension reductionNonlinear dimension reduction
Nonlinear dimension reductionYan Xu
 
物件偵測與辨識技術
物件偵測與辨識技術物件偵測與辨識技術
物件偵測與辨識技術CHENHuiMei
 
Deep Learning in Recommender Systems - RecSys Summer School 2017
Deep Learning in Recommender Systems - RecSys Summer School 2017Deep Learning in Recommender Systems - RecSys Summer School 2017
Deep Learning in Recommender Systems - RecSys Summer School 2017Balázs Hidasi
 
Overview of Convolutional Neural Networks
Overview of Convolutional Neural NetworksOverview of Convolutional Neural Networks
Overview of Convolutional Neural Networksananth
 
GAN for Bayesian Inference objectives
GAN for Bayesian Inference objectivesGAN for Bayesian Inference objectives
GAN for Bayesian Inference objectivesNatan Katz
 
Recurrent Neural Networks, LSTM and GRU
Recurrent Neural Networks, LSTM and GRURecurrent Neural Networks, LSTM and GRU
Recurrent Neural Networks, LSTM and GRUananth
 
Evolution of Deep Learning and new advancements
Evolution of Deep Learning and new advancementsEvolution of Deep Learning and new advancements
Evolution of Deep Learning and new advancementsChitta Ranjan
 
Machine Learning Foundations for Professional Managers
Machine Learning Foundations for Professional ManagersMachine Learning Foundations for Professional Managers
Machine Learning Foundations for Professional ManagersAlbert Y. C. Chen
 
[PR12] Inception and Xception - Jaejun Yoo
[PR12] Inception and Xception - Jaejun Yoo[PR12] Inception and Xception - Jaejun Yoo
[PR12] Inception and Xception - Jaejun YooJaeJun Yoo
 
Deep Convolutional GANs - meaning of latent space
Deep Convolutional GANs - meaning of latent spaceDeep Convolutional GANs - meaning of latent space
Deep Convolutional GANs - meaning of latent spaceHansol Kang
 
Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering
Convolutional Neural Networks on Graphs with Fast Localized Spectral FilteringConvolutional Neural Networks on Graphs with Fast Localized Spectral Filtering
Convolutional Neural Networks on Graphs with Fast Localized Spectral FilteringSOYEON KIM
 

Similar to Efficient Explorative Sampling of Deep Generative Models (20)

Deep learning from a novice perspective
Deep learning from a novice perspectiveDeep learning from a novice perspective
Deep learning from a novice perspective
 
Introduction to deep learning
Introduction to deep learningIntroduction to deep learning
Introduction to deep learning
 
Convolutional Neural Networks (CNN)
Convolutional Neural Networks (CNN)Convolutional Neural Networks (CNN)
Convolutional Neural Networks (CNN)
 
Neural Art (English Version)
Neural Art (English Version)Neural Art (English Version)
Neural Art (English Version)
 
Deep Learning
Deep LearningDeep Learning
Deep Learning
 
Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks, arXiv e-...
Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks, arXiv e-...Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks, arXiv e-...
Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks, arXiv e-...
 
Super resolution in deep learning era - Jaejun Yoo
Super resolution in deep learning era - Jaejun YooSuper resolution in deep learning era - Jaejun Yoo
Super resolution in deep learning era - Jaejun Yoo
 
D1L5 Visualization (D1L2 Insight@DCU Machine Learning Workshop 2017)
D1L5 Visualization (D1L2 Insight@DCU Machine Learning Workshop 2017)D1L5 Visualization (D1L2 Insight@DCU Machine Learning Workshop 2017)
D1L5 Visualization (D1L2 Insight@DCU Machine Learning Workshop 2017)
 
Nonlinear dimension reduction
Nonlinear dimension reductionNonlinear dimension reduction
Nonlinear dimension reduction
 
物件偵測與辨識技術
物件偵測與辨識技術物件偵測與辨識技術
物件偵測與辨識技術
 
Deep Learning in Recommender Systems - RecSys Summer School 2017
Deep Learning in Recommender Systems - RecSys Summer School 2017Deep Learning in Recommender Systems - RecSys Summer School 2017
Deep Learning in Recommender Systems - RecSys Summer School 2017
 
Overview of Convolutional Neural Networks
Overview of Convolutional Neural NetworksOverview of Convolutional Neural Networks
Overview of Convolutional Neural Networks
 
Machine Learning Tools and Particle Swarm Optimization for Content-Based Sear...
Machine Learning Tools and Particle Swarm Optimization for Content-Based Sear...Machine Learning Tools and Particle Swarm Optimization for Content-Based Sear...
Machine Learning Tools and Particle Swarm Optimization for Content-Based Sear...
 
GAN for Bayesian Inference objectives
GAN for Bayesian Inference objectivesGAN for Bayesian Inference objectives
GAN for Bayesian Inference objectives
 
Recurrent Neural Networks, LSTM and GRU
Recurrent Neural Networks, LSTM and GRURecurrent Neural Networks, LSTM and GRU
Recurrent Neural Networks, LSTM and GRU
 
Evolution of Deep Learning and new advancements
Evolution of Deep Learning and new advancementsEvolution of Deep Learning and new advancements
Evolution of Deep Learning and new advancements
 
Machine Learning Foundations for Professional Managers
Machine Learning Foundations for Professional ManagersMachine Learning Foundations for Professional Managers
Machine Learning Foundations for Professional Managers
 
[PR12] Inception and Xception - Jaejun Yoo
[PR12] Inception and Xception - Jaejun Yoo[PR12] Inception and Xception - Jaejun Yoo
[PR12] Inception and Xception - Jaejun Yoo
 
Deep Convolutional GANs - meaning of latent space
Deep Convolutional GANs - meaning of latent spaceDeep Convolutional GANs - meaning of latent space
Deep Convolutional GANs - meaning of latent space
 
Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering
Convolutional Neural Networks on Graphs with Fast Localized Spectral FilteringConvolutional Neural Networks on Graphs with Fast Localized Spectral Filtering
Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering
 

Recently uploaded

Presentation on Engagement in Book Clubs
Presentation on Engagement in Book ClubsPresentation on Engagement in Book Clubs
Presentation on Engagement in Book Clubssamaasim06
 
Open Source Strategy in Logistics 2015_Henrik Hankedvz-d-nl-log-conference.pdf
Open Source Strategy in Logistics 2015_Henrik Hankedvz-d-nl-log-conference.pdfOpen Source Strategy in Logistics 2015_Henrik Hankedvz-d-nl-log-conference.pdf
Open Source Strategy in Logistics 2015_Henrik Hankedvz-d-nl-log-conference.pdfhenrik385807
 
Call Girl Number in Khar Mumbai📲 9892124323 💞 Full Night Enjoy
Call Girl Number in Khar Mumbai📲 9892124323 💞 Full Night EnjoyCall Girl Number in Khar Mumbai📲 9892124323 💞 Full Night Enjoy
Call Girl Number in Khar Mumbai📲 9892124323 💞 Full Night EnjoyPooja Nehwal
 
Exploring protein-protein interactions by Weak Affinity Chromatography (WAC) ...
Exploring protein-protein interactions by Weak Affinity Chromatography (WAC) ...Exploring protein-protein interactions by Weak Affinity Chromatography (WAC) ...
Exploring protein-protein interactions by Weak Affinity Chromatography (WAC) ...Salam Al-Karadaghi
 
Introduction to Prompt Engineering (Focusing on ChatGPT)
Introduction to Prompt Engineering (Focusing on ChatGPT)Introduction to Prompt Engineering (Focusing on ChatGPT)
Introduction to Prompt Engineering (Focusing on ChatGPT)Chameera Dedduwage
 
Night 7k Call Girls Noida Sector 128 Call Me: 8448380779
Night 7k Call Girls Noida Sector 128 Call Me: 8448380779Night 7k Call Girls Noida Sector 128 Call Me: 8448380779
Night 7k Call Girls Noida Sector 128 Call Me: 8448380779Delhi Call girls
 
George Lever - eCommerce Day Chile 2024
George Lever -  eCommerce Day Chile 2024George Lever -  eCommerce Day Chile 2024
George Lever - eCommerce Day Chile 2024eCommerce Institute
 
Microsoft Copilot AI for Everyone - created by AI
Microsoft Copilot AI for Everyone - created by AIMicrosoft Copilot AI for Everyone - created by AI
Microsoft Copilot AI for Everyone - created by AITatiana Gurgel
 
Russian Call Girls in Kolkata Vaishnavi 🤌 8250192130 🚀 Vip Call Girls Kolkata
Russian Call Girls in Kolkata Vaishnavi 🤌  8250192130 🚀 Vip Call Girls KolkataRussian Call Girls in Kolkata Vaishnavi 🤌  8250192130 🚀 Vip Call Girls Kolkata
Russian Call Girls in Kolkata Vaishnavi 🤌 8250192130 🚀 Vip Call Girls Kolkataanamikaraghav4
 
Andrés Ramírez Gossler, Facundo Schinnea - eCommerce Day Chile 2024
Andrés Ramírez Gossler, Facundo Schinnea - eCommerce Day Chile 2024Andrés Ramírez Gossler, Facundo Schinnea - eCommerce Day Chile 2024
Andrés Ramírez Gossler, Facundo Schinnea - eCommerce Day Chile 2024eCommerce Institute
 
ANCHORING SCRIPT FOR A CULTURAL EVENT.docx
ANCHORING SCRIPT FOR A CULTURAL EVENT.docxANCHORING SCRIPT FOR A CULTURAL EVENT.docx
ANCHORING SCRIPT FOR A CULTURAL EVENT.docxNikitaBankoti2
 
Call Girls in Sarojini Nagar Market Delhi 💯 Call Us 🔝8264348440🔝
Call Girls in Sarojini Nagar Market Delhi 💯 Call Us 🔝8264348440🔝Call Girls in Sarojini Nagar Market Delhi 💯 Call Us 🔝8264348440🔝
Call Girls in Sarojini Nagar Market Delhi 💯 Call Us 🔝8264348440🔝soniya singh
 
Mathematics of Finance Presentation.pptx
Mathematics of Finance Presentation.pptxMathematics of Finance Presentation.pptx
Mathematics of Finance Presentation.pptxMoumonDas2
 
Re-membering the Bard: Revisiting The Compleat Wrks of Wllm Shkspr (Abridged)...
Re-membering the Bard: Revisiting The Compleat Wrks of Wllm Shkspr (Abridged)...Re-membering the Bard: Revisiting The Compleat Wrks of Wllm Shkspr (Abridged)...
Re-membering the Bard: Revisiting The Compleat Wrks of Wllm Shkspr (Abridged)...Hasting Chen
 
WhatsApp 📞 9892124323 ✅Call Girls In Juhu ( Mumbai )
WhatsApp 📞 9892124323 ✅Call Girls In Juhu ( Mumbai )WhatsApp 📞 9892124323 ✅Call Girls In Juhu ( Mumbai )
WhatsApp 📞 9892124323 ✅Call Girls In Juhu ( Mumbai )Pooja Nehwal
 
VVIP Call Girls Nalasopara : 9892124323, Call Girls in Nalasopara Services
VVIP Call Girls Nalasopara : 9892124323, Call Girls in Nalasopara ServicesVVIP Call Girls Nalasopara : 9892124323, Call Girls in Nalasopara Services
VVIP Call Girls Nalasopara : 9892124323, Call Girls in Nalasopara ServicesPooja Nehwal
 
Mohammad_Alnahdi_Oral_Presentation_Assignment.pptx
Mohammad_Alnahdi_Oral_Presentation_Assignment.pptxMohammad_Alnahdi_Oral_Presentation_Assignment.pptx
Mohammad_Alnahdi_Oral_Presentation_Assignment.pptxmohammadalnahdi22
 
SaaStr Workshop Wednesday w: Jason Lemkin, SaaStr
SaaStr Workshop Wednesday w: Jason Lemkin, SaaStrSaaStr Workshop Wednesday w: Jason Lemkin, SaaStr
SaaStr Workshop Wednesday w: Jason Lemkin, SaaStrsaastr
 
Governance and Nation-Building in Nigeria: Some Reflections on Options for Po...
Governance and Nation-Building in Nigeria: Some Reflections on Options for Po...Governance and Nation-Building in Nigeria: Some Reflections on Options for Po...
Governance and Nation-Building in Nigeria: Some Reflections on Options for Po...Kayode Fayemi
 
BDSM⚡Call Girls in Sector 93 Noida Escorts >༒8448380779 Escort Service
BDSM⚡Call Girls in Sector 93 Noida Escorts >༒8448380779 Escort ServiceBDSM⚡Call Girls in Sector 93 Noida Escorts >༒8448380779 Escort Service
BDSM⚡Call Girls in Sector 93 Noida Escorts >༒8448380779 Escort ServiceDelhi Call girls
 

Recently uploaded (20)

Presentation on Engagement in Book Clubs
Presentation on Engagement in Book ClubsPresentation on Engagement in Book Clubs
Presentation on Engagement in Book Clubs
 
Open Source Strategy in Logistics 2015_Henrik Hankedvz-d-nl-log-conference.pdf
Open Source Strategy in Logistics 2015_Henrik Hankedvz-d-nl-log-conference.pdfOpen Source Strategy in Logistics 2015_Henrik Hankedvz-d-nl-log-conference.pdf
Open Source Strategy in Logistics 2015_Henrik Hankedvz-d-nl-log-conference.pdf
 
Call Girl Number in Khar Mumbai📲 9892124323 💞 Full Night Enjoy
Call Girl Number in Khar Mumbai📲 9892124323 💞 Full Night EnjoyCall Girl Number in Khar Mumbai📲 9892124323 💞 Full Night Enjoy
Call Girl Number in Khar Mumbai📲 9892124323 💞 Full Night Enjoy
 
Exploring protein-protein interactions by Weak Affinity Chromatography (WAC) ...
Exploring protein-protein interactions by Weak Affinity Chromatography (WAC) ...Exploring protein-protein interactions by Weak Affinity Chromatography (WAC) ...
Exploring protein-protein interactions by Weak Affinity Chromatography (WAC) ...
 
Introduction to Prompt Engineering (Focusing on ChatGPT)
Introduction to Prompt Engineering (Focusing on ChatGPT)Introduction to Prompt Engineering (Focusing on ChatGPT)
Introduction to Prompt Engineering (Focusing on ChatGPT)
 
Night 7k Call Girls Noida Sector 128 Call Me: 8448380779
Night 7k Call Girls Noida Sector 128 Call Me: 8448380779Night 7k Call Girls Noida Sector 128 Call Me: 8448380779
Night 7k Call Girls Noida Sector 128 Call Me: 8448380779
 
George Lever - eCommerce Day Chile 2024
George Lever -  eCommerce Day Chile 2024George Lever -  eCommerce Day Chile 2024
George Lever - eCommerce Day Chile 2024
 
Microsoft Copilot AI for Everyone - created by AI
Microsoft Copilot AI for Everyone - created by AIMicrosoft Copilot AI for Everyone - created by AI
Microsoft Copilot AI for Everyone - created by AI
 
Russian Call Girls in Kolkata Vaishnavi 🤌 8250192130 🚀 Vip Call Girls Kolkata
Russian Call Girls in Kolkata Vaishnavi 🤌  8250192130 🚀 Vip Call Girls KolkataRussian Call Girls in Kolkata Vaishnavi 🤌  8250192130 🚀 Vip Call Girls Kolkata
Russian Call Girls in Kolkata Vaishnavi 🤌 8250192130 🚀 Vip Call Girls Kolkata
 
Andrés Ramírez Gossler, Facundo Schinnea - eCommerce Day Chile 2024
Andrés Ramírez Gossler, Facundo Schinnea - eCommerce Day Chile 2024Andrés Ramírez Gossler, Facundo Schinnea - eCommerce Day Chile 2024
Andrés Ramírez Gossler, Facundo Schinnea - eCommerce Day Chile 2024
 
ANCHORING SCRIPT FOR A CULTURAL EVENT.docx
ANCHORING SCRIPT FOR A CULTURAL EVENT.docxANCHORING SCRIPT FOR A CULTURAL EVENT.docx
ANCHORING SCRIPT FOR A CULTURAL EVENT.docx
 
Call Girls in Sarojini Nagar Market Delhi 💯 Call Us 🔝8264348440🔝
Call Girls in Sarojini Nagar Market Delhi 💯 Call Us 🔝8264348440🔝Call Girls in Sarojini Nagar Market Delhi 💯 Call Us 🔝8264348440🔝
Call Girls in Sarojini Nagar Market Delhi 💯 Call Us 🔝8264348440🔝
 
Mathematics of Finance Presentation.pptx
Mathematics of Finance Presentation.pptxMathematics of Finance Presentation.pptx
Mathematics of Finance Presentation.pptx
 
Re-membering the Bard: Revisiting The Compleat Wrks of Wllm Shkspr (Abridged)...
Re-membering the Bard: Revisiting The Compleat Wrks of Wllm Shkspr (Abridged)...Re-membering the Bard: Revisiting The Compleat Wrks of Wllm Shkspr (Abridged)...
Re-membering the Bard: Revisiting The Compleat Wrks of Wllm Shkspr (Abridged)...
 
WhatsApp 📞 9892124323 ✅Call Girls In Juhu ( Mumbai )
WhatsApp 📞 9892124323 ✅Call Girls In Juhu ( Mumbai )WhatsApp 📞 9892124323 ✅Call Girls In Juhu ( Mumbai )
WhatsApp 📞 9892124323 ✅Call Girls In Juhu ( Mumbai )
 
VVIP Call Girls Nalasopara : 9892124323, Call Girls in Nalasopara Services
VVIP Call Girls Nalasopara : 9892124323, Call Girls in Nalasopara ServicesVVIP Call Girls Nalasopara : 9892124323, Call Girls in Nalasopara Services
VVIP Call Girls Nalasopara : 9892124323, Call Girls in Nalasopara Services
 
Mohammad_Alnahdi_Oral_Presentation_Assignment.pptx
Mohammad_Alnahdi_Oral_Presentation_Assignment.pptxMohammad_Alnahdi_Oral_Presentation_Assignment.pptx
Mohammad_Alnahdi_Oral_Presentation_Assignment.pptx
 
SaaStr Workshop Wednesday w: Jason Lemkin, SaaStr
SaaStr Workshop Wednesday w: Jason Lemkin, SaaStrSaaStr Workshop Wednesday w: Jason Lemkin, SaaStr
SaaStr Workshop Wednesday w: Jason Lemkin, SaaStr
 
Governance and Nation-Building in Nigeria: Some Reflections on Options for Po...
Governance and Nation-Building in Nigeria: Some Reflections on Options for Po...Governance and Nation-Building in Nigeria: Some Reflections on Options for Po...
Governance and Nation-Building in Nigeria: Some Reflections on Options for Po...
 
BDSM⚡Call Girls in Sector 93 Noida Escorts >༒8448380779 Escort Service
BDSM⚡Call Girls in Sector 93 Noida Escorts >༒8448380779 Escort ServiceBDSM⚡Call Girls in Sector 93 Noida Escorts >༒8448380779 Escort Service
BDSM⚡Call Girls in Sector 93 Noida Escorts >༒8448380779 Escort Service
 

Efficient Explorative Sampling of Deep Generative Models

  • 1. An Efficient Explorative Sampling Considering the Generative Boundaries of Deep Generative Neural Networks Giyoung Jeon1*, Haedong Jeong1* and Jaesik Choi2 Statistical Artificial Intelligence Laboratory of 1Ulsan National Institute of Science and Technology (UNIST) and 2Korea Advanced Institute of Science and Technology (KAIST) *Equal contribution
  • 2. Motivation • Generative Adversarial Networks (GANs) make high-quality and various images in many domains. 𝑍 4x4 1024x1024 LSUN dataset CelebA dataset Generator 𝐺(𝑍) Discriminator 𝐷(𝐺(𝑍)) T/F Latent Vector 𝑍 ∼ 𝑁(0, 𝐼) Discriminative value T. Karras, et al., "Progressive growing of GANs for improved quality, stability, and variation”, ICLR, 2018. 1024x1024
  • 3. Motivation • The generative process is not well understood yet. • We wish to give example-based explanation on the generative process. Latent Vector 𝑍 ℓ-th layer LSUN dataset CelebA dataset ℎℓ ∈ ℝ16×16×512 4x4 1024x1024
  • 4. Previous work: analyzing the inside of deep neural networks Google Deep Dream (Mordvintsev et al., 2015) GAN dissection (Bau et al., 2019) D. Bau et al., "Network Dissection: Quantifying Interpretability of Deep Visual Representations." CVPR, 2017. D. Bau et al., "GAN Dissection: Visualizing and Understanding Generative Adversarial Networks”, ICLR, 2019. A. Mordvintsev et al., "Inceptionism: Going deeper into neural networks”, 2015. K. Dvijotham et al., "A Dual Approach to Scalable Verification of Deep Networks." UAI, 2018. Lagrangian relaxed decision boundary (Dvijotham et al., 2018) Network dissection (Bau et al., 2017) Interpretation: lamp Interpretation: car Unit 1 Unit 4 Interpretation: lamp Interpretation: car Unit 1 Unit 4 Interval bound propagation 𝑥0 ∈ 𝑆 Input perturbations Propagated regions Refinement using cutting planes from dual 𝝀 Decision boundary 𝒄 𝑻 𝒙 𝑲 + 𝒅 Naïve bounds would fail to certify robustness
  • 5. Definitions • Generator 𝐺 𝑍 : a generated image from 𝑍 • Hidden nodes ℎℓ: a neural representation of ℓ-th layer • Partial generation 𝑔j:i: ℝ|hi| → ℝ|hj| : a generative function from layer 𝑖 to layer 𝑗 𝑔4:1 𝑔 𝐿:5𝑍 4-th layer ℎ4 ∈ ℝ16×16×512 𝐺 𝑍
  • 6. Generative Boundary – The value of a node ℎℓ^𝑖 is determined by a linear hyperplane in the space of the previous layer ℎℓ₋₁: the node computes 𝑔ℓ^𝑖(ℎℓ₋₁) = ℎℓ^𝑖, and the boundary is 𝑔ℓ^𝑖(ℎℓ₋₁) = 0. – Stacking layers toward the input makes the boundary highly non-linear and non-convex. • We only consider the feasible regions constructed from the input to the target layer. – The boundaries are trained to fool the discriminator in GANs. • In the latent space, the 𝑖-th generative boundary at layer ℓ is 𝐵ℓ^𝑖 = {𝑧 ∈ ℝ^|𝑍| | 𝑔_{ℓ:1}^𝑖(𝑧) = 0}.
  • 7. Generative Region – In the ℓ-th layer, 𝑆ℓ is the space surrounded by a set of generative boundaries. – In the input space, it induces an equivalence class of 𝑍 w.r.t. 𝑆ℓ. – In the image space, it induces an equivalence class of images w.r.t. 𝑆ℓ. [Figure: the sign pattern (+/−) of the boundary activations identifies the region; regions A and B differ in the sign of one boundary]
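A minimal sketch of the equivalence-class idea, assuming hypothetical boundary normals: the region containing a latent vector is identified by the sign pattern of its boundary activations, and two vectors are equivalent w.r.t. 𝑆ℓ exactly when their patterns match.

```python
# Hypothetical boundary normals; each row i defines one generative
# boundary g^i(z) = 0 in a toy 2-D latent space.
W = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]

def preact(W, v):
    return [sum(w * x for w, x in zip(row, v)) for row in W]

def sign_pattern(z):
    # The region is identified by which side of every boundary z lies on.
    return tuple(a > 0 for a in preact(W, z))

def same_region(z1, z2):
    # Equivalent w.r.t. the region iff all boundary signs agree.
    return sign_pattern(z1) == sign_pattern(z2)

assert same_region([0.5, 0.5], [1.0, 2.0])       # all boundary signs agree
assert not same_region([0.5, 0.5], [-0.5, 0.5])  # crosses the first boundary
```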
  • 8. Problem Definition: explorative sampling in a generative region • Given: a GAN model 𝐺, a target layer ℓ, and an input query 𝑧₀ • Goal: find the set of equivalent-class images generated from the same generative region 𝑆ℓ.
  • 9. Challenges of Sampling in a Generative Region • The dimensionality of the latent space and the large number of hyperplanes are hard to handle in practice (e.g., the 4th layer of PGGAN: ℝ⁵¹² → ℝ⁸¹⁹²). • The generative region is typically non-convex in higher layers due to non-linear activations. • Small 𝜖-based sampling: every sample stays inside the region, but blind regions exist. • Large 𝜖-based spherical sampling: covers the region, but may produce out-of-region samples.
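The trade-off between the two 𝜖-based strategies can be sketched on a toy non-convex (L-shaped) region standing in for a generative region; the region, query point, and 𝜖 values are invented for illustration:

```python
import random

def in_region(p):
    # Toy non-convex region: an L-shape made of two overlapping boxes.
    x, y = p
    return (0 <= x <= 1 and 0 <= y <= 0.2) or (0 <= x <= 0.2 and 0 <= y <= 1)

def eps_samples(query, eps, n, seed=0):
    """Draw n points from an eps-box around the query; keep in-region ones."""
    rng = random.Random(seed)
    pts = [(query[0] + rng.uniform(-eps, eps),
            query[1] + rng.uniform(-eps, eps)) for _ in range(n)]
    return pts, [p for p in pts if in_region(p)]

q = (0.1, 0.1)
_, small = eps_samples(q, 0.05, 500)       # small eps
pts_big, big = eps_samples(q, 0.8, 500)    # large eps

assert len(small) == 500                   # small eps: everything accepted...
assert max(x for x, _ in small) < 0.5      # ...but the far arm is never reached
assert len(big) < 500                      # large eps: out-of-region rejects
```

Small 𝜖 stays valid but leaves the far arm of the L as a blind region; large 𝜖 reaches it only at the cost of rejected out-of-region samples.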
  • 10. Reduction to the Robot Planning Problem • Robot planning: searching a path in a non-convex space with a high-degree-of-freedom robot. • Our problem: searching for samples in a non-convex, high-dimensional space (𝑍 ∈ ℝ⁵¹², 𝑆ℓ). We reduce our sampling problem to a robot-planning problem.
  • 11. Generative Boundary constrained Rapidly-exploring Random Tree (RRT) • Given the generative boundaries as constraints, RRT provides a way to search over the generative region. • This explorative sampling guarantees that every accepted sample lies inside the region. LaValle, Steven M., "Rapidly-exploring random trees: A new tool for path planning", Technical Report, Computer Science Department, Iowa State University, 1998. [Figures: the RRT algorithm, an illustrative example, and an example in a non-convex region]
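A minimal sketch of the boundary-constrained RRT loop, with an in-region predicate standing in for the generative-boundary collision check; the 2-D region, step size, and iteration count are illustrative assumptions:

```python
import math
import random

def rrt_explore(query, in_region, n_iters=2000, step=0.05, seed=0):
    """Grow a tree from `query`, accepting only in-region candidates.

    `in_region` plays the role of the generative-boundary constraint:
    a candidate node is kept only if it does not cross a boundary."""
    rng = random.Random(seed)
    nodes = [query]
    for _ in range(n_iters):
        # 1. draw a random target in the (here 2-D) latent space
        target = (rng.uniform(-1, 1), rng.uniform(-1, 1))
        # 2. find the nearest existing node in the tree
        near = min(nodes, key=lambda p: (p[0] - target[0]) ** 2
                                        + (p[1] - target[1]) ** 2)
        # 3. take a step of fixed size from it toward the target
        dx, dy = target[0] - near[0], target[1] - near[1]
        d = math.hypot(dx, dy) or 1.0
        cand = (near[0] + step * dx / d, near[1] + step * dy / d)
        # 4. merge the candidate only if it stays inside the region
        if in_region(cand):
            nodes.append(cand)
    return nodes

def in_L(p):
    # Non-convex L-shaped region as a stand-in for a generative region.
    x, y = p
    return (0 <= x <= 1 and 0 <= y <= 0.2) or (0 <= x <= 0.2 and 0 <= y <= 1)

nodes = rrt_explore((0.1, 0.1), in_L)
assert all(in_L(p) for p in nodes)   # every accepted sample is in-region
```

Because out-of-region candidates are simply never merged, acceptance inside the region is guaranteed by construction, while the tree's bias toward unexplored space spreads samples through the non-convex shape.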
  • 12. Smallest Supporting Generative Boundary Set • Using all the boundaries, the constraints become too tight and computationally expensive. • We observe that not all boundaries affect the output equally. [Figure: masking the hidden values, ℎℓ ⊙ 𝑚, disregards the values of the relaxed boundaries]
  • 13. Smallest Supporting Generative Boundary Set • Apply Bernoulli mask optimization to relax boundaries while maintaining the output: 𝜃* = argmin_𝜃 ℒ(𝑧₀, ℓ, 𝜃) = argmin_𝜃 ‖𝑔_{L:ℓ+1}(𝑔_{ℓ:1}(𝑧₀) ⊙ 𝑚) − 𝐺(𝑧₀)‖ + 𝜆‖𝜃‖₁, where 𝑚 ∼ Ber(𝜃). The first term is the masked-image reconstruction error; the second is an ℓ1 regularizer on the mask. [Figures: entire boundaries vs. using 10% vs. using 5%] Chang, Chun-Hao, et al., "Explaining image classifiers by adaptive dropout and generative in-filling", ICLR, 2018.
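The paper optimizes a Bernoulli mask by gradient descent on the objective above; as a rough stand-in, the sketch below greedily zeroes out hidden units (boundaries) whose masking leaves a toy generator's output unchanged. All weights here are hypothetical:

```python
# Toy stand-in for selecting a small supporting set of boundaries:
# greedily drop hidden units whose masking preserves the output.
# (The paper instead optimizes Bernoulli mask parameters theta with an
# l1 penalty; this exhaustive greedy pass is only an illustration.)
W1 = [[1.0, -2.0], [0.5, 1.0], [-1.0, 0.3], [0.2, 0.1]]   # hypothetical
W2 = [[1.0, 0.0, 2.0, 0.0], [0.0, -1.0, 1.0, 0.0]]        # hypothetical

def relu(v):
    return [max(0.0, x) for x in v]

def matvec(W, v):
    return [sum(w * x for w, x in zip(row, v)) for row in W]

def G_masked(z, mask):
    # Element-wise mask m on the hidden layer: h <- h * m.
    h = [hi * mi for hi, mi in zip(relu(matvec(W1, z)), mask)]
    return matvec(W2, h)

def supporting_set(z, tol=1e-9):
    n = len(W1)
    mask = [1.0] * n
    ref = G_masked(z, mask)            # output with all boundaries active
    for i in range(n):                 # try relaxing each boundary in turn
        trial = list(mask)
        trial[i] = 0.0
        out = G_masked(z, trial)
        if max(abs(a - b) for a, b in zip(out, ref)) <= tol:
            mask = trial               # output preserved: relax this boundary
    return mask

mask = supporting_set([0.7, -0.4])     # only the units that matter stay on
```

The surviving mask entries mark the boundaries that actually support the query's output, mirroring the figures where 5 to 10 percent of the boundaries suffice.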
  • 14. Proposed Algorithm [Figure: (1) Bernoulli mask optimization finds the smallest supporting generative region in the latent space; (2) RRT on the generative boundaries explores the region and generates samples, shown on the LSUN and CelebA datasets]
  • 15. Explorative Generative Boundary Aware Sampling • Accepted Cluster 1 • Accepted Cluster 2 • Accepted Cluster 3 • Rejected Sample
  • 16. Experiment: DCGAN-MNIST (query, 𝜖-based sampling, our method)
  • 17. Experiment: PGGAN-LSUN-church (query, 𝜖-based sampling, our method)
  • 18. Experiment: PGGAN-LSUN-church (query, 𝜖-based sampling, our method)
  • 19. Experiment: PGGAN-celebA (query, 𝜖-based sampling, our method)
  • 20. Experiment: PGGAN-celebA (query, 𝜖-based sampling, our method)
  • 21. Experiment: varying the portion of active boundaries in the mask (query; >10%, <10%, <5%)
  • 22. Conclusion • We propose a new interpretable method to analyze the inside of deep generative neural networks. • Our explorative sampling method demonstrates better performance than the existing method (e.g., 𝜖-based sampling) when investigating decision regions. • Our algorithm can be extended to other types of deep neural network models (e.g., classification models).
  • 23. Thank you! Explainable Artificial Intelligence Center of Korea https://xai.kaist.ac.kr This work was supported by the Institute for Information & communications Technology Planning & Evaluation (IITP) grant funded by the Ministry of Science and ICT (MSIT), Korea (No. 2017-0-01779, XAI)

Editor's Notes

  1. Hi everyone. As introduced, I'm Giyoung Jeon. This work was done jointly with Haedong Jeong and my advisor Jaesik Choi.
  2. Generative Adversarial Networks (GANs) have shown high performance in generating realistic and diverse images. For example, Progressive GAN generates architectural buildings on LSUN and faces of celebrities on CelebA, at resolutions up to 1024 by 1024.
  3. To generate such realistic images, many nodes are involved in the generation, but the roles of those nodes in the generation process are not yet well studied. So, given a trained GAN, we wish to explain the generation process by sampling images that pass through a similar generation process with respect to a query image.
  4. There is previous work on similar problems. Network Dissection combines feature maps with a segmentation model to explain the role of each unit in CNN-based classifiers; it considers only a single unit at a time and requires a segmentation model as supervision. The same authors applied the method to GAN models and showed that some units are tied together and cooperate to generate attributes. Google Deep Dream explains an internal unit of a neural network by generating an example that maximally activates the unit; however, the generated image is synthetic, not real. The Lagrangian relaxed decision boundary approximates a complex decision boundary with a linear space using Lagrange multipliers; however, this method has blind spots where the Lagrangian cannot fit exactly.
  5. In this paper, we define 𝐺(𝑍) as a generator that produces an image from a random input 𝑍. ℎℓ is the node (neural) representation of the ℓ-th layer of the GAN. Lowercase 𝑔 denotes partial generation: for example, 𝑔_{j:i} is the generative function from layer 𝑖 to layer 𝑗.
  6. Before we formally define the problem, we define a generative boundary and a generative region, the main differences from previous methods. Given a layer ℎℓ, each node defines a generative boundary, obtained by applying a linear hyperplane to the space of the previous layer ℎℓ₋₁. Of course, the boundary is highly non-linear and non-convex with respect to the input space, where linear transformations and non-linearities combine. The hyperplane is learned, trained to fool the discriminator in a GAN. A generative region is a space surrounded, or closed, by a set of generative boundaries.
  7. Similarly, a generative region can be defined by the combination of the signs of the node values. For example, if the second node changes from positive to negative (or vice versa), the generated image falls into a different region and has different functionality.
  8. In this paper, we wish to search the generative region so that we can efficiently extract or generate images that lie in the same generative region.
  9. However, it is non-trivial to generate samples from a generative region, for two reasons. First, a conventional GAN has many base dimensions and numerous hyperplanes; for example, the 4th layer of a Progressive GAN model has 512 base dimensions and 8192 hyperplanes. Second, the generative region of interest is non-convex. Of course, one could sample directly in the (ℓ−1)-th layer, but this does not reflect the sample-generation procedure of the GAN: not every region of the (ℓ−1)-th layer is reachable from the input. Thus, when we sample from the input space 𝑍, where the actual generation takes place, the linear hyperplanes of the ℓ-th layer become highly non-linear in the input space.
  10. So here we introduce RRT, which was originally used in path-planning problems. The algorithm first samples a new point uniformly at random and finds the nearest node in the tree. To step forward, it takes a unit step from that nearest node in the direction of the random sample. If the new point does not collide with any obstacle, it is merged into the tree. For our objective, the generative boundary conditions serve as the obstacles in RRT. This explorative sampling lets us collect samples in a high-dimensional, non-convex region while guaranteeing they stay inside the region.
  12. When we use all the boundaries, the constraints are too tight to explore, and the numerous boundaries make the search costly. So we keep only the associated boundaries, which affect the output more than the negligible ones. We take an element-wise multiplication on the hidden values, which disregards some of the nodes.
  13. Then how can we choose the mask? We apply Bernoulli mask optimization: the objective reduces the masked-image reconstruction error while keeping 𝜃 as sparse as possible. The figures show examples when all boundaries are used, when 10% are used, and when 5% are used. Note that we can reduce the computational burden while maintaining the quality of the output. We call a set of boundaries that maintains the original output with the minimal number of boundaries the Smallest Supporting Generative Boundary Set.
  14. Finally, our algorithm proceeds as follows. First, we optimize the mask to reduce the burden of the large number of boundaries; this step is optional if you want to use all the boundaries. Then we apply RRT to the obtained region and gather samples that share the attributes.