Table of Contents
01 GAN EXAMPLES
02 GAN INTRODUCTION
03 GAN CODE
04 MATH BEHIND GAN
05 DIFFERENT TYPES OF GAN
GAN EXAMPLES
LvMin Zhang*, Chengze Li*, Tien-Tsin Wong, Yi Ji, and ChunPing Liu. 2018. Two-stage Sketch Colorization.
ACM Trans. Graph. 37, 6, Article 261 (November 2018), 14 pages.
https://doi.org/10.1145/3272127.3275090
GAN EXAMPLES
EdgeConnect: Generative Image Inpainting
with Adversarial Edge Learning
GAN EXAMPLES
GAN INTRODUCTION
[Diagram: z → Generator (G) → fake image → Discriminator (D) → FAKE; real image → Discriminator (D) → REAL. The Generator wants to fool the Discriminator.]
Discriminator Update
[Diagram: real image x → Discriminator (D) → D(x), which should be close to 1;
z → Generator (G) → fake image G(z) → Discriminator (D) → D(G(z)), which should be close to 0.]
Generator Update
[Diagram: z → Generator (G) → fake image G(z) → Discriminator (D) → D(G(z)), which should be close to 1.
The Generator tries to fool the Discriminator: D(G(z)) ≈ 1.]
GAN Loss
min_G max_D V(D, G) = E_{x~p_data(x)}[log D(x)] + E_{z~p_z(z)}[log(1 - D(G(z)))]
x: sample from the real data distribution; z: sample from a Gaussian (prior) distribution
[Plot: log(x) and log(1 - x)]
GAN Loss (Generator)
min_G max_D V(D, G) = E_{x~p_data(x)}[log D(x)] + E_{z~p_z(z)}[log(1 - D(G(z)))]
[Diagram: z → Generator (G) → fake image G(z) → Discriminator (D) → D(G(z)), which should be close to 1.
The Generator tries to fool the Discriminator: D(G(z)) ≈ 1.]
GAN Training Process
[Figure: how the discriminator, the generative distribution, and the true data distribution evolve during training.]
GAN PyTorch Code
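The PyTorch code referenced by these slides is not reproduced in this transcript. As a stand-in, here is a minimal sketch of the training procedure described above: a discriminator step that pushes D(x) toward 1 and D(G(z)) toward 0, and a generator step that pushes D(G(z)) toward 1. The network sizes, optimizer settings, and flattened 28x28 MNIST-style input are illustrative assumptions, not the slides' exact code.

# Minimal GAN training loop sketch in PyTorch (illustrative assumptions noted above).
import torch
import torch.nn as nn

latent_dim, img_dim = 64, 28 * 28

# Generator G: maps noise z to a fake image.
G = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, img_dim), nn.Tanh(),
)
# Discriminator D: maps an image to the probability that it is real.
D = nn.Sequential(
    nn.Linear(img_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

opt_G = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_D = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCELoss()  # -[t*log(p) + (1-t)*log(1-p)], matching the minimax objective terms

def train_step(real_imgs):
    batch = real_imgs.size(0)
    ones, zeros = torch.ones(batch, 1), torch.zeros(batch, 1)

    # Discriminator update: D(x) -> 1 for real, D(G(z)) -> 0 for fake.
    z = torch.randn(batch, latent_dim)
    fake = G(z).detach()                 # do not backprop into G during the D step
    loss_D = bce(D(real_imgs), ones) + bce(D(fake), zeros)
    opt_D.zero_grad(); loss_D.backward(); opt_D.step()

    # Generator update: try to fool D, i.e. push D(G(z)) -> 1.
    z = torch.randn(batch, latent_dim)
    loss_G = bce(D(G(z)), ones)          # non-saturating form of the generator loss
    opt_G.zero_grad(); loss_G.backward(); opt_G.step()
    return loss_D.item(), loss_G.item()

if __name__ == "__main__":
    # Dummy "real" batch in [-1, 1]; replace with an actual MNIST loader in practice.
    print(train_step(torch.rand(32, img_dim) * 2 - 1))

Note that the generator step uses the common non-saturating form (maximize log D(G(z))) rather than directly minimizing log(1 - D(G(z))).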
GAN - Global Optimality
min_G max_D V(D, G) = E_{x~p_data(x)}[log D(x)] + E_{z~p_z(z)}[log(1 - D(G(z)))]
OUR GOAL: p_G(x) = p_data(x)
GAN - Global Optimality
min_G max_D V(D, G) = E_{x~p_data(x)}[log D(x)] + E_{z~p_z(z)}[log(1 - D(G(z)))]
E_{z~p_z(z)}[log(1 - D(G(z)))] = E_{x~p_G(x)}[log(1 - D(x))]
GAN - Global Optimality
min_G max_D V(D, G) = E_{x~p_data(x)}[log D(x)] + E_{z~p_z(z)}[log(1 - D(G(z)))]
Proposition 1. For G fixed, the optimal discriminator D is
D*_G(x) = p_data(x) / (p_data(x) + p_G(x))
Proof:
max_D V(D, G) = E_{x~p_data(x)}[log D(x)] + E_{z~p_z(z)}[log(1 - D(G(z)))]
= ∫_x p_data(x) log D(x) dx + ∫_z p_z(z) log(1 - D(G(z))) dz
= ∫_x [ p_data(x) log D(x) + p_G(x) log(1 - D(x)) ] dx
≤ ∫_x max_y [ p_data(x) log(y) + p_G(x) log(1 - y) ] dx
GAN - Global Optimality
min_G max_D V(D, G) = E_{x~p_data(x)}[log D(x)] + E_{z~p_z(z)}[log(1 - D(G(z)))]
Proposition 1. For G fixed, the optimal discriminator D is
D*_G(x) = p_data(x) / (p_data(x) + p_G(x))
Proof:
For each fixed x, maximize f(y) = p_data(x) log(y) + p_G(x) log(1 - y), i.e. f(y) = a log(y) + b log(1 - y) with a, b > 0.
f'(y) = a/y - b/(1 - y) = 0  →  y = a / (a + b)
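As an illustrative numeric check (not part of the original slides), the closed-form maximizer y = a/(a+b) can be verified by brute force:

# Sanity check of Proposition 1: f(y) = a*log(y) + b*log(1-y) is maximized at y = a/(a+b).
import numpy as np

a, b = 0.7, 0.3                      # arbitrary positive weights, playing the role of p_data(x), p_G(x)
y = np.linspace(1e-4, 1 - 1e-4, 100_000)
f = a * np.log(y) + b * np.log(1 - y)

y_numeric = y[np.argmax(f)]
y_closed_form = a / (a + b)
print(y_numeric, y_closed_form)      # both ~0.7, matching D*(x) = p_data / (p_data + p_G)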
GAN - Global Optimality
min_G max_D V(D, G) = E_{x~p_data(x)}[log D(x)] + E_{z~p_z(z)}[log(1 - D(G(z)))]
Proposition 1. For G fixed, the optimal discriminator D is
D*_G(x) = p_data(x) / (p_data(x) + p_G(x))
Proof:
max_D V(D, G) = E_{x~p_data(x)}[log D(x)] + E_{z~p_z(z)}[log(1 - D(G(z)))]
= ∫_x p_data(x) log D(x) dx + ∫_z p_z(z) log(1 - D(G(z))) dz
= ∫_x [ p_data(x) log D(x) + p_G(x) log(1 - D(x)) ] dx
≤ ∫_x max_y [ p_data(x) log(y) + p_G(x) log(1 - y) ] dx
∴ D*_G(x) = p_data(x) / (p_data(x) + p_G(x))
GAN - Global Optimality
min_G max_D V(D, G) = E_{x~p_data(x)}[log D(x)] + E_{z~p_z(z)}[log(1 - D(G(z)))]
Optimal point for the Generator: p_data(x) = p_G(x)
Optimal point for the Discriminator: D*_G(x) = p_data(x) / (p_data(x) + p_G(x)) = 1/2
GAN - Global Optimality
min_G max_D V(D, G) = E_{x~p_data(x)}[log D(x)] + E_{z~p_z(z)}[log(1 - D(G(z)))]
Theorem 1. The global minimum of the virtual training criterion C(G) is achieved if and only if p_G = p_data. At that point, C(G) achieves the value -log 4.
Proof (only if):
C(G) = V(D*, G) = ∫_x p_data(x) log D*(x) dx + ∫_z p_z(z) log(1 - D*(G(z))) dz
= ∫_x [ p_data(x) log(1/2) + p_G(x) log(1 - 1/2) ] dx
= -log 2 ∫_x [ p_data(x) + p_G(x) ] dx = -2 log 2 = -log 4
GAN - Global Optimality
min_G max_D V(D, G) = E_{x~p_data(x)}[log D(x)] + E_{z~p_z(z)}[log(1 - D(G(z)))]
Theorem 1. The global minimum of the virtual training criterion C(G) is achieved if and only if p_G = p_data. At that point, C(G) achieves the value -log 4.
Proof (if):
C(G) = V(D*, G)
= ∫_x p_data(x) log( p_data(x) / (p_data(x) + p_G(x)) ) dx + ∫_x p_G(x) log( p_G(x) / (p_data(x) + p_G(x)) ) dx
= ∫_x (log 2 - log 2) p_data(x) dx + ∫_x p_data(x) log( p_data(x) / (p_data(x) + p_G(x)) ) dx
  + ∫_x (log 2 - log 2) p_G(x) dx + ∫_x p_G(x) log( p_G(x) / (p_data(x) + p_G(x)) ) dx
= -log 2 ∫_x [ p_G(x) + p_data(x) ] dx
  + ∫_x [ p_data(x) ( log 2 + log( p_data(x) / (p_data(x) + p_G(x)) ) ) + p_G(x) ( log 2 + log( p_G(x) / (p_data(x) + p_G(x)) ) ) ] dx
GAN - Global Optimality
min_G max_D V(D, G) = E_{x~p_data(x)}[log D(x)] + E_{z~p_z(z)}[log(1 - D(G(z)))]
Theorem 1. The global minimum of the virtual training criterion C(G) is achieved if and only if p_G = p_data. At that point, C(G) achieves the value -log 4.
Proof (if):
∫_x [ p_data(x) ( log 2 + log( p_data(x) / (p_data(x) + p_G(x)) ) ) + p_G(x) ( log 2 + log( p_G(x) / (p_data(x) + p_G(x)) ) ) ] dx
= ∫_x [ p_data(x) log( p_data(x) / ((p_data(x) + p_G(x)) / 2) ) + p_G(x) log( p_G(x) / ((p_data(x) + p_G(x)) / 2) ) ] dx
= KL( p_data ‖ (p_data + p_G)/2 ) + KL( p_G ‖ (p_data + p_G)/2 )
= 2 · JSD(p_data ‖ p_G)
GAN - Global Optimality
min_G max_D V(D, G) = E_{x~p_data(x)}[log D(x)] + E_{z~p_z(z)}[log(1 - D(G(z)))]
Theorem 1. The global minimum of the virtual training criterion C(G) is achieved if and only if p_G = p_data. At that point, C(G) achieves the value -log 4.
Proof (if):
C(G) = V(D*, G)
= ∫_x p_data(x) log( p_data(x) / (p_data(x) + p_G(x)) ) dx + ∫_x p_G(x) log( p_G(x) / (p_data(x) + p_G(x)) ) dx
= -log 2 ∫_x [ p_G(x) + p_data(x) ] dx
  + ∫_x [ p_data(x) ( log 2 + log( p_data(x) / (p_data(x) + p_G(x)) ) ) + p_G(x) ( log 2 + log( p_G(x) / (p_data(x) + p_G(x)) ) ) ] dx
= -log 4 + 2 · JSD(p_data ‖ p_G)
≥ -log 4, with equality (JSD = 0) exactly when p_data = p_G
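An illustrative numeric check of this identity (not part of the original slides): for any two discrete distributions, C(G) evaluated at the optimal discriminator equals -log 4 + 2·JSD(p_data ‖ p_G), and equals exactly -log 4 when the two distributions coincide.

# Numeric check of Theorem 1 on small discrete distributions.
import numpy as np

def C(p_data, p_g):
    d_star = p_data / (p_data + p_g)                     # optimal discriminator D*
    return np.sum(p_data * np.log(d_star) + p_g * np.log(1 - d_star))

def jsd(p, q):
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log(a / b))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p_data = np.array([0.2, 0.5, 0.3])
p_g    = np.array([0.4, 0.4, 0.2])
print(C(p_data, p_g), -np.log(4) + 2 * jsd(p_data, p_g))  # equal, and >= -log 4
print(C(p_data, p_data))                                   # exactly -log 4 when p_G = p_data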
DCGAN
• Replace all max pooling with strided convolutions.
• Use transposed convolutions for upsampling.
• Eliminate fully connected layers.
• Use batch normalization everywhere except the output layer of the generator and the input layer of the discriminator.
• Use ReLU in the generator, except for the output, which uses tanh.
• Use LeakyReLU in the discriminator.
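A minimal sketch of a generator that follows these guidelines; the 64x64 three-channel output and the channel widths are illustrative assumptions, not a reference implementation.

# DCGAN-style generator sketch: transposed convolutions, no pooling or fully connected
# layers, batch norm except on the output, ReLU inside and tanh at the output.
import torch
import torch.nn as nn

class DCGANGenerator(nn.Module):
    def __init__(self, latent_dim=100, ngf=64, out_channels=3):
        super().__init__()
        self.net = nn.Sequential(
            # latent vector treated as a 1x1 "image": (latent_dim, 1, 1) -> (ngf*8, 4, 4)
            nn.ConvTranspose2d(latent_dim, ngf * 8, 4, 1, 0, bias=False),
            nn.BatchNorm2d(ngf * 8), nn.ReLU(True),
            nn.ConvTranspose2d(ngf * 8, ngf * 4, 4, 2, 1, bias=False),   # -> 8x8
            nn.BatchNorm2d(ngf * 4), nn.ReLU(True),
            nn.ConvTranspose2d(ngf * 4, ngf * 2, 4, 2, 1, bias=False),   # -> 16x16
            nn.BatchNorm2d(ngf * 2), nn.ReLU(True),
            nn.ConvTranspose2d(ngf * 2, ngf, 4, 2, 1, bias=False),       # -> 32x32
            nn.BatchNorm2d(ngf), nn.ReLU(True),
            nn.ConvTranspose2d(ngf, out_channels, 4, 2, 1, bias=False),  # -> 64x64
            nn.Tanh(),                                                   # no batch norm on the output layer
        )

    def forward(self, z):
        return self.net(z.view(z.size(0), -1, 1, 1))

if __name__ == "__main__":
    print(DCGANGenerator()(torch.randn(4, 100)).shape)  # torch.Size([4, 3, 64, 64])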
DCGAN - walking in the latent space
Conditional GAN
Limitation of the vanilla GAN: images are generated at random, with no control over what is produced.
How can we overcome this?
Conditional GAN
min_G max_D V(D, G) = E_{x~p_data(x)}[log D(x)] + E_{z~p_z(z)}[log(1 - D(G(z)))]
min_G max_D V(D, G) = E_{x~p_data(x)}[log D(x|y)] + E_{z~p_z(z)}[log(1 - D(G(z|y)))]
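A minimal sketch of the conditioning idea, assuming one-hot class labels concatenated to the inputs of G and D (label embeddings or projection discriminators are common alternatives); the MNIST-like sizes are illustrative assumptions.

# Conditional GAN sketch: both G and D receive the class label y.
import torch
import torch.nn as nn

latent_dim, img_dim, n_classes = 64, 28 * 28, 10

G = nn.Sequential(nn.Linear(latent_dim + n_classes, 256), nn.ReLU(),
                  nn.Linear(256, img_dim), nn.Tanh())
D = nn.Sequential(nn.Linear(img_dim + n_classes, 256), nn.LeakyReLU(0.2),
                  nn.Linear(256, 1), nn.Sigmoid())

def sample(labels):
    """Generate images of the requested classes: D(G(z|y)|y)-style conditioning."""
    y = nn.functional.one_hot(labels, n_classes).float()
    z = torch.randn(labels.size(0), latent_dim)
    fake = G(torch.cat([z, y], dim=1))          # G(z | y)
    score = D(torch.cat([fake, y], dim=1))      # D(G(z | y) | y)
    return fake, score

if __name__ == "__main__":
    imgs, scores = sample(torch.tensor([3, 7]))  # ask for digits "3" and "7"
    print(imgs.shape, scores.shape)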
Pix2Pix
Pix2Pix (Generator)
[Diagram: input x → Generator (G, encoder-decoder) → G(x); L1 loss between G(x) and the target y; the Discriminator (D) scores the pair: D(x, G(x)).]
Pix2Pix (Discriminator)
[Diagram: the Discriminator (D) takes the input x together with the target y and outputs D(x, y).]
Pix2Pix - PatchGAN
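A hedged sketch of a PatchGAN-style discriminator: instead of a single scalar, it is fully convolutional and outputs a grid of real/fake scores, each covering a local patch of the (input, output) image pair. The exact layer configuration below is an assumption for illustration.

# PatchGAN-style discriminator sketch.
import torch
import torch.nn as nn

class PatchDiscriminator(nn.Module):
    def __init__(self, in_channels=6, ndf=64):   # 6 = input image + paired image, concatenated
        super().__init__()
        def block(cin, cout, stride):
            return [nn.Conv2d(cin, cout, 4, stride, 1), nn.BatchNorm2d(cout), nn.LeakyReLU(0.2)]
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, ndf, 4, 2, 1), nn.LeakyReLU(0.2),  # no batch norm on the first layer
            *block(ndf, ndf * 2, 2),
            *block(ndf * 2, ndf * 4, 2),
            *block(ndf * 4, ndf * 8, 1),
            nn.Conv2d(ndf * 8, 1, 4, 1, 1),       # one logit per receptive-field patch
        )

    def forward(self, x, y):
        # D(x, y): condition on the input image x and judge the paired image y.
        return self.net(torch.cat([x, y], dim=1))

if __name__ == "__main__":
    d = PatchDiscriminator()
    print(d(torch.randn(1, 3, 256, 256), torch.randn(1, 3, 256, 256)).shape)  # [1, 1, 30, 30]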
CycleGAN
Zhu et al. 2017
[Diagram: real image from domain A → G_AB (encoder-decoder) → fake image in domain B → Discriminator (D) → Real/Fake, compared against a real image from domain B; the fake image in domain B → G_BA (encoder-decoder) → reconstructed image in domain A.]
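A minimal sketch of the cycle-consistency loss implied by the diagram above, with toy 1x1-convolution stand-ins for the two generators (the real models are ResNet-style generators); the weight of 10 is the commonly used default.

# Cycle-consistency loss sketch: A -> B -> A and B -> A -> B reconstructions penalized with L1.
import torch
import torch.nn as nn

G_AB = nn.Conv2d(3, 3, 1)   # stand-in for the A->B generator
G_BA = nn.Conv2d(3, 3, 1)   # stand-in for the B->A generator
l1 = nn.L1Loss()

def cycle_loss(real_A, real_B, lam=10.0):
    # Forward cycle: A -> B -> A; backward cycle: B -> A -> B.
    rec_A = G_BA(G_AB(real_A))
    rec_B = G_AB(G_BA(real_B))
    return lam * (l1(rec_A, real_A) + l1(rec_B, real_B))

if __name__ == "__main__":
    print(cycle_loss(torch.randn(2, 3, 64, 64), torch.randn(2, 3, 64, 64)).item())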
EdgeConnect
EdgeConnect
Stage 1: Edge Generator
EdgeConnect
Stage 2: Image Completion Network
EdgeConnect
Stage 1: Edge Generator → Stage 2: Image Completion Network
EdgeConnect - Edge Generator
Stage 1: Edge Generator
I_gt: ground truth image; C_gt: ground truth edge map; I_gray: grayscale image; Ĩ_gray: masked grayscale image
EdgeConnect - Edge Generator
Stage 1: Edge Generator
I_gt: ground truth image
C_gt: ground truth edge map
I_gray: grayscale image
M: image mask
Ĩ_gray = I_gray ⊙ (1 - M): masked grayscale image
C̃_gt = C_gt ⊙ (1 - M): masked edge map
C_pred = G_1(Ĩ_gray, C̃_gt, M)
min_{G_1} max_{D_1} L_{G_1} = min_{G_1} ( λ_{adv,1} max_{D_1} L_{adv,1} + λ_FM L_FM )
L_{adv,1} = E_{(C_gt, I_gray)}[ log D_1(C_gt, I_gray) ] + E_{I_gray}[ log(1 - D_1(C_pred, I_gray)) ]
L_FM = E[ Σ_{i=1}^{L} (1/N_i) ‖ D_1^{(i)}(C_gt) - D_1^{(i)}(C_pred) ‖_1 ]
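A minimal sketch of the feature-matching term L_FM, using a toy stand-in for the edge discriminator D_1; only the structure of the loss (per-layer L1 distances between discriminator activations for the real and predicted edge maps) is meant to follow the formula above.

# Feature-matching loss sketch for the edge generator stage.
import torch
import torch.nn as nn

D1_layers = nn.ModuleList([
    nn.Sequential(nn.Conv2d(2, 16, 3, 2, 1), nn.LeakyReLU(0.2)),   # input: edge map + grayscale image
    nn.Sequential(nn.Conv2d(16, 32, 3, 2, 1), nn.LeakyReLU(0.2)),
    nn.Sequential(nn.Conv2d(32, 1, 3, 1, 1)),
])

def features(x):
    feats = []
    for layer in D1_layers:
        x = layer(x)
        feats.append(x)
    return feats

def l_fm(c_gt, c_pred, i_gray):
    real_feats = features(torch.cat([c_gt, i_gray], dim=1))
    fake_feats = features(torch.cat([c_pred, i_gray], dim=1))
    # sum_i (1/N_i) * || D_1^(i)(C_gt) - D_1^(i)(C_pred) ||_1
    return sum(torch.mean(torch.abs(r - f)) for r, f in zip(real_feats, fake_feats))

if __name__ == "__main__":
    c_gt, c_pred, gray = (torch.rand(1, 1, 64, 64) for _ in range(3))
    print(l_fm(c_gt, c_pred, gray).item())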
EdgeConnect - Image Completion
Stage 2: Image Completion Network
I_gt: ground truth image
C_gt: ground truth edge map
I_gray: grayscale image
M: image mask
Ĩ_gray = I_gray ⊙ (1 - M): masked grayscale image
C̃_gt = C_gt ⊙ (1 - M): masked edge map
Ĩ_gt = I_gt ⊙ (1 - M): masked color image
C_comp = C_gt ⊙ (1 - M) + C_pred ⊙ M
I_pred = G_2(Ĩ_gt, C_comp)
L_{adv,2} = E_{(I_gt, C_comp)}[ log D_2(I_gt, C_comp) ] + E_{C_comp}[ log(1 - D_2(I_pred, C_comp)) ]
L_perc = E[ Σ_i (1/N_i) ‖ φ_i(I_gt) - φ_i(I_pred) ‖_1 ]
L_style = E_j[ ‖ G_j^φ(Ĩ_pred) - G_j^φ(Ĩ_gt) ‖_1 ]
L_{G_2} = λ_{ℓ1} L_{ℓ1} + λ_{adv,2} L_{adv,2} + λ_perc L_perc + λ_style L_style
Perceptual Loss and Style Loss
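A hedged sketch of the perceptual and style losses used in Stage 2: both compare VGG-19 activations of the prediction and the target, the style loss via Gram matrices. The choice of VGG layers below is an illustrative assumption, and loading pretrained weights requires a torchvision version with the weights API (>= 0.13) plus a one-time download.

# Perceptual loss (feature L1) and style loss (Gram-matrix L1) sketch on VGG-19 features.
import torch
import torch.nn as nn
from torchvision.models import vgg19, VGG19_Weights

vgg = vgg19(weights=VGG19_Weights.IMAGENET1K_V1).features.eval()
for p in vgg.parameters():
    p.requires_grad_(False)

LAYERS = [3, 8, 17]   # relu1_2, relu2_2, relu3_4 (illustrative choice of layers)

def vgg_feats(x):
    feats, h = [], x
    for i, layer in enumerate(vgg):
        h = layer(h)
        if i in LAYERS:
            feats.append(h)
        if i == max(LAYERS):
            break
    return feats

def gram(f):
    b, c, hgt, wid = f.shape
    f = f.view(b, c, hgt * wid)
    return f @ f.transpose(1, 2) / (c * hgt * wid)   # normalized Gram matrix per sample

def perceptual_and_style(pred, target):
    fp, ft = vgg_feats(pred), vgg_feats(target)
    l_perc = sum(torch.mean(torch.abs(a - b)) for a, b in zip(fp, ft))
    l_style = sum(torch.mean(torch.abs(gram(a) - gram(b))) for a, b in zip(fp, ft))
    return l_perc, l_style

if __name__ == "__main__":
    print(perceptual_and_style(torch.rand(1, 3, 128, 128), torch.rand(1, 3, 128, 128)))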
Progressive Growing of GAN
StyleGAN
Thank You!