KaoNet v2
Face Translation using CycleGAN
October 29, 2017
Van Phu Quang Huy
Agenda
1. KaoNet recap
2. New feature: Face Translation
3. Technical Details
KaoNet recap: Face Recognition (OpenCV+CNN)
KaoNet recap: Face Generation (DCGAN)
KaoNet v2 ...
New feature: Face Translation
Khang → Huy: In (real) → Out (fake)
Huy → Khang: In (real) → Out (fake)
New feature: Face Translation
Girl → Khang: In (real) → Out (fake)
Huy → Girl: In (real) → Out (fake)
Technical Details: CycleGAN
CycleGAN
● Paper: https://arxiv.org/abs/1703.10593
● In ICCV 2017 (International Conference on Computer Vision)
  ○ October 22-29, 2017
  ○ @ Venice, Italy
CycleGAN (https://junyanz.github.io/CycleGAN/)
CycleGAN (https://junyanz.github.io/CycleGAN/)
CycleGAN (https://junyanz.github.io/CycleGAN/)
My Implementation on GitHub!
https://github.com/vanhuyz/CycleGAN-TensorFlow
Reference: Generative Adversarial Networks (GAN)
(http://guimperarnau.com/blog/2017/03/Fantastic-GANs-and-where-to-find-them)
Reference: pix2pix (https://phillipi.github.io/pix2pix/)
Reference: Neural Style Transfer (arXiv:1508.06576)
Paired and Unpaired Data
CycleGAN Overview
[Diagram: Generator X→Y (G), Generator Y→X (F), Discriminator Y (D_Y); GAN loss and cycle-consistency loss]
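As a rough illustration of the diagram above (not the author's actual code), one forward pass could be sketched in TensorFlow as follows, assuming G, F and D_Y are callables such as tf.keras.Model instances; the names here are illustrative:

    import tensorflow as tf

    def forward_cycle(G, F, D_Y, real_x):
        # G translates X -> Y, F translates back Y -> X, D_Y judges samples in domain Y
        fake_y = G(real_x)
        cycled_x = F(fake_y)
        disc_fake = D_Y(fake_y)
        # GAN loss: push D_Y's score for the translated image toward "real" (least-squares form)
        gan_loss = tf.reduce_mean(tf.square(disc_fake - 1.0))
        # cycle-consistency loss: F(G(x)) should reconstruct the original x (L1 distance)
        cycle_loss = tf.reduce_mean(tf.abs(cycled_x - real_x))
        return gan_loss, cycle_loss

The symmetric backward cycle (y → F(y) → G(F(y))) is computed the same way with D_X.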
Loss Function
● GAN Loss
● Cycle-consistency Loss
● Full Loss
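For reference, the three terms as written in the CycleGAN paper (https://arxiv.org/abs/1703.10593), in LaTeX; in practice the paper replaces the negative log-likelihood GAN objective with a least-squares loss, as noted on the training-techniques slide:

    \mathcal{L}_{\mathrm{GAN}}(G, D_Y, X, Y) =
      \mathbb{E}_{y \sim p_{\mathrm{data}}(y)}[\log D_Y(y)]
      + \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}[\log(1 - D_Y(G(x)))]

    \mathcal{L}_{\mathrm{cyc}}(G, F) =
      \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}[\| F(G(x)) - x \|_1]
      + \mathbb{E}_{y \sim p_{\mathrm{data}}(y)}[\| G(F(y)) - y \|_1]

    \mathcal{L}(G, F, D_X, D_Y) =
      \mathcal{L}_{\mathrm{GAN}}(G, D_Y, X, Y)
      + \mathcal{L}_{\mathrm{GAN}}(F, D_X, Y, X)
      + \lambda \, \mathcal{L}_{\mathrm{cyc}}(G, F)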
Network Architectures
Generator
(https://hardikbansal.github.io/CycleGANBlog/)
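A minimal Keras-style sketch of the ResNet-style generator described in the paper and the linked blog post. Layer sizes follow the paper's 256×256 configuration (c7s1-64, d128, d256, nine residual blocks, u128, u64, c7s1-3); instance normalization and reflection padding are left out here and covered on the later training-techniques slide. This is an illustrative approximation, not the code from the repository above:

    import tensorflow as tf
    from tensorflow.keras import layers

    def resnet_block(x, filters=256):
        # two 3x3 convs plus a skip connection (normalization omitted in this sketch)
        y = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
        y = layers.Conv2D(filters, 3, padding="same")(y)
        return layers.add([x, y])

    def build_generator(img_size=256, n_res_blocks=9):
        inp = layers.Input(shape=(img_size, img_size, 3))
        # encoder: one 7x7 conv, then two stride-2 downsampling convs
        x = layers.Conv2D(64, 7, strides=1, padding="same", activation="relu")(inp)
        x = layers.Conv2D(128, 3, strides=2, padding="same", activation="relu")(x)
        x = layers.Conv2D(256, 3, strides=2, padding="same", activation="relu")(x)
        # transformer: residual blocks at 256 channels
        for _ in range(n_res_blocks):
            x = resnet_block(x, 256)
        # decoder: two transposed convs upsample back to the input resolution
        x = layers.Conv2DTranspose(128, 3, strides=2, padding="same", activation="relu")(x)
        x = layers.Conv2DTranspose(64, 3, strides=2, padding="same", activation="relu")(x)
        # final 7x7 conv maps to RGB in [-1, 1]
        out = layers.Conv2D(3, 7, strides=1, padding="same", activation="tanh")(x)
        return tf.keras.Model(inp, out)

Two such generators are needed, one per direction (G: X→Y and F: Y→X).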
Reference: Residual Block (in ResNet)
(http://icml.cc/2016/tutorials/icml2016_tutorial_deep_residual_networks_kaiminghe.pdf)
Reference: Transposed Convolution (a.k.a. Deconvolution)
(https://github.com/vdumoulin/conv_arithmetic)
Discriminator
(https://hardikbansal.github.io/CycleGANBlog/)
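A matching sketch of the 70×70 PatchGAN discriminator (C64, C128, C256, C512 with 4×4 convs, LeakyReLU activations, and a final 1-channel map; no sigmoid because a least-squares loss is used, see the training-techniques slides). Again an approximation, not the repository's exact code:

    import tensorflow as tf
    from tensorflow.keras import layers

    def build_discriminator(img_size=256):
        # PatchGAN: the output is a grid of scores, each judging one overlapping
        # patch of the input, rather than a single real/fake score per image
        inp = layers.Input(shape=(img_size, img_size, 3))
        x = layers.Conv2D(64, 4, strides=2, padding="same")(inp)
        x = layers.LeakyReLU(0.2)(x)
        x = layers.Conv2D(128, 4, strides=2, padding="same")(x)
        x = layers.LeakyReLU(0.2)(x)
        x = layers.Conv2D(256, 4, strides=2, padding="same")(x)
        x = layers.LeakyReLU(0.2)(x)
        x = layers.Conv2D(512, 4, strides=1, padding="same")(x)
        x = layers.LeakyReLU(0.2)(x)
        # 1-channel patch map, left unbounded for the least-squares loss
        out = layers.Conv2D(1, 4, strides=1, padding="same")(x)
        return tf.keras.Model(inp, out)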
Training Techniques
Training Techniques in Generators
● Instance Normalization
● Reflect Padding
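A simplified TensorFlow sketch of both techniques (the instance norm here has no learnable scale or offset; function names are illustrative):

    import tensorflow as tf

    def instance_norm(x, eps=1e-5):
        # normalize each feature map of each sample on its own:
        # statistics are taken over the spatial axes only, not over the batch
        mean, var = tf.nn.moments(x, axes=[1, 2], keepdims=True)
        return (x - mean) / tf.sqrt(var + eps)

    def reflect_pad(x, pad=3):
        # mirror border pixels instead of zero-padding, which reduces
        # edge artifacts in the generated images (e.g. before a 7x7 conv)
        return tf.pad(x, [[0, 0], [pad, pad], [pad, pad], [0, 0]], mode="REFLECT")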
Training Techniques in Discriminators
● PatchGAN with fully convolutional networks
● Use least-squares loss instead of cross entropy
● Use history of generated images rather than the latest ones
● Use LeakyReLU instead of ReLU
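Sketches of two of these in TensorFlow: the least-squares discriminator loss and a small history buffer of generated images (a 50-image pool, as in the paper). Names are illustrative, not taken from the repository:

    import random
    import tensorflow as tf

    def lsgan_d_loss(d_real, d_fake):
        # least-squares loss: push D(real) toward 1 and D(fake) toward 0,
        # instead of using a sigmoid + cross-entropy objective
        return tf.reduce_mean(tf.square(d_real - 1.0)) + tf.reduce_mean(tf.square(d_fake))

    class ImagePool:
        # keeps previously generated images; with probability 0.5 the discriminator
        # is shown an old fake instead of the newest one, which stabilizes training
        def __init__(self, size=50):
            self.size = size
            self.images = []

        def query(self, image):
            if len(self.images) < self.size:
                self.images.append(image)
                return image
            if random.random() < 0.5:
                idx = random.randrange(self.size)
                old = self.images[idx]
                self.images[idx] = image
                return old
            return image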