Style GAN
Zhedong Zheng
University of Technology Sydney
2019-1-12
What?
• Multiple levels of style
• Propose a style-based GAN
• New Evaluation Methods
• Collect a larger and more varied dataset (FFHQ)
Unsupervised separation of high-level attributes: coarse, middle, and fine styles
Stochastic variation in the generated images
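Not on the slide: in the paper, stochastic variation comes from per-pixel noise added after each convolution, scaled by learned per-channel factors. A minimal sketch in PyTorch (names are illustrative):

import torch
import torch.nn as nn

class NoiseInjection(nn.Module):
    """Add per-pixel Gaussian noise, scaled by a learned per-channel weight."""
    def __init__(self, num_channels):
        super().__init__()
        self.weight = nn.Parameter(torch.zeros(1, num_channels, 1, 1))

    def forward(self, x):
        # one noise image per sample, broadcast across channels
        noise = torch.randn(x.shape[0], 1, x.shape[2], x.shape[3], device=x.device)
        return x + self.weight * noise

Resampling the noise changes fine details (e.g., exact hair placement) while pose and identity stay fixed.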
It demands …
• A network architecture that controls style
• Disentanglement (separation) of the style embedding
What?
• Multiple levels of style
• Propose a style-based GAN
• New Evaluation Methods
• Collect a larger and more varied dataset (FFHQ)
Karras, Tero, Timo Aila, Samuli Laine, and Jaakko Lehtinen. "Progressive Growing of GANs
for Improved Quality, Stability, and Variation." ICLR 2018
How?
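Not shown on the slide: the core architectural idea is to inject the intermediate latent w into every synthesis layer through adaptive instance normalization (AdaIN). A minimal sketch in PyTorch, with illustrative sizes:

import torch
import torch.nn as nn

class AdaIN(nn.Module):
    """Normalize each feature map per sample, then scale/shift it with
    parameters predicted from the style vector w (a learned affine map A)."""
    def __init__(self, style_dim, num_channels):
        super().__init__()
        self.affine = nn.Linear(style_dim, 2 * num_channels)  # w -> (scale, bias)

    def forward(self, x, w):
        # x: (batch, channels, height, width), w: (batch, style_dim)
        scale, bias = self.affine(w).chunk(2, dim=1)
        scale = scale[:, :, None, None]
        bias = bias[:, :, None, None]
        mean = x.mean(dim=(2, 3), keepdim=True)
        var = ((x - mean) ** 2).mean(dim=(2, 3), keepdim=True)
        return scale * (x - mean) / (var + 1e-8).sqrt() + bias

# usage: modulate a feature map with a style code
x = torch.randn(4, 512, 8, 8)
w = torch.randn(4, 512)
y = AdaIN(style_dim=512, num_channels=512)(x, w)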
What?
• Multiple levels of style
• Propose a style-based GAN
• New Evaluation Methods
• Collect a larger and more varied dataset (FFHQ)
FID on two datasets (BigGAN shown for comparison)
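As a reminder (not from the slide), FID compares the Gaussian statistics of Inception features of real and generated images; lower is better:

\mathrm{FID} = \lVert \mu_r - \mu_g \rVert_2^2 + \mathrm{Tr}\!\left( \Sigma_r + \Sigma_g - 2\,(\Sigma_r \Sigma_g)^{1/2} \right)

where (\mu_r, \Sigma_r) and (\mu_g, \Sigma_g) are the feature mean and covariance of the real and generated sets.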
Mixing regularization
To be specific, we run two latent codes z1, z2
through the mapping network, and have the
corresponding w1, w2 control the styles so that w1
applies before the crossover point and w2 after it.
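A minimal sketch of this style-mixing step in PyTorch; the mapping network and the number of style inputs are illustrative stand-ins, not the paper's exact configuration:

import random
import torch
import torch.nn as nn

num_layers = 14  # e.g. one style input per synthesis layer
mapping = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 512))

def mixed_styles(batch_size=4):
    # two independent latent codes through the same mapping network
    z1 = torch.randn(batch_size, 512)
    z2 = torch.randn(batch_size, 512)
    w1, w2 = mapping(z1), mapping(z2)
    crossover = random.randint(1, num_layers - 1)  # random crossover point
    # w1 drives the layers before the crossover, w2 the layers after it
    return [w1 if i < crossover else w2 for i in range(num_layers)]

styles = mixed_styles()  # one style vector per synthesis layer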
Mixing regularization
If the features are entangled…
• FID only evaluates image quality, not disentanglement.
• Interpolation becomes bad (multiple variants appear along the path).
• Linear classification of attributes becomes bad.
Perceptual path length (smoothness)
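From the StyleGAN paper (not spelled out on the slide), perceptual path length measures how much the image changes under a small perturbation of the interpolated latent, using a perceptual image distance d (VGG-based):

l_W = \mathbb{E}\left[ \tfrac{1}{\epsilon^2}\, d\big( g(\mathrm{lerp}(w_1, w_2; t)),\ g(\mathrm{lerp}(w_1, w_2; t + \epsilon)) \big) \right]

with a small \epsilon and t \sim U(0,1); in Z space, slerp replaces lerp. Lower values indicate smoother, less entangled interpolation.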
Linear separability on CelebA-HQ
1. Generate images and label their attributes with auxiliary classifiers trained on CelebA-HQ.
2. Fit a linear SVM to predict each attribute from the latent code (z or w).
3. Measure how well the SVM's predictions agree with the classifier labels.
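In the paper (not shown on the slide), the final separability score combines the per-attribute conditional entropies:

\exp\!\left( \sum_i H(Y_i \mid X_i) \right)

where X_i are the classes predicted by the linear SVM for attribute i and Y_i the classes given by the pre-trained classifier; lower means the attributes are more linearly separable in the latent space.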
What?
• Multiple levels of style
• Propose a style-based GAN
• New Evaluation Methods
• Collect a larger and more varied dataset (FFHQ)
References
• Karras, Tero, Samuli Laine, and Timo Aila. "A Style-Based Generator Architecture for Generative Adversarial Networks." arXiv preprint arXiv:1812.04948 (2018).
• Karras, Tero, Timo Aila, Samuli Laine, and Jaakko Lehtinen. "Progressive Growing of GANs for Improved Quality, Stability, and Variation." ICLR 2018.
• Brock, Andrew, Jeff Donahue, and Karen Simonyan. "Large Scale GAN Training for High Fidelity Natural Image Synthesis." ICLR 2019.
