
Progressive GAN


A discussion of the Progressive GAN paper (Karras et al., ICLR 2018).



  1. Progressive Growing of GANs for Improved Quality, Stability, and Variation. Tero Karras, Timo Aila, Samuli Laine, Jaakko Lehtinen. ICLR 2018 oral.
  2. Motivation. Comparison of generative model families (method / generated image / disadvantages):
     - Autoregressive models (PixelRNN/CNN): sharp images, but (1) slow to evaluate, since pixels are generated one at a time, and (2) no latent representation (cf. deconvolution-based generators).
     - VAEs: blurry images, caused by the information bottleneck.
     - GANs: sharp images, but (1) they only work at small resolutions and (2) they are hard to train.
  3. Why is it hard? At higher resolutions the discriminator can tell generated images from real ones more easily, and memory constraints force smaller batch sizes. The proposed remedy: grow G and D progressively.
  4. Fading in new layers: when the resolution is doubled, the new layers are blended in smoothly with a weight α that increases linearly from 0 to 1 (a sketch of the fade-in follows the slide list).
  5. Minibatch standard deviation: compute the standard deviation of the features over the minibatch, average it, and concatenate the result to the discriminator as an extra constant feature map (see the sketch after this list).
  6. Equalized learning rate: weights are initialized from N(0, 1) and rescaled at runtime by the per-layer constant from the He et al. (2015) initializer (sketch after the list).
  7. Pixelwise feature vector normalization in G: applied after every conv layer in the generator. Like batch norm, but it normalizes the feature vector of each pixel independently: b_{x,y} = a_{x,y} / sqrt((1/N) Σ_j (a^j_{x,y})² + ε), where N is the number of feature maps. Compare instance norm, which uses (a − mean(a)) / std(a). A sketch follows the slide list.
  8. Assessment: MS-SSIM captures global image statistics and is computed over generated images only. The proposed metric targets local image structure and compares generated against real images: extract SIFT-like patch descriptors and compare their distributions with the sliced Wasserstein distance (Rabin et al., 2011); a sketch of the distance follows the slide list.
  9. Details: the base loss is the improved WGAN (WGAN-GP) loss (a sketch follows the slide list); the model architecture is shown on the right-hand side of the slide.
  10. Related work: Laplacian GAN (LAPGAN).
  11. Generative Multi-Adversarial Networks.
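
Fade-in of new layers (slide 4): a minimal sketch of the smooth blending, assuming PyTorch. The names generator_fade_in, old_to_rgb, new_block, and new_to_rgb are illustrative, and new_block is assumed to upsample its input to the doubled resolution internally.

    import torch.nn.functional as F

    def generator_fade_in(features, old_to_rgb, new_block, new_to_rgb, alpha):
        # features: generator activations at the previous (lower) resolution.
        # Old pathway: convert to RGB at the low resolution, then upsample 2x.
        old_rgb = F.interpolate(old_to_rgb(features), scale_factor=2, mode="nearest")
        # New pathway: run the freshly added block, which doubles the resolution.
        new_rgb = new_to_rgb(new_block(features))
        # alpha ramps linearly from 0 to 1 while the new layers are trained in.
        return (1.0 - alpha) * old_rgb + alpha * new_rgb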
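
Minibatch standard deviation (slide 5): a sketch in PyTorch, assuming discriminator activations of shape (N, C, H, W) and showing the simplest single-statistic variant.

    import torch

    def minibatch_stddev(x, eps=1e-8):
        # Standard deviation of every feature over the minibatch ...
        std = torch.sqrt(x.var(dim=0, unbiased=False) + eps)      # (C, H, W)
        # ... averaged into one scalar ...
        mean_std = std.mean()
        # ... replicated into a single constant feature map and concatenated.
        feat = mean_std.repeat(x.size(0), 1, x.size(2), x.size(3))
        return torch.cat([x, feat], dim=1)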
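
Equalized learning rate (slide 6): a sketch of a conv layer that stores its weights with a plain N(0, 1) initialization and rescales them at runtime by the He et al. (2015) constant. EqualizedConv2d is an illustrative name, not the paper's code.

    import math
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class EqualizedConv2d(nn.Module):
        def __init__(self, in_ch, out_ch, kernel_size, padding=0):
            super().__init__()
            # Weights keep a unit-variance initialization ...
            self.weight = nn.Parameter(torch.randn(out_ch, in_ch, kernel_size, kernel_size))
            self.bias = nn.Parameter(torch.zeros(out_ch))
            # ... and are scaled per layer by sqrt(2 / fan_in) at runtime,
            # which equalizes the effective learning rate across layers.
            fan_in = in_ch * kernel_size * kernel_size
            self.scale = math.sqrt(2.0 / fan_in)
            self.padding = padding

        def forward(self, x):
            return F.conv2d(x, self.weight * self.scale, self.bias, padding=self.padding)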
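
Pixelwise feature vector normalization (slide 7): a sketch in PyTorch of b_{x,y} = a_{x,y} / sqrt((1/N) Σ_j (a^j_{x,y})² + ε).

    import torch
    import torch.nn as nn

    class PixelNorm(nn.Module):
        # Normalizes the feature vector of every pixel to roughly unit length;
        # the mean of squares runs over the N feature maps (channel axis).
        def __init__(self, eps=1e-8):
            super().__init__()
            self.eps = eps

        def forward(self, x):                  # x: (N, C, H, W)
            return x * torch.rsqrt(x.pow(2).mean(dim=1, keepdim=True) + self.eps)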
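
Sliced Wasserstein distance (slide 8): a NumPy sketch of the random-projection approximation from Rabin et al. (2011), applied to two equally sized sets of patch descriptors; the descriptor extraction itself is omitted.

    import numpy as np

    def sliced_wasserstein(A, B, n_projections=128, seed=0):
        # A, B: (n_descriptors, dim) arrays of patch descriptors from
        # generated and real images; both sets must have the same size.
        rng = np.random.default_rng(seed)
        # Random unit directions for the 1-D projections.
        dirs = rng.normal(size=(A.shape[1], n_projections))
        dirs /= np.linalg.norm(dirs, axis=0, keepdims=True)
        # Project, sort each 1-D marginal, and average the pointwise gap.
        pa = np.sort(A @ dirs, axis=0)
        pb = np.sort(B @ dirs, axis=0)
        return np.mean(np.abs(pa - pb))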
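
Improved WGAN loss (slide 9): a sketch of the WGAN-GP critic loss in PyTorch, where D is any critic network and lambda_gp is the gradient-penalty weight.

    import torch

    def wgan_gp_critic_loss(D, real, fake, lambda_gp=10.0):
        # Wasserstein terms: push D(real) up and D(fake) down.
        d_real = D(real).mean()
        d_fake = D(fake).mean()
        # Gradient penalty on random interpolates between real and fake samples.
        eps = torch.rand(real.size(0), 1, 1, 1, device=real.device)
        mix = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
        grad = torch.autograd.grad(D(mix).sum(), mix, create_graph=True)[0]
        gp = ((grad.flatten(1).norm(2, dim=1) - 1.0) ** 2).mean()
        return d_fake - d_real + lambda_gp * gp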
