1. The document discusses Wasserstein GAN (WGAN), a generative adversarial network (GAN) variant that minimizes the Wasserstein (earth-mover) distance rather than the Jensen-Shannon divergence. Because the Wasserstein distance remains meaningful and provides usable gradients even when the real and generated distributions have disjoint supports, WGAN trains more stably than traditional GANs.
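Concretely, WGAN works with the Kantorovich-Rubinstein dual form of the Wasserstein distance, a supremum over 1-Lipschitz functions f:

```latex
W(P_r, P_g) \;=\; \sup_{\|f\|_L \le 1} \;
  \mathbb{E}_{x \sim P_r}[f(x)] \;-\; \mathbb{E}_{x \sim P_g}[f(x)]
```

Here P_r and P_g denote the real and generated distributions; the critic network plays the role of f.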
2. WGAN trains the discriminator (renamed the "critic") to estimate the Wasserstein distance between the real and generated distributions, rather than to classify samples as real or fake. The critic must be (approximately) 1-Lipschitz, which the original paper enforces by clipping its weights to a small range after each update; the gradient of the critic's estimate is then used to train the generator.
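The training loop above can be sketched in a deliberately tiny setting. Everything here is an illustrative assumption, not from the source: 1-D real data from N(2, 1), a shift-only generator g(z) = theta + z, a linear critic f(x) = w·x, and hand-derived gradients, so the example runs without a deep-learning library. The weight-clipping range and the 5 critic steps per generator step follow the recipe described in the original WGAN paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative, not the paper's architecture):
# real data ~ N(2, 1); generator g(z) = theta + z with z ~ N(0, 1);
# critic f(x) = w * x, kept Lipschitz-bounded by weight clipping.
theta = -2.0           # generator parameter (a shift)
w = 0.0                # critic parameter
clip = 0.01            # weight-clipping range from the WGAN recipe
lr_c, lr_g = 0.05, 0.5
n_critic = 5           # critic updates per generator update

for step in range(2000):
    for _ in range(n_critic):
        real = rng.normal(2.0, 1.0, 256)
        fake = theta + rng.normal(0.0, 1.0, 256)
        # Ascend E[f(real)] - E[f(fake)]; for f(x) = w*x the
        # gradient w.r.t. w is mean(real) - mean(fake).
        w += lr_c * (real.mean() - fake.mean())
        w = float(np.clip(w, -clip, clip))  # enforce the Lipschitz constraint
    # Generator descends -E[f(g(z))] = -w * (theta + E[z]);
    # its gradient w.r.t. theta is -w.
    theta -= lr_g * (-w)

print(round(theta, 2))  # theta drifts toward the real mean (2.0)
```

The critic's clipped weight acts as a signed, bounded signal: while the generated mean is below the real mean, w saturates at +clip and pushes theta upward, and the drift reverses if theta overshoots.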
3. WGAN training is robust to architectural choices that destabilize standard GANs, reducing the reliance on stabilization tricks such as batch normalization or minibatch discrimination. Its critic loss also correlates with sample quality, giving a usable training signal. WGAN has stronger theoretical grounding than the standard GAN objective and often produces samples of comparable or higher quality than carefully tuned baselines such as DCGAN.