Variational inference with implicit distributions aims to relax the strong parametric assumptions of traditional variational inference. An implicit distribution can capture more complex posteriors because it is defined not by a tractable density but by a sampling procedure: one can draw samples and differentiate through them, even though the density itself cannot be evaluated. The key challenge is that the evidence lower bound (ELBO) can then no longer be computed directly and must be approximated via density log-ratios. The paper proposes several ways to optimize this approximate ELBO: prior-contrastive forms, adversarial training of a ratio estimator in the style of generative adversarial networks, and learning from denoiser gradients.
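
To make the adversarial, prior-contrastive idea concrete, the sketch below (in PyTorch; the network names, sizes, and the unit-variance Gaussian likelihood are assumptions for illustration, not the paper's implementation) trains a discriminator T(x, z) to separate posterior samples from prior samples, so that at its optimum T(x, z) approximates the log-ratio log q(z|x) - log p(z). The model step then maximizes the approximate ELBO E_q[log p(x|z) - T(x, z)], with the discriminator standing in for the intractable KL term.

```python
# Minimal sketch of an adversarially estimated, prior-contrastive ELBO.
# All architectures and hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

x_dim, z_dim, noise_dim, hidden = 4, 2, 2, 64

# Implicit posterior q(z|x): push x and fresh noise through a network.
# Its density is unknown, but samples are differentiable w.r.t. its parameters.
sampler = nn.Sequential(nn.Linear(x_dim + noise_dim, hidden), nn.ReLU(),
                        nn.Linear(hidden, z_dim))

# Decoder for p(x|z): a unit-variance Gaussian likelihood (assumption).
decoder = nn.Sequential(nn.Linear(z_dim, hidden), nn.ReLU(),
                        nn.Linear(hidden, x_dim))

# Discriminator T(x, z): high on (x, z ~ q(z|x)), low on (x, z ~ p(z)).
# At its optimum, T(x, z) approximates log q(z|x) - log p(z).
disc = nn.Sequential(nn.Linear(x_dim + z_dim, hidden), nn.ReLU(),
                     nn.Linear(hidden, 1))

opt_model = torch.optim.Adam(list(sampler.parameters()) +
                             list(decoder.parameters()), lr=1e-3)
opt_disc = torch.optim.Adam(disc.parameters(), lr=1e-3)

def sample_q(x):
    eps = torch.randn(x.shape[0], noise_dim)
    return sampler(torch.cat([x, eps], dim=1))

x = torch.randn(128, x_dim)  # stand-in data batch

for step in range(1000):
    # 1) Discriminator step: classify posterior samples vs. prior samples.
    z_q = sample_q(x).detach()
    z_p = torch.randn(x.shape[0], z_dim)  # samples from the prior p(z)
    logits_q = disc(torch.cat([x, z_q], dim=1))
    logits_p = disc(torch.cat([x, z_p], dim=1))
    d_loss = (F.binary_cross_entropy_with_logits(logits_q, torch.ones_like(logits_q)) +
              F.binary_cross_entropy_with_logits(logits_p, torch.zeros_like(logits_p)))
    opt_disc.zero_grad(); d_loss.backward(); opt_disc.step()

    # 2) Model step: maximize the approximate ELBO E_q[log p(x|z) - T(x, z)],
    #    using the discriminator in place of the intractable log-ratio.
    z_q = sample_q(x)
    recon = decoder(z_q)
    log_lik = -0.5 * ((x - recon) ** 2).sum(dim=1)  # Gaussian log-likelihood up to a constant
    elbo = (log_lik - disc(torch.cat([x, z_q], dim=1)).squeeze(1)).mean()
    opt_model.zero_grad(); (-elbo).backward(); opt_model.step()
```

The alternating updates mirror GAN training: the ratio estimator is refit as the implicit posterior moves, and the posterior is updated against the current estimate; the denoiser-based variant mentioned above would replace the discriminator's log-ratio estimate with gradients recovered from a denoising network.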