This document presents auto-encoding variational Bayes (AEVB), a method for training generative models with latent variables. Because the true posterior p(z|x) is intractable, the method approximates it with a variational distribution q(z|x) produced by an inference network. It maximizes a variational lower bound (the ELBO) on the marginal log-likelihood, which is equivalent to minimizing the KL divergence KL(q(z|x) || p(z|x)) between the approximate and true posteriors. The reparameterization trick makes the sampling step differentiable, so the bound can be optimized by backpropagating through stochastic nodes. The resulting model, the variational autoencoder, jointly learns to generate data and to infer a latent representation. Experiments show that it generates realistic samples and achieves better held-out likelihood than competing methods.
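A minimal sketch of the two ingredients named above, using NumPy rather than a deep-learning framework: the reparameterization z = mu + sigma * eps with eps ~ N(0, I), and the closed-form KL divergence between a diagonal Gaussian q(z|x) and a standard-normal prior that appears in the ELBO. The function names and the two-dimensional example values are illustrative, not from the source.

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    # z = mu + sigma * eps, eps ~ N(0, I).
    # Randomness is isolated in eps, so gradients can flow
    # through mu and log_var in an autodiff framework.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def gaussian_kl(mu, log_var):
    # Closed-form KL( N(mu, diag(sigma^2)) || N(0, I) ):
    # -0.5 * sum(1 + log sigma^2 - mu^2 - sigma^2)
    return -0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var))

rng = np.random.default_rng(0)
mu = np.array([0.0, 1.0])        # illustrative encoder outputs
log_var = np.array([0.0, 0.0])   # sigma = 1 in both dimensions

z = reparameterize(mu, log_var, rng)   # one sample of the latent code
kl = gaussian_kl(mu, log_var)
print(kl)  # 0.5
```

In a full VAE, the negative ELBO to minimize would be this KL term plus the reconstruction loss of a decoder evaluated at z; here only the two reusable pieces are shown.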