This document proposes applying stochastic gradient variational Bayes (SGVB) to latent Dirichlet allocation (LDA) topic modeling to obtain efficient posterior estimates. SGVB introduces randomness into variational inference for LDA by estimating expectations via Monte Carlo integration and by using the reparameterization trick to sample from the approximate posterior. Evaluation on several text corpora shows perplexities comparable to those of existing LDA inference methods, while the stochastic formulation lends itself to parallelization, for example on GPUs. Future work will explore applying SGVB to other probabilistic document models, such as correlated topic models.
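
To illustrate the core mechanics described above, here is a minimal sketch of reparameterized sampling and Monte Carlo expectation estimation. It assumes a logistic-normal approximation to the document-topic posterior (a common workaround, since the Dirichlet admits no direct reparameterization); the variational parameters `mu` and `log_sigma` are illustrative names, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_topic_proportions(mu, log_sigma, n_samples=1):
    """Reparameterized sampling of document-topic proportions.

    Approximates the posterior over topic proportions with a
    logistic-normal: draw eps ~ N(0, I), form z = mu + sigma * eps,
    and map z through a softmax. Because the noise eps is separated
    from the parameters, gradients flow through mu and log_sigma.
    (mu and log_sigma are hypothetical variational parameters.)
    """
    eps = rng.standard_normal((n_samples, mu.shape[0]))
    z = mu + np.exp(log_sigma) * eps           # reparameterization trick
    z = z - z.max(axis=1, keepdims=True)       # softmax numerical stability
    theta = np.exp(z)
    return theta / theta.sum(axis=1, keepdims=True)

# Monte Carlo estimate of an expectation under the approximate posterior,
# here E[theta] for a 5-topic model with symmetric parameters:
mu = np.zeros(5)
log_sigma = np.full(5, -1.0)
theta_samples = sample_topic_proportions(mu, log_sigma, n_samples=1000)
print(theta_samples.mean(axis=0))
```

In practice the Monte Carlo samples would feed a stochastic estimate of the evidence lower bound, whose gradient with respect to `mu` and `log_sigma` drives the optimization.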