This document explores normalizing flows and mixture models for automatic variational inference with latent categorical variables. It introduces discrete normalizing flows, which support both sampling from and evaluating the probabilities of categorical distributions, and proposes mixtures of discrete flows to better approximate complex categorical distributions. It then outlines ideas for specifying multivariate categorical distributions and for evaluating probabilities in the entropy term of the variational objective, and closes with a list of planned experiments, including Gaussian mixtures, Bayesian networks, hidden Markov models, and variational autoencoders.
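The core mechanics can be illustrated with a minimal sketch. The key property of a discrete flow is that it is a bijection on a finite set, so probabilities transfer exactly (there is no Jacobian term as in the continuous case): sampling pushes base samples through the map, and probability evaluation inverts it. The modular-shift transform, the component weights, and all function names below are illustrative assumptions, not the document's own construction.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 5  # number of categories

# Base categorical distribution (arbitrary logits for illustration).
base_logits = rng.normal(size=K)
base_probs = np.exp(base_logits) / np.exp(base_logits).sum()

# Discrete flow layer: a bijection on {0, ..., K-1}.
# Here, a modular shift y = (x + s) mod K (an assumed, simple choice).
shift = 2

def flow_forward(x):
    return (x + shift) % K

def flow_inverse(y):
    return (y - shift) % K

def sample(n):
    # Sample from the base categorical, then push through the flow.
    x = rng.choice(K, size=n, p=base_probs)
    return flow_forward(x)

def log_prob(y):
    # Exact evaluation: invert the flow, look up the base probability.
    # No Jacobian correction is needed for a discrete bijection.
    return np.log(base_probs[flow_inverse(y)])

# Mixture of discrete flows: each component uses a different shift,
# giving a richer family of categorical distributions.
shifts = np.array([0, 2])
weights = np.array([0.3, 0.7])

def mixture_log_prob(y):
    y = np.asarray(y)
    # Component log-probabilities, shape (..., n_components).
    comp_logp = np.log(base_probs[(y[..., None] - shifts) % K])
    return np.log(np.sum(weights * np.exp(comp_logp), axis=-1))
```

Because a single bijection merely permutes the base probabilities, the mixture is what adds expressive power: it can place mass in patterns no single permutation of the base distribution can realize.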