This document discusses the Expectation-Maximization (EM) algorithm. EM is an iterative method for finding maximum likelihood estimates of parameters in probabilistic models that involve unobserved latent variables; it is commonly applied to Gaussian mixture distributions. The algorithm alternates between an expectation (E) step, which computes the expected value of the complete-data log likelihood with respect to the conditional distribution of the latent variables given the observed data and the current parameter estimates, and a maximization (M) step, which chooses new parameter estimates that maximize this expected log likelihood. Each iteration is guaranteed not to decrease the observed-data log likelihood: Jensen's inequality and the EM theorem show that the sequence of likelihood values is monotonically non-decreasing, so under mild regularity conditions the iterates converge to a stationary point of the likelihood function, typically a local maximum.
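As a concrete illustration of the E-step/M-step alternation described above, here is a minimal sketch of EM for a one-dimensional, two-component Gaussian mixture, written in Python with NumPy. The source does not supply code, so the function name em_gmm_1d, the initialization scheme, and the fixed iteration count are all illustrative assumptions, not a reference implementation.

```python
import numpy as np

def em_gmm_1d(x, n_components=2, n_iters=100, seed=0):
    """Illustrative EM for a 1-D Gaussian mixture (not a reference implementation)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    # Initialize mixing weights, means, and variances (simple heuristic init).
    pi = np.full(n_components, 1.0 / n_components)
    mu = rng.choice(x, size=n_components, replace=False)
    var = np.full(n_components, np.var(x))
    for _ in range(n_iters):
        # E-step: posterior responsibility of each component for each point,
        # i.e. the conditional distribution of the latent component labels
        # given the data and the current parameter estimates.
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = pi * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances from the
        # responsibility-weighted data (no variance floor, for brevity).
        nk = resp.sum(axis=0)
        pi = nk / n
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# Example: recover the components of a synthetic two-component mixture.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2.0, 1.0, 500), rng.normal(3.0, 0.5, 500)])
print(em_gmm_1d(data))
```

Because each M-step maximizes the expected log likelihood computed in the preceding E-step, the observed-data log likelihood of the fitted mixture can only increase or stay constant from one iteration to the next, which is exactly the monotonicity property established via Jensen's inequality in the paragraph above.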