This document provides mathematical formulas and explanations for many machine learning and statistical concepts. Some of the key concepts covered include (representative formulas are sketched after this list):
- Naive Bayes classifier formulas for calculating the posterior probability.
- Bayes optimal classifier and maximum a posteriori (MAP) estimation.
- Mixture models, mixtures of Gaussians, and the EM algorithm.
- Derivative rules, including the product, chain, and sum rules.
- Statistical concepts such as variance, standard deviation, covariance, and confidence intervals.
- Linear and logistic regression formulas.
- Decision trees and information gain.
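For illustration, the Naive Bayes and MAP entries typically take a form like the following (a standard statement; the sheet's exact notation may differ, with $y$ a class label and $x_1, \dots, x_n$ the features):

```latex
% Naive Bayes posterior under the conditional-independence assumption
P(y \mid x_1, \dots, x_n) = \frac{P(y) \prod_{i=1}^{n} P(x_i \mid y)}{P(x_1, \dots, x_n)}

% MAP (maximum a posteriori) decision rule; the evidence term can be dropped
\hat{y}_{\mathrm{MAP}} = \arg\max_{y} \; P(y) \prod_{i=1}^{n} P(x_i \mid y)
```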
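The mixture-of-Gaussians and EM entries are usually stated as a mixture density plus the E-step responsibilities (again, symbols here are illustrative):

```latex
% Gaussian mixture density with K components and mixing weights \pi_k that sum to 1
p(x) = \sum_{k=1}^{K} \pi_k \, \mathcal{N}(x \mid \mu_k, \Sigma_k)

% E-step: responsibility of component k for data point x_n
\gamma(z_{nk}) = \frac{\pi_k \, \mathcal{N}(x_n \mid \mu_k, \Sigma_k)}{\sum_{j=1}^{K} \pi_j \, \mathcal{N}(x_n \mid \mu_j, \Sigma_j)}

% M-step (mean update): N_k = \sum_n \gamma(z_{nk})
\mu_k = \frac{1}{N_k} \sum_{n=1}^{N} \gamma(z_{nk}) \, x_n
```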
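The derivative rules referenced above are the standard ones:

```latex
% Sum, product, and chain rules of differentiation
(f + g)'(x) = f'(x) + g'(x)
(f \cdot g)'(x) = f'(x)\,g(x) + f(x)\,g'(x)
(f \circ g)'(x) = f'(g(x))\,g'(x)
```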
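The statistical entries presumably follow the usual sample definitions; a representative sketch (assuming a normal-approximation interval for the mean) is:

```latex
% Sample variance and standard deviation
s^2 = \frac{1}{n-1} \sum_{i=1}^{n} (x_i - \bar{x})^2, \qquad s = \sqrt{s^2}

% Sample covariance between x and y
\mathrm{cov}(x, y) = \frac{1}{n-1} \sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})

% Confidence interval for the mean, with z_{\alpha/2} the normal critical value
\bar{x} \pm z_{\alpha/2} \, \frac{s}{\sqrt{n}}
```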
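The regression entries commonly appear as the linear model with its least-squares solution and the logistic (sigmoid) model; the sheet's parameterization may differ:

```latex
% Linear regression model and ordinary least squares estimate
y = X\beta + \varepsilon, \qquad \hat{\beta} = (X^{\top} X)^{-1} X^{\top} y

% Logistic regression: probability of the positive class via the sigmoid
P(y = 1 \mid x) = \sigma(w^{\top} x + b) = \frac{1}{1 + e^{-(w^{\top} x + b)}}
```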
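Finally, the decision-tree entry on information gain is conventionally defined via entropy:

```latex
% Entropy of the class distribution in set S
H(S) = -\sum_{c} p_c \log_2 p_c

% Information gain from splitting S on attribute A with values v
IG(S, A) = H(S) - \sum_{v \in \mathrm{values}(A)} \frac{|S_v|}{|S|} \, H(S_v)
```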
The document serves as a reference sheet, concisely defining many foundational techniques in just a few sentences each.