This document discusses key concepts in probability and information theory, including:
- Random variables that take on discrete or continuous values, described by probability mass functions or probability density functions, respectively.
- Common probability distributions such as the Bernoulli, Gaussian, and exponential distributions (the Gaussian density is recalled below for reference).
- Information theory concepts such as entropy and Kullback-Leibler divergence (standard definitions are recalled after this list), and how to structure probabilistic models as directed or undirected graphs.
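For reference, the univariate Gaussian (normal) density with mean $\mu$ and variance $\sigma^2$ is conventionally written as

$$
\mathcal{N}(x;\mu,\sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right).
$$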
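Similarly, for a discrete random variable with distribution $P$, the Shannon entropy and the Kullback-Leibler divergence between distributions $P$ and $Q$ over the same space are standardly defined as

$$
H(P) = -\sum_x P(x)\,\log P(x), \qquad
D_{\mathrm{KL}}(P \,\|\, Q) = \sum_x P(x)\,\log \frac{P(x)}{Q(x)},
$$

with the notation here chosen for illustration rather than taken from the summarized document.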