This document introduces information theory, defining key concepts such as entropy, marginal entropy, joint entropy, and conditional entropy, and illustrating them with examples such as coin flips. Entropy quantifies the average amount of information, or uncertainty, in a random variable; joint entropy extends this to the uncertainty of two or more variables considered together; and conditional entropy measures the uncertainty that remains in one variable once another has been observed.
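To make these quantities concrete, the standard discrete definitions and the fair-coin example can be sketched as follows (the symbols X, Y, p, and H, and the choice of base-2 logarithm measuring information in bits, are conventional assumptions rather than notation taken from this document):

H(X) = -\sum_{x} p(x)\,\log_2 p(x)

H(X, Y) = -\sum_{x, y} p(x, y)\,\log_2 p(x, y)

H(Y \mid X) = H(X, Y) - H(X)

For a single fair coin flip, H(X) = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2} = 1 bit, and for two independent fair flips the joint entropy is 2 bits while the conditional entropy of the second flip given the first remains 1 bit, since observing the first flip tells us nothing about the second.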