This document discusses information theory concepts including information, entropy, joint entropy, conditional entropy, and mutual information, and provides definitions and formulas for quantifying information in terms of probability. Key points include: the information content of an outcome decreases as its probability increases, so rarer events carry more information; the entropy H(X) quantifies the average information of a source; the joint entropy H(X,Y) quantifies the total information of two related sources; and the conditional entropy H(X|Y) measures the uncertainty about one source that remains once the other is known. Several worked examples illustrate how to calculate these quantities.
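As a minimal sketch of how these quantities relate, the Python example below computes H(X), H(Y), H(X,Y), H(X|Y), and the mutual information I(X;Y) for a small joint distribution. The probabilities and variable names are made-up illustrations, not values taken from the document; only the standard identities H(X|Y) = H(X,Y) - H(Y) and I(X;Y) = H(X) - H(X|Y) are assumed.

```python
import math

# Hypothetical toy joint distribution p(x, y) over two binary sources X and Y
# (values chosen only for illustration; they are not from the document).
p_xy = {
    ("x0", "y0"): 0.40,
    ("x0", "y1"): 0.10,
    ("x1", "y0"): 0.15,
    ("x1", "y1"): 0.35,
}

def entropy(dist):
    """Shannon entropy in bits: H = -sum p * log2(p), skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal distributions p(x) and p(y), obtained by summing out the other variable.
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

H_X = entropy(p_x)           # H(X): average information of source X
H_Y = entropy(p_y)           # H(Y): average information of source Y
H_XY = entropy(p_xy)         # H(X,Y): joint entropy of the pair
H_X_given_Y = H_XY - H_Y     # H(X|Y) = H(X,Y) - H(Y): uncertainty about X once Y is known
I_XY = H_X - H_X_given_Y     # I(X;Y) = H(X) - H(X|Y): information Y provides about X

print(f"H(X)   = {H_X:.4f} bits")
print(f"H(Y)   = {H_Y:.4f} bits")
print(f"H(X,Y) = {H_XY:.4f} bits")
print(f"H(X|Y) = {H_X_given_Y:.4f} bits")
print(f"I(X;Y) = {I_XY:.4f} bits")
```

Because the two sources in this toy distribution are statistically dependent, I(X;Y) comes out greater than zero; for independent sources it would be exactly zero and H(X,Y) would equal H(X) + H(Y).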