This document discusses information theory and entropy. It begins from the idea that information can be measured and compared, noting that a surface measure such as the number of letters in a message does not capture how informative it is. Shannon's information theory instead introduced entropy to quantify the uncertainty in a data source: a symbol that occurs with probability p carries -log2(p) bits of information, and entropy is the expected value of this information content per symbol. By Shannon's source coding theorem, entropy equals the minimum average number of bits per symbol needed to encode the source's output. Higher entropy therefore means more uncertainty in the data and more bits needed to encode it. Examples are provided to illustrate entropy calculations for binary and non-binary sources.
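The document's own worked examples are not reproduced here, but a minimal sketch of the kind of entropy calculation it describes might look like the following. The function name and the specific probability distributions are illustrative assumptions, not taken from the source.

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits per symbol.

    probs: the probabilities of the source's symbols; assumed to sum to 1.
    Terms with p == 0 contribute nothing (p * log2(p) -> 0 as p -> 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Binary source: a fair coin is maximally uncertain at 1 bit/symbol.
print(entropy([0.5, 0.5]))   # 1.0

# Biased binary source: less uncertainty, so fewer bits are needed.
print(entropy([0.9, 0.1]))   # ~0.469

# Non-binary source: 4 equally likely symbols require 2 bits/symbol.
print(entropy([0.25] * 4))   # 2.0

# Degenerate source: a certain outcome carries no information.
print(entropy([1.0]))        # 0.0
```

The printed values match the intuition stated above: the more predictable the source, the lower its entropy and the fewer bits per symbol an optimal encoding requires.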