The document discusses information theory and defines the information content of symbols. The information content I(x_i) of a symbol x_i from an alphabet is defined as the logarithm of the reciprocal of that symbol's probability, I(x_i) = log(1/P(x_i)) = -log P(x_i), so less probable symbols carry more information. The average information over all symbols, H(X) = sum over i of P(x_i) log(1/P(x_i)), is called the entropy of the source. Entropy measures the uncertainty of the source and is maximum when all symbols are equally likely. The information rate of a source is R = r H(X), its entropy multiplied by the symbol rate r. Examples are provided to illustrate these concepts.
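As a minimal sketch of these definitions (the probability distributions, the symbol rate of 1000 symbols/s, and the function names below are illustrative assumptions, not the document's own examples):

```python
import math

def self_information(p, base=2):
    # Information content I(x) = log(1/p) = -log(p); in bits when base = 2.
    return -math.log(p, base)

def entropy(probs, base=2):
    # Entropy H(X) = sum_i p_i * log(1/p_i): the average information per symbol.
    return sum(p * self_information(p, base) for p in probs if p > 0)

def information_rate(probs, symbol_rate, base=2):
    # Information rate R = r * H(X); bits per second for base 2 and r in symbols/s.
    return symbol_rate * entropy(probs, base)

# Hypothetical four-symbol source: equally likely symbols maximize the entropy
# (2 bits/symbol for 4 symbols); a skewed distribution gives less.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.5, 0.25, 0.125, 0.125]

print(entropy(uniform))                # 2.0 bits/symbol (the maximum)
print(entropy(skewed))                 # 1.75 bits/symbol
print(information_rate(skewed, 1000))  # 1750.0 bits/s at r = 1000 symbols/s
```

The printed values illustrate both claims in the text: the uniform distribution attains the maximum entropy for its alphabet size, and the rate R scales the per-symbol entropy by how fast symbols are emitted.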