This document contains solved examples related to information theory. It begins by calculating the information rate of a telegraph source that emits dots and dashes. It then works out the entropy, message rate, and information rate of a PCM voice signal quantized into 16 levels, followed by the source entropy and information rate of a source that generates one of four messages. Finally, it constructs the Shannon-Fano code for a source with five symbols of varying probabilities.
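Since the document's own numerical data are not reproduced in this summary, the short Python sketch below only illustrates the kind of calculation involved: it computes the entropy of a discrete source and builds a Shannon-Fano code. The five-symbol probabilities (0.4, 0.2, 0.2, 0.1, 0.1) are assumed for illustration and are not taken from the document.

from math import log2

# Hypothetical five-symbol source; these probabilities are assumed for
# illustration only and do not come from the document itself.
probs = {"s1": 0.4, "s2": 0.2, "s3": 0.2, "s4": 0.1, "s5": 0.1}

def shannon_fano(items):
    # items: list of (symbol, probability) pairs sorted by descending
    # probability; returns a dict mapping each symbol to its codeword.
    if len(items) == 1:
        return {items[0][0]: ""}
    total = sum(p for _, p in items)
    # Choose the split point that makes the two groups' total
    # probabilities as nearly equal as possible (ties go to the earliest
    # split, so individual codewords may differ from a hand-worked table
    # even though the average length is the same).
    best_i, best_diff, running = 1, float("inf"), 0.0
    for i in range(1, len(items)):
        running += items[i - 1][1]
        diff = abs(running - (total - running))
        if diff < best_diff:
            best_i, best_diff = i, diff
    upper = shannon_fano(items[:best_i])   # this group gets prefix bit 0
    lower = shannon_fano(items[best_i:])   # this group gets prefix bit 1
    code = {s: "0" + c for s, c in upper.items()}
    code.update({s: "1" + c for s, c in lower.items()})
    return code

code = shannon_fano(sorted(probs.items(), key=lambda kv: -kv[1]))
H = -sum(p * log2(p) for p in probs.values())          # source entropy, bits/symbol
L = sum(probs[s] * len(cw) for s, cw in code.items())  # average codeword length, bits/symbol

print(code)
print(f"H = {H:.3f} bits/symbol, L = {L:.2f} bits/symbol, efficiency = {H / L:.1%}")

# The PCM example follows the same idea: 16 equally likely quantization
# levels give H = log2(16) = 4 bits per sample, and the information rate
# is R = r * H for a message (sample) rate of r samples per second.
H_pcm = log2(16)

With these assumed probabilities the sketch gives H of about 2.12 bits/symbol and an average codeword length of 2.2 bits/symbol, i.e. roughly 96% coding efficiency; the document's own examples will differ according to the probabilities it specifies.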