This document summarizes the topic of information sources and entropy from a lecture on discrete mathematics and information theory. It defines information, and it defines entropy as the average information per symbol of a data source, a measure that quantifies the source's uncertainty. The document works through several examples of computing the entropy of sources with different symbol probabilities, and it discusses properties of entropy, including how entropy relates to source efficiency and redundancy.
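
Since the summary does not reproduce the lecture's formulas, the following is a minimal sketch assuming the standard Shannon definitions: entropy H = -Σ p_i log2(p_i) bits per symbol, efficiency η = H / log2(n) for an n-symbol source (the maximum entropy occurs when all symbols are equiprobable), and redundancy R = 1 - η.

    import math

    def entropy(probs):
        """Shannon entropy H = -sum(p * log2(p)), in bits per symbol.
        Terms with p = 0 contribute nothing and are skipped."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def efficiency(probs):
        """Source efficiency: H divided by its maximum value log2(n),
        which is reached when all n symbols are equiprobable."""
        return entropy(probs) / math.log2(len(probs))

    def redundancy(probs):
        """Redundancy: the unused fraction of the source's capacity."""
        return 1 - efficiency(probs)

    # Hypothetical example: a 4-symbol source with unequal probabilities
    p = [0.5, 0.25, 0.125, 0.125]
    print(f"H = {entropy(p):.3f} bits/symbol")   # 1.750
    print(f"efficiency = {efficiency(p):.3f}")   # 0.875
    print(f"redundancy = {redundancy(p):.3f}")   # 0.125

For the example source above, the entropy of 1.75 bits/symbol falls below the 2-bit maximum of an equiprobable 4-symbol source, giving an efficiency of 0.875 and a redundancy of 0.125; this mirrors the kind of calculation the lecture's examples perform with varying symbol probabilities.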