This document provides an introduction to coding theory and information theory. It identifies source coding and channel coding as the two main branches of coding theory: source coding aims to represent source symbols efficiently, while channel coding adds controlled redundancy so that errors introduced during transmission can be detected and corrected. The document then covers information measures, including entropy, joint entropy, conditional entropy, and mutual information, and works through examples of calculating entropy for different probability distributions. Key concepts of information theory, such as Shannon's definition of information and the chain rule, are also summarized.
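Since the document works through entropy calculations for concrete probability distributions, a minimal Python sketch of those computations may help; it is not taken from the document itself, and the function names `entropy` and `mutual_information` are illustrative. It uses the standard definitions H(X) = -sum p(x) log2 p(x) and I(X;Y) = H(X) + H(Y) - H(X,Y), with logarithms base 2 so results are in bits.

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum p(x) * log2 p(x), in bits.

    Terms with probability 0 are skipped, following the convention
    0 * log 0 = 0.
    """
    return -sum(px * math.log2(px) for px in p if px > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf given as a 2D list."""
    px = [sum(row) for row in joint]              # marginal of X
    py = [sum(col) for col in zip(*joint)]        # marginal of Y
    hxy = entropy([p for row in joint for p in row])  # joint entropy H(X,Y)
    return entropy(px) + entropy(py) - hxy

# A fair coin attains the maximum entropy of 1 bit per flip,
# while a biased coin carries less information per flip.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # ~0.469

# Independent variables share no information; perfectly
# correlated ones share all of it.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
```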