This course introduces the principles and applications of information theory: how information is measured in terms of probability and entropy, the capacities of communication channels, and strategies for coding. Students will learn to calculate the information content of events and the entropy of random variables, to define channel capacity using Shannon's theorems, and to understand algorithms related to Fourier transforms. The course addresses fundamental questions about the limits of data compression and the maximum rates of reliable communication, integrating concepts from probability, thermodynamics, and signal processing.
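As a small sketch of the kind of calculation covered early in such a course, the snippet below computes the self-information of an event and the Shannon entropy of a discrete distribution in bits (the function names and example distributions are illustrative, not part of the course materials):

```python
import math

def information_content(p: float) -> float:
    """Self-information -log2(p) of an event with probability p, in bits."""
    return -math.log2(p)

def entropy(probs: list[float]) -> float:
    """Shannon entropy H = -sum(p * log2(p)) of a distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip carries exactly 1 bit of information.
print(information_content(0.5))  # 1.0
print(entropy([0.5, 0.5]))       # 1.0

# A biased coin is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))       # about 0.469
```

Entropy of this kind sets the limit on lossless compression: on average, no code can represent outcomes of a source in fewer bits per symbol than the source's entropy.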