Information is stored digitally as bits, whose combinations of 0s and 1s can represent letters, numbers, and other symbols. Information storage and communication are closely related: storage can be viewed as communication with a hard disk, across time rather than across distance. The key topics of information theory covered here are:

- information rate, bit rate, baud rate, and data rate
- signal-to-noise ratio and noise figure
- channel capacity as given by Hartley's law, which relates the amount of information that can be transmitted to bandwidth and transmission time
- sampling theory and the Nyquist theorem, which set the minimum sampling rate at twice the highest frequency component of the signal
- Shannon's theorem, which states that error-free communication is possible at any rate below the channel capacity
- the Shannon-Hartley theorem, which gives the channel capacity in terms of bandwidth and signal-to-noise ratio: C = B log2(1 + S/N)
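As a minimal sketch of how these formulas are applied, the snippet below computes the Shannon-Hartley capacity C = B log2(1 + S/N) and the Nyquist sampling rate for an example channel. The function names and the telephone-channel figures (3.1 kHz bandwidth, 30 dB SNR) are illustrative assumptions, not values from the text.

```python
import math

def snr_db_to_linear(snr_db: float) -> float:
    """Convert a signal-to-noise ratio from decibels to a linear power ratio."""
    return 10 ** (snr_db / 10)

def shannon_hartley_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Channel capacity in bits per second: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

def nyquist_rate(max_freq_hz: float) -> float:
    """Minimum sampling rate: twice the highest frequency in the signal."""
    return 2 * max_freq_hz

# Illustrative example: a voice channel with 3.1 kHz bandwidth and 30 dB SNR.
snr = snr_db_to_linear(30)                      # 30 dB is a 1000x power ratio
capacity = shannon_hartley_capacity(3100, snr)  # about 30.9 kbit/s
print(f"capacity ~ {capacity:.0f} bit/s, Nyquist rate for 3.1 kHz: {nyquist_rate(3100):.0f} Hz")
```

Note that raising either the bandwidth or the SNR raises the capacity, but the SNR appears inside a logarithm, so doubling signal power gains far less than doubling bandwidth.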