Information Theory 
Entropy 
Joint Entropy and Conditional Entropy
Chain Rules for Entropy, Relative Entropy and Mutual Information
Lemma -
Joint Entropy
Conditional Entropy
Theorem (Chain Rule)
Corollary
Chain rule for entropy
Chain rule for information
Chain rule for relative entropy
Relative Entropy and Mutual Information
Relative entropy 
Relative Entropy / Kullback-Leibler (KL) distance
Mutual Information 
Mutual Information
Mutual Information and Entropy

Introduction to information theory
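
For quick reference, the standard textbook definitions behind the outline topics above are sketched below; X, Y, Z denote discrete random variables with joint pmf p(x, y), and the log base fixes the unit (bits for base 2). These are the usual definitions, supplied here for orientation rather than copied from the slides. Entropy, joint entropy and conditional entropy:

\[ H(X) = -\sum_{x} p(x)\log p(x), \qquad H(X,Y) = -\sum_{x}\sum_{y} p(x,y)\log p(x,y) \]
\[ H(Y \mid X) = -\sum_{x}\sum_{y} p(x,y)\log p(y \mid x) = \sum_{x} p(x)\, H(Y \mid X = x) \]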
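
Chain rule (theorem and corollary), together with the general chain rules for entropy, mutual information and relative entropy:

\[ H(X,Y) = H(X) + H(Y \mid X), \qquad H(X,Y \mid Z) = H(X \mid Z) + H(Y \mid X, Z) \]
\[ H(X_1,\dots,X_n) = \sum_{i=1}^{n} H(X_i \mid X_{i-1},\dots,X_1) \]
\[ I(X_1,\dots,X_n ; Y) = \sum_{i=1}^{n} I(X_i ; Y \mid X_{i-1},\dots,X_1) \]
\[ D\big(p(x,y)\,\|\,q(x,y)\big) = D\big(p(x)\,\|\,q(x)\big) + D\big(p(y \mid x)\,\|\,q(y \mid x)\big) \]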
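
Relative entropy (Kullback-Leibler distance) between two pmfs p and q on the same alphabet; it is non-negative and equals zero iff p = q, but it is not symmetric, so it is not a true metric:

\[ D(p\,\|\,q) = \sum_{x} p(x)\log\frac{p(x)}{q(x)} \]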
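
Mutual information, expressed directly from the joint and marginal pmfs or, equivalently, as a relative entropy:

\[ I(X;Y) = \sum_{x}\sum_{y} p(x,y)\log\frac{p(x,y)}{p(x)\,p(y)} = D\big(p(x,y)\,\|\,p(x)\,p(y)\big) \]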
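
Mutual information and entropy: I(X; Y) is the reduction in uncertainty about X due to knowing Y, and a variable's mutual information with itself is its entropy (hence entropy is sometimes called self-information):

\[ I(X;Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) = H(X) + H(Y) - H(X,Y), \qquad I(X;X) = H(X) \]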
