Joint, Conditional, & Mutual Information
& A Case Study
Unit 3
Joint Entropy
Conditional Entropy
● H(Y|X) = 0 if and only if the value of Y is completely
determined by the value of X
● H(Y|X) = H(Y) if and only if Y and X are independent
random variables
○ X provides no information about Y
○ The entropy of Y given X is the same as asking for the
entropy of Y directly, with no access to X
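The two properties above can be checked numerically. Below is a minimal sketch (the function names and the dict-of-joint-probabilities representation are illustrative choices, not from the slides): when Y is fully determined by X, H(Y|X) comes out to 0, and when X and Y are independent fair coins, H(Y|X) equals H(Y) = 1 bit.

```python
import math
from collections import Counter

def conditional_entropy(joint):
    """H(Y|X) in bits, given a dict {(x, y): p(x, y)}."""
    # Marginal p(x) obtained by summing the joint over y
    px = Counter()
    for (x, _), p in joint.items():
        px[x] += p
    # H(Y|X) = sum over (x, y) of p(x, y) * log2(p(x) / p(x, y))
    return sum(p * math.log2(px[x] / p)
               for (x, y), p in joint.items() if p > 0)

# Y fully determined by X (here Y = X): expect H(Y|X) = 0
determined = {(0, 0): 0.5, (1, 1): 0.5}

# X, Y independent fair coins: expect H(Y|X) = H(Y) = 1 bit
independent = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}

print(conditional_entropy(determined))    # → 0.0
print(conditional_entropy(independent))   # → 1.0
```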
Mutual Information
A Case Study
Expectation:
Testable hypotheses: