ECE 562: Information Theory Spring 2006
Lecture 4 — February 2
Lecturer: Sergio D. Servetto Scribe: Frank Ciaramello
4.1 Some Useful Information Inequalities
This section proves some useful inequalities that will be used often.
First, we will show that conditioning a random variable on another cannot increase its entropy. Intuitively, this makes sense: conditioning adds information about the random variable, so its uncertainty must go down (or stay the same, if the conditioning variable is independent of it and thus adds no information).
Theorem 4.1. “Conditioning Does Not Increase Entropy”
H(X|Y) ≤ H(X), for any random variables X, Y (4.1)
Proof:
H(X|Y ) = H(X) − I(X; Y )
I(X; Y ) ≥ 0
∴ H(X|Y ) ≤ H(X)
Two results follow from Theorem 4.1: equality holds only in the case of independence, and we can condition on more than one random variable:
1. H(X|Y ) = H(X) ⇐⇒ X and Y are independent
2. H(X|Y Z) ≤ H(X|Y ) ≤ H(X)
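As a quick numerical check of Theorem 4.1, the inequality can be verified on a small joint distribution. The pmf below is a made-up example; the computations follow the standard definitions of entropy and the chain rule:

```python
import math

def H(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(v * math.log2(v) for v in p if v > 0)

# Hypothetical joint pmf p(x, y) on a 2x2 alphabet (rows index x, columns y).
pxy = [[0.4, 0.1],
       [0.1, 0.4]]

px = [sum(row) for row in pxy]        # marginal p(x)
py = [sum(col) for col in zip(*pxy)]  # marginal p(y)

H_xy = H([v for row in pxy for v in row])  # joint entropy H(X, Y)
H_x_given_y = H_xy - H(py)                 # chain rule: H(X|Y) = H(X,Y) - H(Y)

print(round(H(px), 4))        # 1.0 (X is uniform)
print(round(H_x_given_y, 4))  # 0.7219, strictly below H(X) since X, Y are dependent
```

Since the off-diagonal mass makes X and Y dependent, H(X|Y) comes out strictly smaller than H(X), as the theorem requires.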
The next inequality shows that joint entropy is upper bounded by the sum of the individual entropies, i.e., by its value when the random variables are independent. In other words, dependence among random variables decreases joint entropy. We can prove it using two different methods.
Theorem 4.2. “Independence Bound”
H(X1, X2, ..., Xn) ≤ Σ_{i=1}^{n} H(Xi) (4.2)
Proof: Method 1 uses the chain rule for entropy.
H(X1, X2, ..., Xn) = Σ_{i=1}^{n} H(Xi|X1, ..., X_{i−1}) ≤ Σ_{i=1}^{n} H(Xi)
Proof: Method 2 expands the entropies and relates them to a relative entropy, or divergence.
Σ_{i=1}^{n} H(Xi) − H(X1, ..., Xn) = −Σ_{i=1}^{n} E[log p(Xi)] + E[log p(X1, ..., Xn)]
= −E[log p(X1) · · · p(Xn)] + E[log p(X1, ..., Xn)]
= E[log (p(X1, ..., Xn) / (p(X1) · · · p(Xn)))]
= D(p(X1, ..., Xn) || p(X1) · · · p(Xn)) ≥ 0
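Method 2's identity, that the gap between the sum of marginal entropies and the joint entropy equals a divergence, can be checked numerically. The two-variable pmf below is a hypothetical example:

```python
import math

def H(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(v * math.log2(v) for v in p if v > 0)

# Hypothetical joint pmf p(x1, x2) with dependent coordinates.
pxy = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

p1 = {x: sum(v for (a, _), v in pxy.items() if a == x) for x in (0, 1)}
p2 = {y: sum(v for (_, b), v in pxy.items() if b == y) for y in (0, 1)}

# Gap between the independence bound and the joint entropy:
gap = H(p1.values()) + H(p2.values()) - H(pxy.values())

# The same gap, computed directly as D(p(x1,x2) || p(x1)p(x2)):
kl = sum(v * math.log2(v / (p1[a] * p2[b])) for (a, b), v in pxy.items())

print(round(gap, 6), round(kl, 6))  # the two values match, and are >= 0
```

The two prints agree to floating-point precision, confirming the identity, and both are non-negative as divergences must be.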
4.2 Data Processing Inequality
This section provides the theorems and lemmas needed to prove the data processing inequality.
Theorem 4.3.
I(X; Y, Z) ≥ I(X; Y ) (4.3)
equality holds ⇐⇒ X-Y-Z forms a Markov chain.
Proof: Using the chain rule for mutual information,
I(X; Y, Z) = I(X; Y) + I(X; Z|Y) ≥ I(X; Y),
since I(X; Z|Y) ≥ 0.
The following theorem, Theorem 4.4, shows that the closer two variables are in a Markov chain, the more information they share; variables that are far apart are closer to being independent.
Theorem 4.4. If X-Y-Z forms a Markov chain, then
I(X; Z) ≤ I(X; Y ) (4.4)
I(X; Z) ≤ I(Y ; Z) (4.5)
Proof: Prove by expanding mutual information in two different ways.
I(X; Y, Z) = I(X; Z) + I(X; Y |Z)
I(X; Y, Z) = I(X; Y ) + I(X; Z|Y )
By the definition of a Markov chain, X⊥Z|Y , therefore, I(X; Z|Y ) = 0 and
I(X; Y ) = I(X; Z) + I(X; Y |Z)
Mutual information is always greater than or equal to zero, therefore
I(X; Y ) ≥ I(X; Z)
Since X-Y-Z is equivalent to Z-Y-X, the same argument proves (4.5).
Theorem 4.5. “Data Processing Inequality”
If U-X-Y-V is a Markov Chain, then
I(U; V ) ≤ I(X; Y ) (4.6)
Proof: Since U-X-Y-V is a Markov chain, both U-X-Y and U-Y-V are Markov chains. The proof then follows from Theorem 4.4:
I(U; Y ) ≤ I(X; Y )
I(U; V ) ≤ I(U; Y )
∴ I(U; V ) ≤ I(X; Y )
The data processing inequality shows us that if we want to infer X from Y, the best we can do is use an unprocessed version of Y. Processing Y (either deterministically or probabilistically) cannot decrease our uncertainty about X: no function or randomization of Y can add information about X.
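As an illustration, consider a hypothetical chain X-Y-Z built from two cascaded binary symmetric channels (the flip probabilities 0.1 and 0.2 are arbitrary choices). Computing the mutual informations confirms that the second stage of processing can only lose information about X:

```python
import math

def mutual_information(pxy):
    """I(X;Y) in bits from a joint pmf given as nested lists pxy[x][y]."""
    px = [sum(row) for row in pxy]
    py = [sum(col) for col in zip(*pxy)]
    return sum(p * math.log2(p / (px[i] * py[j]))
               for i, row in enumerate(pxy)
               for j, p in enumerate(row) if p > 0)

def bsc(eps):
    """Transition matrix of a binary symmetric channel with flip prob eps."""
    return [[1 - eps, eps], [eps, 1 - eps]]

px = [0.5, 0.5]           # X uniform
W1, W2 = bsc(0.1), bsc(0.2)  # X -> Y, then Y -> Z

pxy = [[px[x] * W1[x][y] for y in (0, 1)] for x in (0, 1)]
# p(x, z) = sum_y p(x) W1(y|x) W2(z|y), using the Markov structure
pxz = [[sum(px[x] * W1[x][y] * W2[y][z] for y in (0, 1)) for z in (0, 1)]
       for x in (0, 1)]

print(round(mutual_information(pxy), 4))  # I(X;Y)
print(round(mutual_information(pxz), 4))  # I(X;Z), strictly smaller
```

Here I(X;Y) = 1 − hb(0.1) ≈ 0.531 bits, while the cascade behaves like a single BSC with flip probability 0.26, so I(X;Z) drops to about 0.173 bits.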
4.3 Fano’s Inequality
The following are lemmas and definitions required for Fano’s Inequality.
Lemma 4.6 shows that the entropy of a random variable is always less than or equal to
the log of the size of its alphabet.
Lemma 4.6.
H(X) ≤ log |X| (4.7)
with equality ⇐⇒ P(X = x) = 1/|X|, ∀x
Proof: We prove this by expanding the terms into their summations and relating them to
a relative entropy measure.
log |X| − H(X) = −Σ_{x∈X} p(x) log |X|^{−1} + Σ_{x∈X} p(x) log p(x)
= −Σ_{x∈X} p(x) log u(x) + Σ_{x∈X} p(x) log p(x), where u(x) = 1/|X|
= Σ_{x∈X} p(x) log (p(x)/u(x))
= D(p(x)||u(x)) ≥ 0
One consequence is that equality holds if and only if p(x) = u(x), i.e. p(x) is a uniform
distribution. This says that entropy is maximum when all outcomes are equally likely. This
makes sense, intuitively, since entropy measures uncertainty in the random variable X.
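A small numerical check of Lemma 4.6, using an arbitrary 8-symbol alphabet: the uniform pmf achieves log |X| exactly, while a skewed pmf falls below it:

```python
import math

def H(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(v * math.log2(v) for v in p if v > 0)

n = 8
uniform = [1 / n] * n
# An arbitrary non-uniform pmf on the same 8-symbol alphabet:
skewed = [0.5, 0.2, 0.1, 0.05, 0.05, 0.05, 0.03, 0.02]

print(H(uniform))                 # log2(8) = 3.0 bits, the maximum
print(H(skewed) < math.log2(n))   # True: any non-uniform pmf falls short
```

Any redistribution of mass away from uniform strictly decreases the entropy, consistent with the divergence argument above.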
Another consequence is the following corollary:
Corollary 4.7. H(X) can be any non-negative real number.
Proof: The proof follows from the intermediate value theorem. We know that H(X) = 0 for a deterministic X and H(X) = log |X| for a uniform X; since entropy varies continuously between these, for any value 0 < a < log |X|, ∃X such that H(X) = a. For |X| sufficiently large, H(X) can therefore take any non-negative value.
Theorem 4.8. “Fano’s Inequality”
First, we define Pe, the probability of error. Let X and X̂ be two random variables on the alphabet X:
Pe = P(X ≠ X̂) (4.8)
Fano’s Inequality:
H(X|X̂) ≤ hb(Pe) + Pe log(|X| − 1) (4.9)
Proof: We will prove Fano's inequality by expanding the entropy and by using Theorem 4.1.
Define an indicator variable:
Y = 0 if X = X̂; Y = 1 if X ≠ X̂
Note:
P(Y = 1) = Pe
P(Y = 0) = 1 − Pe
H(Y) = hb(Pe)
H(Y|X, X̂) = 0
Then:
H(X|X̂) = I(X; Y|X̂) + H(X|X̂, Y)
= H(Y|X̂) − H(Y|X̂, X) + H(X|X̂, Y)
= H(Y|X̂) + H(X|X̂, Y), since H(Y|X̂, X) = 0
≤ H(Y) + H(X|X̂, Y), by Theorem 4.1
= H(Y) + Σ_{x̂∈X} [P(X̂ = x̂, Y = 0) H(X|X̂ = x̂, Y = 0) + P(X̂ = x̂, Y = 1) H(X|X̂ = x̂, Y = 1)]
When Y = 0, X = X̂, so the first term in the summation is 0:
H(X|X̂ = x̂, Y = 0) = 0 (4.10)
Lemma 4.6 says that entropy is at most the log of the alphabet size. When Y = 1, X ≠ X̂, so X can take at most |X| − 1 values (the original alphabet minus the value X̂ has taken). Therefore,
H(X|X̂ = x̂, Y = 1) ≤ log(|X| − 1) (4.11)
Using (4.10) and (4.11), we can show
H(X|X̂) ≤ hb(Pe) + log(|X| − 1) Σ_{x̂∈X} P(X̂ = x̂, Y = 1)
and since Σ_{x̂∈X} P(X̂ = x̂, Y = 1) = P(Y = 1) = Pe,
H(X|X̂) ≤ hb(Pe) + Pe log(|X| − 1)
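To see the bound in action, here is a sketch with a hypothetical estimator over a 4-symbol alphabet whose errors are spread uniformly over the wrong symbols; for this symmetric error pattern, Fano's bound in fact holds with equality:

```python
import math

def H(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(v * math.log2(v) for v in p if v > 0)

def hb(p):
    """Binary entropy function."""
    return H([p, 1 - p])

# Hypothetical estimator over an alphabet of size 4: Xhat is correct with
# probability 0.85; each of the 3 wrong symbols is equally likely otherwise.
n, pe = 4, 0.15
p_x_given_xhat = [1 - pe] + [pe / (n - 1)] * (n - 1)

lhs = H(p_x_given_xhat)               # H(X|Xhat): same for every value of Xhat
rhs = hb(pe) + pe * math.log2(n - 1)  # Fano's bound

print(round(lhs, 4), round(rhs, 4))   # the bound holds, with equality here
```

Skewing the error mass toward one wrong symbol would lower the left side while leaving the bound unchanged, showing the inequality is strict in general and tight only for uniform errors.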
Corollary 4.9. "Weak Fano's Inequality"
H(X|X̂) ≤ 1 + Pe log |X| (4.12)
Proof: Binary entropy is upper bounded by 1, and log is an increasing function, so log(|X| − 1) ≤ log |X|. The corollary follows.
The Benefits and Techniques of Trenchless Pipe Repair.pdf
 
Architectural Portfolio Sean Lockwood
Architectural Portfolio Sean LockwoodArchitectural Portfolio Sean Lockwood
Architectural Portfolio Sean Lockwood
 
Gen AI Study Jams _ For the GDSC Leads in India.pdf
Gen AI Study Jams _ For the GDSC Leads in India.pdfGen AI Study Jams _ For the GDSC Leads in India.pdf
Gen AI Study Jams _ For the GDSC Leads in India.pdf
 
Water Industry Process Automation and Control Monthly - May 2024.pdf
Water Industry Process Automation and Control Monthly - May 2024.pdfWater Industry Process Automation and Control Monthly - May 2024.pdf
Water Industry Process Automation and Control Monthly - May 2024.pdf
 
DESIGN A COTTON SEED SEPARATION MACHINE.docx
DESIGN A COTTON SEED SEPARATION MACHINE.docxDESIGN A COTTON SEED SEPARATION MACHINE.docx
DESIGN A COTTON SEED SEPARATION MACHINE.docx
 
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdfAKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
 
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
 
weather web application report.pdf
weather web application report.pdfweather web application report.pdf
weather web application report.pdf
 
Immunizing Image Classifiers Against Localized Adversary Attacks
Immunizing Image Classifiers Against Localized Adversary AttacksImmunizing Image Classifiers Against Localized Adversary Attacks
Immunizing Image Classifiers Against Localized Adversary Attacks
 
MCQ Soil mechanics questions (Soil shear strength).pdf
MCQ Soil mechanics questions (Soil shear strength).pdfMCQ Soil mechanics questions (Soil shear strength).pdf
MCQ Soil mechanics questions (Soil shear strength).pdf
 
CME397 Surface Engineering- Professional Elective
CME397 Surface Engineering- Professional ElectiveCME397 Surface Engineering- Professional Elective
CME397 Surface Engineering- Professional Elective
 
Design and Analysis of Algorithms-DP,Backtracking,Graphs,B&B
Design and Analysis of Algorithms-DP,Backtracking,Graphs,B&BDesign and Analysis of Algorithms-DP,Backtracking,Graphs,B&B
Design and Analysis of Algorithms-DP,Backtracking,Graphs,B&B
 
road safety engineering r s e unit 3.pdf
road safety engineering  r s e unit 3.pdfroad safety engineering  r s e unit 3.pdf
road safety engineering r s e unit 3.pdf
 
power quality voltage fluctuation UNIT - I.pptx
power quality voltage fluctuation UNIT - I.pptxpower quality voltage fluctuation UNIT - I.pptx
power quality voltage fluctuation UNIT - I.pptx
 
J.Yang, ICLR 2024, MLILAB, KAIST AI.pdf
J.Yang,  ICLR 2024, MLILAB, KAIST AI.pdfJ.Yang,  ICLR 2024, MLILAB, KAIST AI.pdf
J.Yang, ICLR 2024, MLILAB, KAIST AI.pdf
 
block diagram and signal flow graph representation
block diagram and signal flow graph representationblock diagram and signal flow graph representation
block diagram and signal flow graph representation
 
The role of big data in decision making.
The role of big data in decision making.The role of big data in decision making.
The role of big data in decision making.
 
Final project report on grocery store management system..pdf
Final project report on grocery store management system..pdfFinal project report on grocery store management system..pdf
Final project report on grocery store management system..pdf
 
Standard Reomte Control Interface - Neometrix
Standard Reomte Control Interface - NeometrixStandard Reomte Control Interface - Neometrix
Standard Reomte Control Interface - Neometrix
 

0202 fmc3

ECE 562: Information Theory                                        Spring 2006
Lecture 4 — February 2
Lecturer: Sergio D. Servetto                          Scribe: Frank Ciaramello

4.1 Some Useful Information Inequalities

This section proves some useful inequalities that will be used often.

First, we show that conditioning a random variable on another cannot increase its entropy. Intuitively, this makes sense: conditioning adds information about the random variable, so its uncertainty must go down (or stay the same, if the conditioning adds no information, i.e. the conditioning variable is independent of the random variable).

Theorem 4.1 ("Conditioning Does Not Increase Entropy"). For any random variables X, Y,

    H(X|Y) ≤ H(X).                                              (4.1)

Proof:

    H(X|Y) = H(X) − I(X; Y)
    I(X; Y) ≥ 0
    ∴ H(X|Y) ≤ H(X)

Two consequences of Theorem 4.1 are that equality holds only in the case of independence, and that we can condition on more than one random variable:

1. H(X|Y) = H(X) ⇐⇒ X and Y are independent
2. H(X|Y, Z) ≤ H(X|Y) ≤ H(X)

The next inequality we prove shows that joint entropy is upper bounded by its value in the case when the random variables are independent. That is, dependence among random variables decreases joint entropy. We can prove it using two different methods.
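Theorem 4.1 can be checked numerically on a small joint distribution. The pmf below is an illustrative choice, not one from the lecture; it is a minimal sketch:

```python
import math

def H(p):
    """Shannon entropy (bits) of a probability vector, skipping zero entries."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# A small joint pmf p(x, y) on {0,1} x {0,1} (hypothetical example distribution).
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

# Marginals of X and Y.
p_x = [sum(v for (x, y), v in p_xy.items() if x == xv) for xv in (0, 1)]
p_y = [sum(v for (x, y), v in p_xy.items() if y == yv) for yv in (0, 1)]

# H(X|Y) = H(X, Y) - H(Y), by the chain rule for entropy.
H_xy = H(list(p_xy.values()))
H_x, H_y = H(p_x), H(p_y)
H_x_given_y = H_xy - H_y

# Theorem 4.1: conditioning cannot increase entropy.
assert H_x_given_y <= H_x + 1e-12
```

Here X and Y are dependent, so the inequality is strict: H(X|Y) ≈ 0.875 bits while H(X) = 1 bit.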
Theorem 4.2 ("Independence Bound").

    H(X₁, X₂, ..., Xₙ) ≤ Σ_{i=1}^{n} H(Xᵢ).                     (4.2)

Proof (Method 1): Use the chain rule for entropy, then Theorem 4.1:

    H(X₁, X₂, ..., Xₙ) = Σ_{i=1}^{n} H(Xᵢ | X₁, ..., X_{i−1}) ≤ Σ_{i=1}^{n} H(Xᵢ).

Proof (Method 2): Expand the entropies and relate them to a relative entropy, or divergence:

    Σ_{i=1}^{n} H(Xᵢ) − H(X₁, ..., Xₙ)
        = −Σ_{i=1}^{n} E[log p(Xᵢ)] + E[log p(X₁, ..., Xₙ)]
        = −E[log p(X₁) ··· p(Xₙ)] + E[log p(X₁, ..., Xₙ)]
        = E[log (p(X₁, ..., Xₙ) / (p(X₁) ··· p(Xₙ)))]
        = D(p(x₁, ..., xₙ) || p(x₁) ··· p(xₙ)) ≥ 0.

4.2 Data Processing Inequality

This section provides the theorems and lemmas needed to prove the data processing inequality.

Theorem 4.3.

    I(X; Y, Z) ≥ I(X; Y),                                       (4.3)

with equality ⇐⇒ X–Y–Z forms a Markov chain.

Proof: Using the chain rule for mutual information,

    I(X; Y, Z) = I(X; Y) + I(X; Z|Y) ≥ I(X; Y),

since I(X; Z|Y) ≥ 0.
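Method 2 of Theorem 4.2 says the gap between Σᵢ H(Xᵢ) and the joint entropy is exactly a divergence. A minimal numerical check (the dependent joint pmf is an illustrative choice):

```python
import math

def H(p):
    """Shannon entropy (bits) of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Joint pmf of (X1, X2) with dependence (hypothetical example); both marginals are uniform.
p_joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
p1 = [0.5, 0.5]
p2 = [0.5, 0.5]

H_joint = H(list(p_joint.values()))
sum_marg = H(p1) + H(p2)

# D( p(x1,x2) || p(x1)p(x2) ), computed directly from the definition.
D = sum(v * math.log2(v / (p1[a] * p2[b])) for (a, b), v in p_joint.items())

assert H_joint <= sum_marg                       # Theorem 4.2
assert abs((sum_marg - H_joint) - D) < 1e-12     # the gap equals the divergence (Method 2)
```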
The following theorem, Theorem 4.4, shows that the closer two variables are in a Markov chain, the more information they share; variables that are far apart are closer to being independent.

Theorem 4.4. If X–Y–Z forms a Markov chain, then

    I(X; Z) ≤ I(X; Y)                                           (4.4)
    I(X; Z) ≤ I(Y; Z)                                           (4.5)

Proof: Expand the mutual information I(X; Y, Z) in two different ways using the chain rule:

    I(X; Y, Z) = I(X; Z) + I(X; Y|Z)
    I(X; Y, Z) = I(X; Y) + I(X; Z|Y)

By the definition of a Markov chain, X ⊥ Z | Y; therefore I(X; Z|Y) = 0 and

    I(X; Y) = I(X; Z) + I(X; Y|Z).

Mutual information is always nonnegative, therefore

    I(X; Y) ≥ I(X; Z).

Since X–Y–Z is equivalent to Z–Y–X, the same argument proves (4.5).

Theorem 4.5 ("Data Processing Inequality"). If U–X–Y–V is a Markov chain, then

    I(U; V) ≤ I(X; Y).                                          (4.6)

Proof: Since U–X–Y–V is a Markov chain, U–X–Y and U–Y–V are also Markov chains. The proof follows directly from Theorem 4.4:

    I(U; Y) ≤ I(X; Y)
    I(U; V) ≤ I(U; Y)
    ∴ I(U; V) ≤ I(X; Y)

The data processing inequality tells us that if we want to infer X from Y, the best we can do is to use an unprocessed version of Y. Processing Y (whether deterministically or probabilistically) can only lose information about X; it can never create it.
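Theorem 4.4 can be verified on a concrete Markov chain. Below, X passes through two cascaded binary symmetric channels; the crossover probabilities 0.1 and 0.2 are an illustrative choice, not from the lecture:

```python
import math

def mutual_info(p_xy):
    """I(X;Y) in bits from a joint pmf given as {(x, y): prob}."""
    px, py = {}, {}
    for (x, y), v in p_xy.items():
        px[x] = px.get(x, 0.0) + v
        py[y] = py.get(y, 0.0) + v
    return sum(v * math.log2(v / (px[x] * py[y]))
               for (x, y), v in p_xy.items() if v > 0)

def bsc(eps):
    """Transition probabilities {(input, output): prob} of a BSC(eps)."""
    return {(0, 0): 1 - eps, (0, 1): eps, (1, 0): eps, (1, 1): 1 - eps}

# Markov chain X - Y - Z: X ~ Bernoulli(1/2), Y = X through BSC(0.1), Z = Y through BSC(0.2).
px = {0: 0.5, 1: 0.5}
ch1, ch2 = bsc(0.1), bsc(0.2)
p_xyz = {(x, y, z): px[x] * ch1[(x, y)] * ch2[(y, z)]
         for x in (0, 1) for y in (0, 1) for z in (0, 1)}

def pair_marginal(i, j):
    """Marginalize the triple pmf down to coordinates i and j."""
    out = {}
    for k, v in p_xyz.items():
        out[(k[i], k[j])] = out.get((k[i], k[j]), 0.0) + v
    return out

I_xy = mutual_info(pair_marginal(0, 1))
I_xz = mutual_info(pair_marginal(0, 2))
I_yz = mutual_info(pair_marginal(1, 2))

assert I_xz <= I_xy + 1e-12   # (4.4): the far pair shares less information
assert I_xz <= I_yz + 1e-12   # (4.5)
```

Numerically, I(X;Y) ≈ 0.53 bits, I(Y;Z) ≈ 0.28 bits, and I(X;Z) ≈ 0.17 bits, so the endpoint pair is indeed the closest to independent.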
4.3 Fano's Inequality

The following lemmas and definitions are required for Fano's inequality. Lemma 4.6 shows that the entropy of a random variable is at most the log of the size of its alphabet.

Lemma 4.6.

    H(X) ≤ log |X|,                                             (4.7)

with equality ⇐⇒ P(X = x) = 1/|X| for all x.

Proof: We prove this by expanding the terms into their summations and relating them to a relative entropy. Let u(x) = 1/|X| denote the uniform distribution on X:

    log |X| − H(X) = −Σ_{x∈X} p(x) log |X|⁻¹ + Σ_{x∈X} p(x) log p(x)
                   = −Σ_{x∈X} p(x) log u(x) + Σ_{x∈X} p(x) log p(x)
                   = Σ_{x∈X} p(x) log (p(x)/u(x))
                   = D(p(x) || u(x)) ≥ 0.

One consequence is that equality holds if and only if p(x) = u(x), i.e. p(x) is the uniform distribution. This says that entropy is maximized when all outcomes are equally likely, which makes sense intuitively, since entropy measures the uncertainty in the random variable X.

Another consequence is the following corollary:

Corollary 4.7. H(X) can be any nonnegative real number.

Proof: The proof follows from the intermediate value theorem. We know that H(X) = 0 for a deterministic X and H(X) = log |X| for a uniform X. For any value 0 < a < log |X|, there exists an X such that H(X) = a. Taking |X| sufficiently large, H(X) can take any positive value.
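Lemma 4.6 is easy to confirm numerically; the skewed distribution below is an illustrative choice:

```python
import math

def H(p):
    """Shannon entropy (bits) of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

n = 4  # alphabet size |X|
uniform = [1 / n] * n
skewed = [0.7, 0.1, 0.1, 0.1]  # hypothetical non-uniform pmf on the same alphabet

assert abs(H(uniform) - math.log2(n)) < 1e-12  # equality at the uniform distribution
assert H(skewed) < math.log2(n)                # strict inequality otherwise
```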
Theorem 4.8 ("Fano's Inequality"). First, we define P_e, the probability of error. Let X, X̂ be two random variables on X, and let

    P_e = P(X ≠ X̂).                                             (4.8)

Fano's inequality:

    H(X|X̂) ≤ h_b(P_e) + P_e log(|X| − 1).                       (4.9)

Proof: We prove Fano's inequality by expanding the entropy and using Theorem 4.1. Define an indicator variable

    Y = 0 if X = X̂,    Y = 1 if X ≠ X̂.

Note that

    P(Y = 1) = P_e,   P(Y = 0) = 1 − P_e,   H(Y) = h_b(P_e),   H(Y|X, X̂) = 0.

Then

    H(X|X̂) = I(X; Y|X̂) + H(X|X̂, Y)
            = H(Y|X̂) − H(Y|X̂, X) + H(X|X̂, Y)        (symmetry of mutual information)
            = H(Y|X̂) + H(X|X̂, Y)                     (since H(Y|X̂, X) = 0)
            ≤ H(Y) + H(X|X̂, Y)                        (Theorem 4.1)
            = H(Y) + Σ_{x̂∈X} [ P(X̂ = x̂, Y = 0) H(X|X̂ = x̂, Y = 0)
                               + P(X̂ = x̂, Y = 1) H(X|X̂ = x̂, Y = 1) ].

When Y = 0 we have X = X̂, so the first term in the summation vanishes:

    H(X|X̂ = x̂, Y = 0) = 0.                                      (4.10)

Lemma 4.6 says that entropy is at most the log of the alphabet size. When Y = 1 we know X ≠ X̂, so X can take at most |X| − 1 values: the original alphabet minus the value that X̂ has taken. Therefore

    H(X|X̂ = x̂, Y = 1) ≤ log(|X| − 1).                           (4.11)
Using (4.10) and (4.11), we can show

    H(X|X̂) ≤ h_b(P_e) + log(|X| − 1) Σ_{x̂∈X} P(X̂ = x̂, Y = 1)
            = h_b(P_e) + P_e log(|X| − 1),

since Σ_{x̂∈X} P(X̂ = x̂, Y = 1) = P(Y = 1) = P_e.

Corollary 4.9 ("Weak Fano's Inequality").

    H(X|X̂) ≤ 1 + P_e log |X|.                                   (4.12)

Proof: Binary entropy is upper bounded by 1, and log is an increasing function, so log(|X| − 1) ≤ log |X|. The corollary is proven.
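Both forms of Fano's inequality can be checked on a small example. The joint distribution of (X, X̂) below, on a three-symbol alphabet, is an illustrative choice, not from the lecture:

```python
import math

def H(p):
    """Shannon entropy (bits) of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def hb(p):
    """Binary entropy function h_b(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Hypothetical joint pmf of (X, Xhat) on the alphabet {0, 1, 2}.
alphabet = (0, 1, 2)
p = {(0, 0): 0.30, (0, 1): 0.05, (0, 2): 0.05,
     (1, 0): 0.05, (1, 1): 0.25, (1, 2): 0.05,
     (2, 0): 0.02, (2, 1): 0.03, (2, 2): 0.20}

# Probability of error: P_e = P(X != Xhat).
Pe = sum(v for (x, xh), v in p.items() if x != xh)

# H(X | Xhat) = H(X, Xhat) - H(Xhat), by the chain rule.
p_xh = [sum(v for (x, xh), v in p.items() if xh == a) for a in alphabet]
H_x_given_xhat = H(list(p.values())) - H(p_xh)

fano_bound = hb(Pe) + Pe * math.log2(len(alphabet) - 1)  # Theorem 4.8, eq. (4.9)
weak_bound = 1 + Pe * math.log2(len(alphabet))           # Corollary 4.9, eq. (4.12)

assert H_x_given_xhat <= fano_bound + 1e-12
assert fano_bound <= weak_bound + 1e-12
```

With this distribution P_e = 0.25, and H(X|X̂) ≈ 1.035 bits sits just under the Fano bound of ≈ 1.061 bits, while the weak bound is looser at ≈ 1.396 bits.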