Proof of Kraft-McMillan theorem∗
Nguyen Hung Vu†
February 24, 2004
1 Kraft-McMillan theorem
Let C be a code with N codewords of lengths l_1, l_2, \ldots, l_N. If C is uniquely decodable, then

K(C) = \sum_{i=1}^{N} 2^{-l_i} \le 1.    (1)
2 Proof
The proof works by looking at the n-th power of K(C). If K(C) is greater than one, then K(C)^n should grow exponentially with n. If it does not grow exponentially with n, then this is proof that \sum_{i=1}^{N} 2^{-l_i} \le 1.
Let n be an arbitrary integer. Then

\left[ \sum_{i=1}^{N} 2^{-l_i} \right]^n
= \left( \sum_{i_1=1}^{N} 2^{-l_{i_1}} \right) \left( \sum_{i_2=1}^{N} 2^{-l_{i_2}} \right) \cdots \left( \sum_{i_n=1}^{N} 2^{-l_{i_n}} \right)
and then

\left[ \sum_{i=1}^{N} 2^{-l_i} \right]^n
= \sum_{i_1=1}^{N} \sum_{i_2=1}^{N} \cdots \sum_{i_n=1}^{N} 2^{-(l_{i_1} + l_{i_2} + \cdots + l_{i_n})}.    (2)
The exponent l_{i_1} + l_{i_2} + \cdots + l_{i_n} is simply the length of n codewords from the code C. The smallest value this exponent can take is greater than or equal to n, which would be the case if all codewords were 1 bit long. If

l = \max\{l_1, l_2, \cdots, l_N\},

then the largest value that the exponent can take is less than or equal to nl. Therefore, we can write this summation as

K(C)^n = \sum_{k=n}^{nl} A_k 2^{-k},
where A_k is the number of combinations of n codewords that have a combined length of k. Let's take a look at the size of this coefficient. The number of possible distinct binary sequences of length k is 2^k. If this code is uniquely decodable, then each sequence can represent one and only one sequence of codewords. Therefore, the number of possible combinations of codewords whose combined length is k cannot be greater than 2^k. In other words,

A_k \le 2^k.
∗Introduction to Data Compression, Khalid Sayood
†email: vuhung@fedu.uec.ac.jp
This means that

K(C)^n = \sum_{k=n}^{nl} A_k 2^{-k} \le \sum_{k=n}^{nl} 2^k 2^{-k} = nl - n + 1.    (3)
But if K(C) is greater than one, K(C)^n will grow exponentially with n, while n(l - 1) + 1 can only grow linearly. So if K(C) is greater than one, we can always find an n large enough that inequality (3) is violated. Therefore, for a uniquely decodable code C, K(C) is less than or equal to one.

This part of the Kraft-McMillan inequality provides a necessary condition for uniquely decodable codes. That is, if a code is uniquely decodable, the codeword lengths have to satisfy the inequality.
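As a quick illustration of inequality (1), the following Python sketch computes K(C) for a few lists of codeword lengths; the length sets are made up for this example and are not taken from the text.

def kraft_sum(lengths):
    """Return K(C) = sum_i 2**(-l_i) for a list of codeword lengths."""
    return sum(2.0 ** -l for l in lengths)

for lengths in ([1, 2, 3, 3],    # K(C) = 1.0   -> lengths admissible
                [1, 2, 2, 3]):   # K(C) = 1.125 -> no uniquely decodable code
    k = kraft_sum(lengths)
    print(lengths, "K(C) =", k, "satisfies (1):", k <= 1)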
3 Construction of prefix code
Given a set of integers l_1, l_2, \cdots, l_N that satisfy the inequality

\sum_{i=1}^{N} 2^{-l_i} \le 1,    (4)

we can always find a prefix code with codeword lengths l_1, l_2, \cdots, l_N.
4 Proof of the prefix code construction theorem
We will prove this assertion by developing a procedure for constructing a prefix code with codeword lengths l_1, l_2, \cdots, l_N that satisfy the given inequality. Without loss of generality, we can assume that

l_1 \le l_2 \le \cdots \le l_N.    (5)

Define a sequence of numbers w_1, w_2, \cdots, w_N as follows:

w_1 = 0, \qquad w_j = \sum_{i=1}^{j-1} 2^{l_j - l_i}, \quad j > 1.
The binary representation of w_j for j > 1 would take up \lceil \log_2 w_j \rceil bits. We will use this binary representation to construct a prefix code. We first note that the number of bits in the binary representation of w_j is less than or equal to l_j. This is obviously true for w_1. For j > 1,

\lceil \log_2 w_j \rceil = \left\lceil \log_2 \left[ \sum_{i=1}^{j-1} 2^{l_j - l_i} \right] \right\rceil
= \left\lceil \log_2 \left[ 2^{l_j} \sum_{i=1}^{j-1} 2^{-l_i} \right] \right\rceil
= \left\lceil l_j + \log_2 \left[ \sum_{i=1}^{j-1} 2^{-l_i} \right] \right\rceil
\le l_j.
The last inequality results from the hypothesis of the theorem that \sum_{i=1}^{N} 2^{-l_i} \le 1, which implies that \sum_{i=1}^{j-1} 2^{-l_i} \le 1. As the logarithm of a number less than one is negative, l_j + \log_2 \left[ \sum_{i=1}^{j-1} 2^{-l_i} \right] has to be less than l_j.
Using the binary representation of w_j, we can devise a binary code in the following manner: If \lceil \log_2 w_j \rceil = l_j, then the j-th codeword c_j is the binary representation of w_j. If \lceil \log_2 w_j \rceil < l_j, then c_j is the binary representation of w_j, with l_j - \lceil \log_2 w_j \rceil zeros appended to the right. This is certainly a code, but is it a prefix code? If we can show that the code C = \{c_1, c_2, \cdots, c_N\} is a prefix code, then we will have proved the theorem by construction.

Suppose that our claim is not true. Then for some j < k, c_j is a prefix of c_k. This means that the l_j most significant bits of w_k form the binary representation of w_j. Therefore, if we right-shift the binary representation of w_k by l_k - l_j bits, we should get the binary representation of w_j. We can write this as

w_j = \left\lfloor \frac{w_k}{2^{l_k - l_j}} \right\rfloor.
However,

w_k = \sum_{i=1}^{k-1} 2^{l_k - l_i}.

Therefore,

\frac{w_k}{2^{l_k - l_j}} = \sum_{i=1}^{k-1} 2^{l_j - l_i}
= w_j + \sum_{i=j}^{k-1} 2^{l_j - l_i}
= w_j + 2^0 + \sum_{i=j+1}^{k-1} 2^{l_j - l_i}
\ge w_j + 1.    (6)
That is, the smallest value for w_k / 2^{l_k - l_j} is w_j + 1. This contradicts the requirement for c_j being a prefix of c_k. Therefore, c_j cannot be a prefix of c_k. As j and k were arbitrary, this means that no codeword is a prefix of another codeword, and the code C is a prefix code.
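The construction analyzed in this proof is easy to turn into code. The Python sketch below is illustrative: it sorts the lengths, forms each w_j, and writes w_j in exactly l_j bits, which is the form the shift argument above works with; the final assertion checks prefix-freeness.

def kraft_prefix_code(lengths):
    """Build prefix codewords for lengths satisfying inequality (4)."""
    lengths = sorted(lengths)
    assert sum(2.0 ** -l for l in lengths) <= 1.0, "lengths violate (4)"
    codewords = []
    for j, lj in enumerate(lengths):
        wj = sum(2 ** (lj - li) for li in lengths[:j])   # w_j = sum_{i<j} 2^{l_j - l_i}
        codewords.append(format(wj, "b").zfill(lj))      # w_j written in l_j bits
    return codewords

code = kraft_prefix_code([2, 1, 3, 3])
print(code)   # ['0', '10', '110', '111']
# sanity check: no codeword is a prefix of another
assert not any(a != b and b.startswith(a) for a in code for b in code)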
5 Projects and Problems
1. Suppose X is a random variable that takes on values from an
M–letter alphabet. Show that 0 ≤ H(X) ≤ log2 M.
Hints: H(X) = -\sum_{i=1}^{n} P(\alpha_i) \log P(\alpha_i), where M = \{\alpha_1 \times m_1, \alpha_2 \times m_2, \cdots, \alpha_n \times m_n\} (the letter \alpha_i occurring m_i times), \alpha_i \ne \alpha_j \ \forall i \ne j, and \sum_{i=1}^{n} m_i = M. Then

H(X) = -\sum_{i=1}^{n} \frac{m_i}{M} \log \frac{m_i}{M}
= -\frac{1}{M} \sum_{i=1}^{n} m_i (\log m_i - \log M)
= -\frac{1}{M} \sum m_i \log m_i + \frac{1}{M} \log M \sum m_i
= -\frac{1}{M} \sum m_i \log m_i + \log M.
2. Show that for the case where the elements of an observed sequence are iid (independent and identically distributed), the entropy is equal to the first-order entropy.

Hints: The entropy is defined by H(S) = \lim_{n \to \infty} \frac{1}{n} G_n, where

G_n = -\sum_{i_1=1}^{m} \sum_{i_2=1}^{m} \cdots \sum_{i_n=1}^{m} P(X_1 = i_1, X_2 = i_2, \cdots, X_n = i_n) \log P(X_1 = i_1, X_2 = i_2, \cdots, X_n = i_n),

and the first-order entropy is defined by H_1(S) = -\sum P(X_1) \log P(X_1). In the iid case, G_n = -n \sum_{i_1=1}^{m} P(X_1 = i_1) \log P(X_1 = i_1). Prove it!
3. Given an alphabet A = \{a_1, a_2, a_3, a_4\}, find the first-order entropy in the following cases (a small script after this problem list sketches the computation):

(a) P(a_1) = P(a_2) = P(a_3) = P(a_4) = 1/4.
(b) P(a_1) = 1/2, P(a_2) = 1/4, P(a_3) = P(a_4) = 1/8.
(c) P(a_1) = 0.505, P(a_2) = 1/4, P(a_3) = 1/8, P(a_4) = 0.12.
4. Suppose we have a source with a probability model P = \{p_0, p_1, \cdots, p_m\} and entropy H_P. Suppose we have another source with probability model Q = \{q_0, q_1, \cdots, q_m\} and entropy H_Q, where

q_i = p_i, \quad i = 0, 1, \cdots, j - 2, j + 1, \cdots, m,

and

q_j = q_{j-1} = \frac{p_j + p_{j-1}}{2}.

How is H_Q related to H_P (greater, equal, or less)? Prove your answer.

Hints: H_Q - H_P = \sum p_i \log p_i - \sum q_i \log q_i = p_{j-1} \log p_{j-1} + p_j \log p_j - 2 \frac{p_{j-1} + p_j}{2} \log \frac{p_{j-1} + p_j}{2} = \phi(p_{j-1}) + \phi(p_j) - 2 \phi\left( \frac{p_{j-1} + p_j}{2} \right), where \phi(x) = x \log x and \phi''(x) = 1/x > 0 \ \forall x > 0 (so \phi is convex).
5. There are several image and speech files among the accompanying data sets.

(a) Write a program to compute the first-order entropy of some of the image and speech files.
(b) Pick one of the image files and compute its second-order entropy. Comment on the difference between the first-order and second-order entropies.
(c) Compute the entropy of the difference between neighboring pixels for the image you used in part (b). Comment on what you discovered.
6. Conduct an experiment to see how well a model can describe a source.

(a) Write a program that randomly selects letters from a 26-letter alphabet \{a, b, c, \cdots, z\} and forms four-letter words. Form 100 such words and see how many of these words make sense.
(b) Among the accompanying data sets is a file called 4letter.words, which contains a list of four-letter words. Using this file, obtain a probability model for the alphabet. Now repeat part (a), generating the words using the probability model. To pick letters according to a probability model, construct the cumulative distribution function (cdf) F_X(x) = P(X \le x). Using a uniform pseudorandom number generator to generate a value r, where 0 \le r < 1, pick the letter x_k if F_X(x_{k-1}) \le r < F_X(x_k). Compare your results with those of part (a).
(c) Repeat (b) using a single-letter context.
(d) Repeat (b) using a two-letter context.
7. Determine whether the following codes are uniquely decodable:

(a) \{0, 01, 11, 111\}: no (01 11 = 0 111).
(b) \{0, 01, 110, 111\}: no (01 110 = 0 111 0).
(c) \{0, 10, 110, 111\}: yes. Prove it!
(d) \{1, 10, 110, 111\}: no (1 10 = 110).
8. Using a text file, compute the probabilities of each letter p_i.

(a) Assume that we need a codeword of length \log_2 \frac{1}{p_i} to encode the letter i. Determine the number of bits needed to encode the file.
(b) Compute the conditional probability P(i|j) of a letter i given that the previous letter is j. Assume that we need \log_2 \frac{1}{P(i|j)} bits to represent a letter i that follows a letter j. Determine the number of bits needed to encode the file.
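For reference, a minimal Python sketch of the first-order entropy computation used in Problems 3, 5(a), and 8(a); the file name in the commented part is hypothetical.

import math
from collections import Counter

def first_order_entropy(probs):
    """H = -sum_i p_i log2 p_i, in bits per symbol (zero-probability terms skipped)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(first_order_entropy([0.25] * 4))                  # Problem 3(a): 2.0 bits
print(first_order_entropy([0.5, 0.25, 0.125, 0.125]))   # Problem 3(b): 1.75 bits
print(first_order_entropy([0.505, 0.25, 0.125, 0.12]))  # Problem 3(c): about 1.74 bits

# Letter-frequency estimate from a text file (Problems 5(a) and 8):
# with open("some_text_file.txt", "rb") as f:
#     counts = Counter(f.read())
# total = sum(counts.values())
# print(first_order_entropy([c / total for c in counts.values()]))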
6 Huffman coding
6.1 Definition
A minimal variable-length character encoding based on the frequency of each character. First, each character becomes a trivial tree, with the character as the only node. The character's frequency is the tree's frequency. The two trees with the least frequencies are joined with a new root, which is assigned the sum of their frequencies. This is repeated until all characters are in one tree. One code bit represents each level. Thus more frequent characters are near the root and are encoded with few bits, and rare characters are far from the root and are encoded with many bits.
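The procedure just described can be sketched in Python with a binary heap. This is only an illustrative sketch, not the construction used in the next subsection's tables: the probabilities are taken from that example, but tie breaking may differ, so the individual codewords (and even the length distribution) can come out differently while the average length stays at the optimal 2.2 bits/symbol.

import heapq
import itertools

def huffman_code(probs):
    """probs: dict symbol -> probability. Returns dict symbol -> codeword."""
    counter = itertools.count()              # tie breaker, so lists are never compared
    heap = [(p, next(counter), [sym]) for sym, p in probs.items()]
    heapq.heapify(heap)
    code = {sym: "" for sym in probs}
    while len(heap) > 1:
        p0, _, syms0 = heapq.heappop(heap)   # least probable tree
        p1, _, syms1 = heapq.heappop(heap)   # next least probable tree
        for s in syms0:                      # every leaf of a merged subtree
            code[s] = "0" + code[s]          # gets one more leading bit
        for s in syms1:
            code[s] = "1" + code[s]
        heapq.heappush(heap, (p0 + p1, next(counter), syms0 + syms1))
    return code

print(huffman_code({"a1": 0.2, "a2": 0.4, "a3": 0.2, "a4": 0.1, "a5": 0.1}))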
6.2 Example of Huffman encoding design
Alphabet A = \{a_1, a_2, a_3, a_4, a_5\} with P(a_1) = P(a_3) = 0.2, P(a_2) = 0.4, and P(a_4) = P(a_5) = 0.1. The entropy of this source is 2.122 bits/symbol. To design the Huffman code, we first sort the letters in descending probability order, as shown in the table below. Here c(a_i) denotes the codeword for a_i. The two least probable letters, a_4 and a_5, are assigned the codewords

c(a_4) = \alpha_1 * 0
c(a_5) = \alpha_1 * 1

where \alpha_1 is a binary string and * denotes concatenation.
The initial five–letter alphabet
Letter Probability Codeword
a2 0.4 c(a2)
a1 0.2 c(a1)
a3 0.2 c(a3)
a4 0.1 c(a4)
a5 0.1 c(a5)
Now we define a new alphabet A' with four letters a_1, a_2, a_3, a_4', where a_4' is composed of a_4 and a_5 and has probability P(a_4') = P(a_4) + P(a_5) = 0.2. We sort this new alphabet in descending order to obtain the following table.

The reduced four-letter alphabet
Letter Probability Codeword
a2 0.4 c(a2)
a1 0.2 c(a1)
a3 0.2 c(a3)
a4' 0.2 α1
In this alphabet, a_3 and a_4' are the two letters at the bottom of the sorted list. We assign their codewords as

c(a_3) = \alpha_2 * 0
c(a_4') = \alpha_2 * 1,

but c(a_4') = \alpha_1. Therefore,

\alpha_1 = \alpha_2 * 1,

which means that

c(a_4) = \alpha_2 * 10
c(a_5) = \alpha_2 * 11.
At this stage, we again define a new alphabet A'' that consists of three letters a_1, a_2, a_3', where a_3' is composed of a_3 and a_4' and has probability P(a_3') = P(a_3) + P(a_4') = 0.4. We sort this new alphabet in descending order to obtain the following table.

The reduced three-letter alphabet
Letter Probability Codeword
a2 0.4 c(a2)
a3' 0.4 α2
a1 0.2 c(a1)
In this case, the two least probable symbols are a_1 and a_3'. Therefore,

c(a_3') = \alpha_3 * 0
c(a_1) = \alpha_3 * 1.

But c(a_3') = \alpha_2. Therefore,

\alpha_2 = \alpha_3 * 0,

which means that

c(a_3) = \alpha_3 * 00
c(a_4) = \alpha_3 * 010
c(a_5) = \alpha_3 * 011.
Again we define a new alphabet, this time with only two letters a_3'', a_2. Here a_3'' is composed of the letters a_3' and a_1 and has probability P(a_3'') = P(a_3') + P(a_1) = 0.6. We now have the following table.

The reduced two-letter alphabet
Letter Probability Codeword
a3'' 0.6 α3
a2 0.4 c(a2)
As we have only two letters, the codeword assignment is straightforward:

c(a_3'') = 0
c(a_2) = 1,

which means that \alpha_3 = 0, which in turn means that

c(a_1) = 01
c(a_3) = 000
c(a_4) = 0010
c(a_5) = 0011,

and the final Huffman code is given in the following table.
The Huffman code for the original five-letter alphabet
Letter Probability Codeword
a2 0.4 1
a1 0.2 01
a3 0.2 000
a4 0.1 0010
a5 0.1 0011
6.3 Huffman code is optimal
The necessary conditions for an optimal variable-length binary code are as follows:

1. Condition 1: Given any two letters a_j and a_k, if P(a_j) \ge P(a_k), then l_j \le l_k, where l_j is the number of bits in the codeword for a_j.
2. Condition 2: The two least probable letters have codewords with the same maximum length l_m.
3. Condition 3: In the tree corresponding to the optimum code, there must be two branches stemming^3 from each intermediate node.
4. Condition 4: Suppose we change an intermediate node into a leaf node by combining all the leaves descending from it into a composite word of a reduced alphabet. Then, if the original tree was optimal for the original alphabet, the reduced tree is optimal for the reduced alphabet.

The Huffman code satisfies all of the conditions above and is therefore optimal.
6.4 Average codeword length
For a source S with alphabet A = \{a_1, a_2, \cdots, a_K\} and probability model \{P(a_1), P(a_2), \cdots, P(a_K)\}, the average codeword length is given by

\bar{l} = \sum_{i=1}^{K} P(a_i) l_i.
The difference between the entropy of the source H(S) and the average length is

H(S) - \bar{l} = -\sum_{i=1}^{K} P(a_i) \log_2 P(a_i) - \sum_{i=1}^{K} P(a_i) l_i
= \sum P(a_i) \left[ \log_2 \frac{1}{P(a_i)} - l_i \right]
= \sum P(a_i) \left[ \log_2 \frac{1}{P(a_i)} - \log_2 (2^{l_i}) \right]
= \sum P(a_i) \log_2 \left[ \frac{2^{-l_i}}{P(a_i)} \right]
\le \log_2 \left[ \sum 2^{-l_i} \right],

where the last step follows from Jensen's inequality.
6.5 Length of Huffman code
It has been proven that

H(S) \le \bar{l} < H(S) + 1,    (7)

where H(S) is the entropy of the source S.
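As a concrete check of the average length and of bound (7), a short Python sketch using the probabilities and the final Huffman code from Section 6.2 (values taken from the tables above):

import math

probs = {"a1": 0.2, "a2": 0.4, "a3": 0.2, "a4": 0.1, "a5": 0.1}
code  = {"a1": "01", "a2": "1", "a3": "000", "a4": "0010", "a5": "0011"}

avg_len = sum(probs[s] * len(code[s]) for s in probs)       # l-bar
entropy = -sum(p * math.log2(p) for p in probs.values())    # H(S)
print(avg_len, entropy, avg_len - entropy)                  # 2.2, ~2.122, gap ~0.078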
3 stemming: 生じる (to arise)
6.6 Extended Huffman code
Consider an alphabet A = \{a_1, a_2, \cdots, a_m\}. Each element of the sequence is generated independently of the other elements in the sequence. The entropy of this source is given by

H(S) = -\sum_{i=1}^{m} P(a_i) \log_2 P(a_i).
We know that we can generate a Huffman code for this source with rate R such that

H(S) \le R < H(S) + 1.    (8)

We have used the looser bound here; the same argument can be made with the tighter bound. Notice that we have used "rate R" to denote the number of bits per symbol. This is a standard convention in the data compression literature. However, in the communication literature, the word "rate" often refers to the number of bits per second.
Suppose we now encode the sequence by generating one codeword for every n symbols. As there are m^n combinations of n symbols, we will need m^n codewords in our Huffman code. We could generate this code by viewing the m^n symbols as letters of an extended alphabet

A^{(n)} = \{\underbrace{a_1 a_1 \cdots a_1}_{n\ \text{times}},\ a_1 a_1 \cdots a_2,\ \ldots,\ a_1 a_1 \cdots a_m,\ a_1 a_1 \cdots a_2 a_1,\ \ldots,\ a_m a_m \cdots a_m\}

from a source S^{(n)}. Denote the rate for the new source as R^{(n)}. We know

H(S^{(n)}) \le R^{(n)} < H(S^{(n)}) + 1.    (9)
Also,

R = \frac{1}{n} R^{(n)},

and therefore

\frac{H(S^{(n)})}{n} \le R < \frac{H(S^{(n)})}{n} + \frac{1}{n}.

It can be proven that

H(S^{(n)}) = n H(S),

and therefore

H(S) \le R < H(S) + \frac{1}{n}.
6.6.1 Example of an Extended Huffman Code
A = \{a_1, a_2, a_3\} with probability model P(a_1) = 0.8, P(a_2) = 0.02, and P(a_3) = 0.18. The entropy of this source is 0.816 bits per symbol. The Huffman code for this source is shown below:

a1 0
a2 11
a3 10

and the extended (two-symbol) code is
a1a1 0.64 0
a1a2 0.016 10101
a1a3 0.144 11
a2a1 0.016 101000
a2a2 0.0004 10100101
a2a3 0.0036 1010011
a3a1 0.1440 100
a3a2 0.0036 10100100
a3a3 0.0324 1011
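A small Python check of the claim in this section, using the probabilities and the two codes listed above: the single-letter Huffman code spends 1.2 bits/symbol, while the extended code comes much closer to the entropy of 0.816 bits/symbol.

import math

p = {"a1": 0.8, "a2": 0.02, "a3": 0.18}
single = {"a1": "0", "a2": "11", "a3": "10"}
extended = {"a1a1": "0", "a1a2": "10101", "a1a3": "11",
            "a2a1": "101000", "a2a2": "10100101", "a2a3": "1010011",
            "a3a1": "100", "a3a2": "10100100", "a3a3": "1011"}

H = -sum(q * math.log2(q) for q in p.values())
rate_single = sum(p[s] * len(c) for s, c in single.items())
rate_extended = sum(p[s[:2]] * p[s[2:]] * len(c) for s, c in extended.items()) / 2
print(H, rate_single, rate_extended)   # ~0.816, 1.2, ~0.861 bits/symbol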
6.7 Nonbinary Huffman codes
6.8 Adaptive Huffman Coding
7 Arithmetic coding^4
Use the mapping

X(a_i) = i, \quad a_i \in A,    (10)

where A = \{a_1, a_2, \ldots, a_m\} is the alphabet for a discrete source and X is a random variable. This mapping means that given a probability model \mathcal{P} for the source, we also have a probability density function for the random variable,

P(X = i) = P(a_i),

and the cumulative distribution function can be defined as

F_X(i) = \sum_{k=1}^{i} P(X = k).
Define \bar{T}_X(a_i) as

\bar{T}_X(a_i) = \sum_{k=1}^{i-1} P(X = k) + \frac{1}{2} P(X = i)    (11)
              = F_X(i - 1) + \frac{1}{2} P(X = i).    (12)

For each a_i, \bar{T}_X(a_i) will have a unique value. This value can be used as a unique tag for a_i. In general,

\bar{T}_X^{(m)}(x_i) = \sum_{y < x_i} P(y) + \frac{1}{2} P(x_i).    (13)
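A minimal Python sketch of equations (11)-(12), computing the tag of each symbol from its probability; the probabilities reuse the source of Section 6.6.1 purely for illustration.

def tags(probs):
    """probs: list of P(X = k) for k = 1..m. Returns the tag of each symbol."""
    out, F_prev = [], 0.0
    for p in probs:
        out.append(F_prev + 0.5 * p)   # T-bar = F_X(i-1) + P(X = i)/2
        F_prev += p                    # update F_X(i)
    return out

print(tags([0.8, 0.02, 0.18]))   # [0.4, 0.81, 0.91]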
4 Arithmetic: 算数, 算術