1. UNIT IV
SOURCE AND ERROR CONTROL CODING
Entropy, source encoding theorem,
Shannon-Fano coding, Huffman coding,
mutual information, channel capacity,
error control coding, linear block codes,
cyclic codes, ARQ techniques.
2. Information Theory
Mathematical modeling and analysis of a communication
system
Signal Compression and data rate
Entropy and channel capacity
Entropy: probabilistic behavior of the source
Capacity: ability of the channel to convey information
3. Mutual Information
I(x;y) = H(x) - H(x|y)
I(y;x) = H(y) - H(y|x)
Properties
1. Symmetric, i.e. I(x;y) = I(y;x)
2. Always non-negative: I(x;y) >= 0
3. I(x;y) = H(x) + H(y) - H(x,y)
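The three properties can be checked numerically. The sketch below uses a small hypothetical joint distribution for two binary variables (the numbers are illustrative, not from the slides) and computes I(x;y) via property 3, I(x;y) = H(x) + H(y) - H(x,y):

```python
import math

def H(probs):
    """Entropy in bits of a probability list (zero terms contribute 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) for binary X and Y (illustrative).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions of X and Y.
px = [sum(p for (x, y), p in joint.items() if x == xv) for xv in (0, 1)]
py = [sum(p for (x, y), p in joint.items() if y == yv) for yv in (0, 1)]

Hx, Hy = H(px), H(py)
Hxy = H(list(joint.values()))

# Property 3: I(x;y) = H(x) + H(y) - H(x,y).
# This form is symmetric in x and y (property 1) and non-negative (property 2).
I = Hx + Hy - Hxy
print(round(I, 4))
```

Because the formula treats x and y identically, symmetry holds by construction; non-negativity follows from H(x,y) <= H(x) + H(y).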
4. Channel Capacity
Maximum mutual information in a single use of the channel, taken over all input distributions:
C = max over P(xj) of I(x;y)
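The maximization over input distributions can be illustrated on a binary symmetric channel (BSC), a standard example not taken from the slides. The sketch searches over the input probability P(x=1) on a grid and compares the result with the known closed form C = 1 - h2(eps):

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_info_bsc(q, eps):
    """I(x;y) for a BSC with crossover probability eps and P(x=1) = q.
    I(x;y) = H(y) - H(y|x), where H(y|x) = h2(eps)."""
    py1 = q * (1 - eps) + (1 - q) * eps  # P(y = 1)
    return h2(py1) - h2(eps)

eps = 0.1
# C = max over the input distribution of I(x;y): grid search over q.
C = max(mutual_info_bsc(q / 1000, eps) for q in range(1001))
print(round(C, 4), round(1 - h2(eps), 4))
```

The maximum lands at q = 0.5, where the two values agree, matching the definition C = max over P(xj) of I(x;y).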
5. Automatic Repeat Request
Stop and Wait
Continuous ARQ with pull-back (go-back-N)
Continuous ARQ with selective repeat
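The simplest of the three schemes, stop-and-wait, can be sketched as follows. This is an illustrative simulation (the function name and loss-pattern mechanism are assumptions, not from the slides): the sender transmits one frame, waits for an ACK, and retransmits the same frame until the ACK arrives.

```python
def stop_and_wait(frames, loss_pattern):
    """Deliver `frames` over a channel that drops a transmission whenever
    the next value taken from `loss_pattern` is True. Returns the delivered
    frames and the total number of transmissions."""
    delivered, transmissions = [], 0
    losses = iter(loss_pattern)
    for frame in frames:
        while True:
            transmissions += 1
            lost = next(losses, False)  # once the pattern ends, no more losses
            if not lost:
                delivered.append(frame)  # receiver ACKs; sender moves on
                break
            # no ACK before timeout -> retransmit the same frame

    return delivered, transmissions

# f1 is lost twice, so it takes three transmissions; 5 transmissions in total.
delivered, tx = stop_and_wait(["f0", "f1", "f2"], [False, True, True, False])
print(delivered, tx)
```

Go-back-N and selective repeat improve on this by keeping several frames in flight; on loss, go-back-N retransmits from the failed frame onward, while selective repeat resends only the missing frame.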
21. Trellis diagram
Encoder outputs (mod-2 sums): X1 = m, X2 = m + m1 + m2, X3 = m + m2,
where m is the current input bit and m1, m2 are the two previous inputs
(m1 the more recent). State label: m2 m1.
State transition table:
Current | Input m | X1 | X2 = m+m1+m2 | X3 = m+m2 | Next
a (00)  |    0    |  0 |  0+0+0 = 0   |  0+0 = 0  | a (00)
a (00)  |    1    |  1 |  1+0+0 = 1   |  1+0 = 1  | b (01)
b (01)  |    0    |  0 |  0+1+0 = 1   |  0+0 = 0  | c (10)
b (01)  |    1    |  1 |  1+1+0 = 0   |  1+0 = 1  | d (11)
c (10)  |    0    |  0 |  0+0+1 = 1   |  0+1 = 1  | a (00)
c (10)  |    1    |  1 |  1+0+1 = 0   |  1+1 = 0  | b (01)
d (11)  |    0    |  0 |  0+1+1 = 0   |  0+1 = 1  | c (10)
d (11)  |    1    |  1 |  1+1+1 = 1   |  1+1 = 0  | d (11)
The trellis repeats the states a, b, c, d at successive time steps, with
these eight branches connecting each stage to the next.
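The trellis branches above can be reproduced by a direct implementation of the encoder. This is a minimal sketch of the rate-1/3 encoder from the slide (the function name is an assumption); each input bit produces the triple (X1, X2, X3):

```python
def conv_encode(bits):
    """Rate-1/3 convolutional encoder from the slide:
    X1 = m, X2 = m + m1 + m2, X3 = m + m2 (mod-2 sums),
    where m1, m2 are the two previous input bits (m1 the more recent)."""
    m1 = m2 = 0  # start in state a (all-zero registers)
    out = []
    for m in bits:
        x1 = m
        x2 = m ^ m1 ^ m2  # XOR is mod-2 addition
        x3 = m ^ m2
        out.append((x1, x2, x3))
        m1, m2 = m, m1  # shift: m -> m1 -> m2
    return out

# From state a, input 1 gives outputs (1, 1, 1) and moves to state b,
# matching the corresponding branch of the trellis table.
print(conv_encode([1, 0, 1]))
```

Tracing the three input bits through the state table (a -> b -> c -> b) gives the same output triples as the code.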
24. Entropy
H = sum over k of pk log2(1/pk), k = 1 to 6
  = p1 log2(1/p1) + p2 log2(1/p2) + ... + p6 log2(1/p6)
  = 0.3 log2(1/0.3) + 0.1 log2(1/0.1) + ...
Information rate R = H * (number of symbols per second)
  = H * 500 (for a symbol rate of 500)
  = H * 9600 (for a symbol rate of 9600)
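The calculation can be carried out numerically. The slide shows only p1 = 0.3 and p2 = 0.1, so the remaining four probabilities in this sketch are illustrative assumptions chosen to sum to 1:

```python
import math

# Hypothetical six-symbol distribution; only p1 = 0.3 and p2 = 0.1 appear
# on the slide, the rest are illustrative assumptions.
p = [0.3, 0.1, 0.2, 0.2, 0.1, 0.1]
assert abs(sum(p) - 1.0) < 1e-9

# Entropy H = sum over k of pk * log2(1/pk), in bits/symbol.
H = sum(pk * math.log2(1 / pk) for pk in p)

# Information rate R = H * (symbols per second), here for r = 500 symbols/s.
r = 500
R = H * r
print(round(H, 4), round(R, 1))
```

With a symbol rate of 9600 instead of 500, the same entropy simply scales to R = 9600 * H.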