Chapter 4 
Channel Coding 
Copyright © 2002, Dr. Dharma P. Agrawal and Dr. Qing-An Zeng. All rights reserved.
Outline 
- FEC (Forward Error Correction)
- Block Codes
- Convolutional Codes
- Interleaving
- Information Capacity Theorem
- Turbo Codes
- CRC (Cyclic Redundancy Check)
- ARQ (Automatic Repeat Request)
  - Stop-and-wait ARQ
  - Go-back-N ARQ
  - Selective-repeat ARQ
Channel Coding in Digital Communication Systems 
[Block diagram: Transmitter — information to be transmitted → source coding → channel coding → modulation → channel; Receiver — channel → demodulation → channel decoding → source decoding → information received.]
Forward Error Correction (FEC) 
- The key idea of FEC is to transmit enough redundant data to allow the receiver to recover from errors by itself; no retransmission by the sender is required.
- The major categories of FEC codes are
  - Block codes,
  - Cyclic codes,
  - Reed-Solomon codes (not covered here),
  - Convolutional codes, and
  - Turbo codes, etc.
Block Codes 
- Information is divided into blocks of length k.
- r parity (check) bits are added to each block, for a total length n = k + r.
- Code rate R = k/n.
- The decoder looks for the codeword closest to the received vector (code vector + error vector).
- Tradeoffs between
  - Efficiency
  - Reliability
  - Encoding/decoding complexity
Block Codes: Linear Block Codes 
- Linear Block Code
  The code word c(x), or code vector C, of a linear block code is
    c(x) = m(x) g(x)  or  C = m G
  where m(x) (or m) is the information (message) block, g(x) is the generator polynomial, and G is the generator matrix,
    G = [p | I],
  with rows p_i = Remainder of [x^(n-k+i-1)/g(x)] for i = 1, 2, ..., k, and I the identity matrix.
- The parity check matrix is
    H = [I | p^T] (columns ordered to match G = [p | I]), where p^T is the transpose of the matrix p.
Block Codes: Linear Block Codes 
[Diagram: operations of the generator matrix and the parity check matrix — message vector m × generator matrix G → code vector C; code vector C × parity check matrix H^T → null vector 0.]
The parity check matrix H is used to detect errors in the received code, using the fact that c * H^T = 0 (the null vector).
Let x = c + e be the received message, where c is the correct code word and e is the error.
Compute the syndrome S = x * H^T = (c + e) * H^T = c * H^T + e * H^T = e * H^T.
If S is 0, the message is correct; otherwise it contains errors, and the correct message can be decoded from the known error patterns (syndromes).
Block Codes: Example 
Example: Find the linear block code encoder G if the code generator polynomial is g(x) = 1 + x + x^3 for a (7, 4) code.
We have n = total number of bits = 7, k = number of information bits = 4, and r = number of parity bits = n - k = 3.
      [ p_1   1 0 ... 0 ]
G  =  [ p_2   0 1 ... 0 ]  = [P | I_k],
      [  :    :       : ]
      [ p_k   0 0 ... 1 ]

where p_i = Remainder of [x^(n-k+i-1)/g(x)], i = 1, 2, ..., k.
Block Codes: Example (Continued) 
p_1 = Remainder of [x^3/(1 + x + x^3)] = 1 + x       → [110]
p_2 = Remainder of [x^4/(1 + x + x^3)] = x + x^2     → [011]
p_3 = Remainder of [x^5/(1 + x + x^3)] = 1 + x + x^2 → [111]
p_4 = Remainder of [x^6/(1 + x + x^3)] = 1 + x^2     → [101]

      [ 1 1 0 1 0 0 0 ]
G  =  [ 0 1 1 0 1 0 0 ]
      [ 1 1 1 0 0 1 0 ]
      [ 1 0 1 0 0 0 1 ]
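The same construction can be checked numerically. The following sketch (illustrative code, not part of the slides; the message m and all helper names are arbitrary) performs the GF(2) polynomial division for each p_i, assembles G = [P | I_k], and verifies that an encoded message gives a zero syndrome:

def gf2_remainder(dividend, divisor):
    # Remainder of polynomial division over GF(2); coefficient lists, lowest degree first.
    rem = dividend[:]
    for i in range(len(rem) - 1, len(divisor) - 2, -1):
        if rem[i]:                                  # leading term present: subtract shifted divisor
            shift = i - (len(divisor) - 1)
            for j, d in enumerate(divisor):
                rem[shift + j] ^= d
    return rem[:len(divisor) - 1]                   # remainder has degree < deg g(x)

n, k = 7, 4
g = [1, 1, 0, 1]                                    # g(x) = 1 + x + x^3

# p_i = remainder of x^(n-k+i-1) / g(x), i = 1..k
P = [gf2_remainder([0] * (n - k + i - 1) + [1], g) for i in range(1, k + 1)]
G = [P[i] + [1 if j == i else 0 for j in range(k)] for i in range(k)]
for row in G:
    print(row)                                      # rows 1101000, 0110100, 1110010, 1010001

# Parity check matrix with columns ordered to match G = [P | I_k]: H = [I | P^T]
H = [[1 if j == r else 0 for j in range(n - k)] + [P[i][r] for i in range(k)]
     for r in range(n - k)]
m = [1, 0, 1, 1]                                    # arbitrary example message
c = [sum(m[i] * G[i][j] for i in range(k)) % 2 for j in range(n)]      # c = m G
s = [sum(c[j] * H[r][j] for j in range(n)) % 2 for r in range(n - k)]  # s = c H^T
print(c, s)                                         # syndrome s is all zeros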
Cyclic Codes 
A cyclic code is a block code which uses a shift register to perform encoding and decoding.
The code word with n bits is expressed as
  c(x) = c_1 x^(n-1) + c_2 x^(n-2) + … + c_n,
where each c_i is either 1 or 0. The systematic code word is
  c(x) = m(x) x^(n-k) + c_p(x),
where c_p(x) is the remainder from dividing m(x) x^(n-k) by the generator g(x).
The received signal is c(x) + e(x), where e(x) is the error.
To check whether the received signal is error free, the remainder (syndrome) from dividing c(x) + e(x) by g(x) is computed. If it is 0, the received signal is considered error free; otherwise the error pattern is identified from the known error syndromes.
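A short sketch of the encoding and syndrome rules above, using the same (7, 4) generator g(x) = 1 + x + x^3 purely for illustration (polynomials are held as Python integers, bit i being the coefficient of x^i; the message value is arbitrary):

def poly_mod(a, g):
    # Remainder of a(x) divided by g(x) over GF(2), with polynomials as bit masks.
    dg = g.bit_length() - 1
    while a.bit_length() - 1 >= dg:
        a ^= g << (a.bit_length() - 1 - dg)         # cancel the current leading term
    return a

n, k = 7, 4
g = 0b1011                                          # g(x) = 1 + x + x^3
m = 0b0101                                          # example message m(x) = 1 + x^2

cp = poly_mod(m << (n - k), g)                      # c_p(x) = remainder of m(x) x^(n-k) / g(x)
c = (m << (n - k)) | cp                             # c(x) = m(x) x^(n-k) + c_p(x)
print(format(c, "07b"), poly_mod(c, g))             # a valid code word has syndrome 0

received = c ^ (1 << 2)                             # add an error e(x) = x^2
print(poly_mod(received, g))                        # nonzero syndrome exposes the error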
Cyclic Redundancy Check (CRC) 
- With simple parity, some errors are masked; a careful choice of bit combinations leads to better detection.
- Binary (n, k) CRC codes can detect the following error patterns:
  1. All error bursts of length n-k or less.
  2. All combinations of d_min - 1 (minimum Hamming distance minus one) or fewer errors.
  3. All error patterns with an odd number of errors, if the generator polynomial g(x) has an even number of nonzero coefficients.
Common CRC Codes 
Code         Generator polynomial g(x)             Parity check bits
CRC-12       1 + x + x^2 + x^3 + x^11 + x^12       12
CRC-16       1 + x^2 + x^15 + x^16                 16
CRC-CCITT    1 + x^5 + x^12 + x^16                 16
Convolutional Codes 
- Encoding of an information stream rather than information blocks.
- The value of a given information symbol also affects the encoding of the next M information symbols, i.e., the code has memory M.
- Easy implementation using a shift register, assuming k inputs and n outputs.
- Decoding is mostly performed by the Viterbi algorithm (not covered here).
Convolutional Codes: (n=2, k=1, M=2) 
Encoder
[Diagram: the input bit x enters a two-stage shift register with stages D1 and D2 (D_i: register); the two output bits y1 and y2 are formed as modulo-2 sums of x and the register contents and are multiplexed into the coded output c.]
Input: 1 1 1 0 0 0 … 
Output: 11 01 10 01 11 00 … 
Input: 1 0 1 0 0 0 … 
Output: 11 10 00 10 11 00 … 
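The example outputs above can be reproduced with a few lines of code. The sketch below is illustrative: the tap connections y1 = x + D1 + D2 and y2 = x + D2 (modulo 2) are inferred from the input/output pairs on this slide rather than stated there, and they regenerate both sequences:

def conv_encode(bits):
    d1 = d2 = 0                           # shift-register stages, initially zero
    out = []
    for x in bits:
        y1 = x ^ d1 ^ d2                  # first output bit
        y2 = x ^ d2                       # second output bit
        out.append(f"{y1}{y2}")
        d1, d2 = x, d1                    # shift: x enters D1, old D1 moves to D2
    return " ".join(out)

print(conv_encode([1, 1, 1, 0, 0, 0]))    # 11 01 10 01 11 00
print(conv_encode([1, 0, 1, 0, 0, 0]))    # 11 10 00 10 11 00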
State Diagram 
[State diagram of the (2, 1, 2) encoder: four states 00, 01, 10, 11 (the contents of D1 D2); each transition is labelled with the two output bits and the corresponding input bit.]
Tree Diagram 
[Tree diagram of the (2, 1, 2) encoder: starting from the all-zero state, the tree branches on each input bit (0 or 1) and each branch is labelled with the corresponding pair of output bits. Example path: first input … 1 1 0 0 1, first output … 10 11 11 01 11.]
Trellis 
[Trellis diagram of the (2, 1, 2) encoder: the four states are drawn at each time step and the branches between them are labelled with the output bit pairs (00, 01, 10, 11); the path for the input sequence … 1 1 0 0 1 is traced through the trellis.]
Interleaving 
Interleaving (write by rows, read by columns):
  Input Data: a1, a2, a3, a4, a5, a6, a7, a8, a9, …
  Write (row by row):
    a1,  a2,  a3,  a4
    a5,  a6,  a7,  a8
    a9,  a10, a11, a12
    a13, a14, a15, a16
  Read (column by column) → Transmitting Data: a1, a5, a9, a13, a2, a6, a10, a14, a3, …

De-Interleaving (write by columns, read by rows):
  The received data are written back into the array column by column and read out row by row.
  Output Data: a1, a2, a3, a4, a5, a6, a7, a8, a9, …
Interleaving (Example) 
Transmitting Data: 0, 0, 0, 1, 1, 1, 1, 0, 0, 0, 0, …   (a burst error hits four consecutive bits)
De-Interleaving — write column by column:
  0, 1, 0, 0
  0, 1, 0, 0
  0, 1, 0, 0
  1, 0, 0, 0
then read row by row:
Output Data: 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, …   (the burst is spread into discrete single errors)
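The 4 x 4 block interleaver above is easy to sketch in code (the array size and function names are illustrative). Writing by rows and reading by columns, then undoing the permutation at the receiver, spreads the burst of four consecutive errors into isolated single errors, exactly as in the example:

def interleave(data, rows=4, cols=4):
    # Write row by row, read column by column.
    return [data[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(data, rows=4, cols=4):
    # Write column by column, read row by row (inverse of interleave).
    return [data[c * rows + r] for r in range(rows) for c in range(cols)]

received = [0, 0, 0, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0]   # burst error in bits 4-7
print(deinterleave(received))
# -> [0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0]: discrete single errors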
Information Capacity Theorem 
(Shannon Limit) 
- The information capacity (or channel capacity) C of a continuous channel with bandwidth B Hertz, perturbed by additive white Gaussian noise of power spectral density N0/2 and band-limited to B, is

    C = B log2(1 + P/(N0 B))  bits/second

  where P is the average transmitted power, P = Eb Rb (for an ideal system, Rb = C), Eb is the transmitted energy per bit, and Rb is the transmission rate.
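As a quick numeric illustration of the formula (the bandwidth, noise density, and power values below are assumed, not taken from the slides), together with the -1.6 dB Shannon limit obtained by letting Rb → C and B → ∞:

from math import log2, log, log10

B = 1e6                                    # bandwidth in Hz (assumed example value)
N0 = 1e-9                                  # noise power spectral density in W/Hz (assumed)
P = 1e-3                                   # average transmitted power in W (assumed)

C = B * log2(1 + P / (N0 * B))             # channel capacity in bits/second
print(f"C = {C / 1e6:.2f} Mbit/s")

# Shannon limit: as B -> infinity with Rb = C, Eb/N0 approaches ln 2.
print(f"Eb/N0 limit = {10 * log10(log(2)):.2f} dB")   # about -1.59 dB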
Shannon Limit 
[Figure: spectral efficiency Rb/B (0.1 to 20, log scale) versus Eb/N0 in dB (0 to 30). The capacity boundary Rb = C separates the region Rb > C (above) from the region Rb < C (below); the boundary approaches the Shannon limit of Eb/N0 = -1.6 dB.]
Turbo Codes 
- A brief history of turbo codes:
  The turbo code concept was first introduced by C. Berrou in 1993. Today, turbo codes are considered among the most efficient coding schemes for FEC.
- Scheme built from known components (simple convolutional or block codes, interleaver, soft-decision decoder, etc.).
- Performance close to the Shannon limit (Eb/N0 = -1.6 dB as Rb → 0) at modest complexity!
- Turbo codes have been proposed for low-power applications such as deep-space and satellite communications, as well as for interference-limited applications such as third-generation cellular, personal communication services, ad hoc, and sensor networks.
Turbo Codes: Encoder 
[Encoder diagram: the data source output X goes directly to Convolutional Encoder 1, producing Y1, and through an interleaver to Convolutional Encoder 2, producing Y2. The transmitted stream consists of X together with the redundancy Y = (Y1, Y2). X: information; Yi: redundancy information.]
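A structural sketch of this data flow is given below. The component encoders and the interleaver are not specified on the slide, so the sketch simply reuses the earlier (2, 1, 2) feed-forward encoder as a stand-in parity generator and a fixed pseudo-random permutation as the interleaver; practical turbo codes use recursive systematic convolutional component encoders:

import random

def parity_stream(bits):
    # Stand-in component encoder: one parity bit per input bit (illustrative only).
    d1 = d2 = 0
    out = []
    for x in bits:
        out.append(x ^ d1 ^ d2)
        d1, d2 = x, d1
    return out

def turbo_encode(X, perm):
    Y1 = parity_stream(X)                          # Encoder 1 sees X directly
    Y2 = parity_stream([X[i] for i in perm])       # Encoder 2 sees the interleaved X
    return X, Y1, Y2                               # transmitted as (X, Y1, Y2)

X = [1, 0, 1, 1, 0, 0, 1, 0]
perm = list(range(len(X)))
random.Random(0).shuffle(perm)                     # fixed pseudo-random interleaver
print(turbo_encode(X, perm))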
Turbo Codes: Decoder 
[Decoder diagram: Convolutional Decoder 1 operates on the received X and Y1; its output is interleaved and passed, together with Y2, to Convolutional Decoder 2, whose output is de-interleaved and fed back to Decoder 1 for further iterations and, after de-interleaving, delivered as the decoded information X'. X': decoded information.]
Automatic Repeat Request (ARQ) 
[Block diagram: Source → Encoder → Transmit Controller → Modulation (Transmitter) → Channel → Demodulation → Decoder → Transmit Controller → Destination (Receiver); acknowledgements flow from the receiver's transmit controller back to the transmitter's.]
Stop-And-Wait ARQ (SAW ARQ) 
[Timing diagram: the transmitter sends block 1, waits for an ACK, sends block 2, waits, then sends block 3; block 3 arrives in error, a NAK is returned, and block 3 is retransmitted before appearing in the output. ACK: acknowledge; NAK: negative ACK.]
Stop-And-Wait ARQ (SAW ARQ) 
Throughput:
  S = (1/T) * (k/n) = [(1 - Pb)^n / (1 + D * Rb/n)] * (k/n)
where T is the average transmission time in units of a block duration:
  T = (1 + D * Rb/n) * P_ACK + 2 * (1 + D * Rb/n) * P_ACK * (1 - P_ACK)
      + 3 * (1 + D * Rb/n) * P_ACK * (1 - P_ACK)^2 + …
    = (1 + D * Rb/n) / (1 - Pb)^n
where n = number of bits in a block, k = number of information bits in a block,
D = round-trip delay, Rb = bit rate, Pb = BER of the channel, and P_ACK = (1 - Pb)^n.
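Plugging representative numbers into the formula (the block size, BER, delay, and bit rate below are assumed example values, not from the slides):

def saw_throughput(n, k, Pb, D, Rb):
    p_ack = (1 - Pb) ** n                  # probability that a block arrives intact
    T = (1 + D * Rb / n) / p_ack           # average time per delivered block, in block durations
    return (1 / T) * (k / n)

print(saw_throughput(n=1000, k=900, Pb=1e-4, D=10e-3, Rb=1e6))   # about 0.07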
Go-Back-N ARQ (GBN ARQ) 
[Timing diagram: blocks 1-5 are transmitted; block 3 arrives in error, so its NAK makes the transmitter go back and resend from block 3 (blocks 3, 4, 5, then 6, 7); a later error in block 5 makes the transmitter go back and resend from block 5. Only the in-order blocks 1, 2, 3, 4, 5, … are delivered to the output.]
Go-Back-N ARQ (GBN ARQ) 
Throughput:
  S = (1/T) * (k/n) = [(1 - Pb)^n / ((1 - Pb)^n + N * (1 - (1 - Pb)^n))] * (k/n)
where N is the go-back window size and
  T = 1 * P_ACK + (N + 1) * P_ACK * (1 - P_ACK) + (2N + 1) * P_ACK * (1 - P_ACK)^2 + …
    = 1 + (N * [1 - (1 - Pb)^n]) / (1 - Pb)^n
Selective-Repeat ARQ (SR ARQ) 
[Timing diagram: blocks 1-5 are transmitted; block 3 arrives in error and its NAK causes only block 3 to be retransmitted; transmission continues with 6, 7, 8, 9, and when block 7 is in error only block 7 is resent. Correctly received blocks are held in a buffer and reordered so the output delivers 1, 2, 3, 4, 5, 6, 7, 8, 9 in sequence.]
Selective-Repeat ARQ (SR ARQ) 
Throughput:
  S = (1/T) * (k/n) = (1 - Pb)^n * (k/n)
where
  T = 1 * P_ACK + 2 * P_ACK * (1 - P_ACK) + 3 * P_ACK * (1 - P_ACK)^2 + …
    = 1 / (1 - Pb)^n
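The three throughput formulas can be compared side by side. The sketch below uses assumed example parameters (1000-bit blocks, 900 information bits, window N = 12, 10 ms round-trip delay at 1 Mbit/s); for these values selective-repeat performs best and stop-and-wait worst:

def arq_throughputs(n=1000, k=900, Pb=1e-4, D=10e-3, Rb=1e6, N=12):
    q = (1 - Pb) ** n                      # P_ACK: probability of an error-free block
    saw = q / (1 + D * Rb / n) * (k / n)   # stop-and-wait
    gbn = q / (q + N * (1 - q)) * (k / n)  # go-back-N
    sr = q * (k / n)                       # selective-repeat
    return saw, gbn, sr

for Pb in (1e-5, 1e-4, 1e-3):
    print(Pb, arq_throughputs(Pb=Pb))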