Tele4653 l11

Transcript

  • 1. TELE4653 Digital Modulation & Coding: Convolutional Codes. Wei Zhang (w.zhang@unsw.edu.au), School of EET, UNSW.
  • 2. Last time, we talked about: channel coding; linear block codes; their error detection and correction capability; encoding and decoding; Hamming codes.
  • 3. Today, we are going to talk about another class of linear codes, known as convolutional codes. Part I: Encoder. Part II: Decoding.
  • 4. PART I -- Encoder. We'll study the structure of the encoder and different ways of representing it, in particular the state diagram and trellis representations of the code.
  • 5. Convolutional codes. Convolutional codes offer an approach to error control coding substantially different from that of block codes. A convolutional encoder: encodes the entire data stream into a single codeword; does not need to segment the data stream into blocks of fixed size; is a machine with memory.
  • 6. Convolutional codes – cont'd. A convolutional code is specified by three parameters (n, k, K) or (k/n, K), where Rc = k/n is the coding rate, determining the number of data bits per coded bit. In practice, usually k = 1 is chosen. K is the constraint length of the encoder, and the encoder has K-1 memory elements.
  • 7. A Rate ½ Convolutional encoder. Convolutional encoder (rate ½, K = 3): 3 shift-register stages, where the first one takes the incoming data bit and the rest form the memory of the encoder. [Figure: input data bits m enter the shift register; two modulo-2 adders produce the first coded bit u1 and the second coded bit u2, which together form the output branch word u1 u2.]
  • 8. A Rate ½ Convolutional encoder – cont'd. Message sequence: m = (101). Encoding step by step: at t1 the register contents are 1 0 0 and the branch word u1 u2 is 11; at t2 the register holds 0 1 0 and the branch word is 10; at t3 the register holds 1 0 1 and the branch word is 00; at t4 the register holds 0 1 0 and the branch word is 10.
  • 9. A Rate ½ Convolutional encoder – cont'd. At t5 the register holds 0 0 1 and the branch word is 11; at t6 the register has returned to the all-zero state 0 0 0 (branch word 00). The encoder output for m = (101) is U = (11 10 00 10 11).
  • 10. Effective code rate. Initialize the memory before encoding the first bit (all-zero). Clear out the memory after encoding the last bit (all-zero). Hence, a tail of zero-bits is appended to the data bits: data + tail → encoder → codeword. Effective code rate, where L is the number of data bits and k = 1 is assumed: R_eff = L / (n(L + K - 1)) < R_c = 1/n.
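    For the example above (L = 3 data bits, K = 3, n = 2), R_eff = 3 / (2 × (3 + 3 - 1)) = 3/10 = 0.3, which is noticeably smaller than R_c = 1/2; the penalty shrinks as L grows.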
  • 11. Encoder representation. Vector representation: we define n binary vectors with K elements each (one vector for each modulo-2 adder). The i-th element of each vector is "1" if the i-th stage of the shift register is connected to the corresponding modulo-2 adder, and "0" otherwise. Example (the encoder above): g1 = (111) for the adder producing u1, and g2 = (101) for the adder producing u2.
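    As a concrete illustration, here is a minimal Python sketch of this rate ½, K = 3 encoder built directly from the generator vectors g1 = (111) and g2 = (101). The function name and structure are illustrative only, not from the slides; it appends the K - 1 zero tail bits described on the "Effective code rate" slide.

        def conv_encode(data_bits, g1=(1, 1, 1), g2=(1, 0, 1)):
            # Rate-1/2, K=3 convolutional encoder; a tail of K-1 zeros is
            # appended so the encoder returns to the all-zero state.
            K = len(g1)
            register = [0] * K                            # register[0] holds the newest bit
            codeword = []
            for bit in list(data_bits) + [0] * (K - 1):   # data bits followed by the zero tail
                register = [bit] + register[:-1]          # shift the new bit in
                u1 = sum(b * g for b, g in zip(register, g1)) % 2   # first coded bit
                u2 = sum(b * g for b, g in zip(register, g2)) % 2   # second coded bit
                codeword += [u1, u2]
            return codeword

        # Reproduces the slide example: m = (101) -> U = (11 10 00 10 11)
        print(conv_encode([1, 0, 1]))   # [1, 1, 1, 0, 0, 0, 1, 0, 1, 1]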
  • 12. State diagram. A finite-state machine only encounters a finite number of states. State of a machine: the smallest amount of information that, together with a current input to the machine, can predict the output of the machine. In a convolutional encoder, the state is represented by the content of the memory. Hence, there are 2^(K-1) states.
  • 13. State diagram – cont'd. A state diagram is a way to represent the encoder. A state diagram contains all the states and all possible transitions between them. There are only two transitions initiating from a state and only two transitions ending up in a state.
  • 14. State diagram – cont'd. State diagram of the example encoder, with each transition labelled input/output (branch word):
       Current state   Input   Next state   Output (branch word)
       S0 = 00         0       S0           00
       S0 = 00         1       S2           11
       S1 = 01         0       S0           11
       S1 = 01         1       S2           00
       S2 = 10         0       S1           10
       S2 = 10         1       S3           01
       S3 = 11         0       S1           01
       S3 = 11         1       S3           10
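    The same transition table written as a small Python lookup, keyed on (current state, input bit), with each state stored as a two-bit string (an illustrative encoding, not from the slides):

        # (current_state, input_bit) -> (next_state, branch_word)
        TRANSITIONS = {
            ("00", 0): ("00", "00"), ("00", 1): ("10", "11"),
            ("01", 0): ("00", "11"), ("01", 1): ("10", "00"),
            ("10", 0): ("01", "10"), ("10", 1): ("11", "01"),
            ("11", 0): ("01", "01"), ("11", 1): ("11", "10"),
        }

    (The Viterbi decoding sketch given after slide 23 repeats this table so that it stays self-contained.)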
  • 15. Trellis. The trellis diagram is an extension of the state diagram that shows the passage of time. [Figure: one section of the trellis for the rate ½ code between times ti and ti+1, with states S0 = 00, S2 = 10, S1 = 01, S3 = 11 and branches labelled input/output: 0/00, 1/11, 0/11, 1/00, 0/10, 1/01, 0/01, 1/10.]
  • 16. Trellis – cont'd. A trellis diagram for the example code. Input bits 1 0 1 followed by tail bits 0 0; output bits 11 10 00 10 11. [Figure: the full trellis from t1 to t6, with all eight labelled branches repeated in every section.]
  • 17. Trellis – cont'd. [Figure: the same trellis for input bits 1 0 1 and tail bits 0 0 (output bits 11 10 00 10 11), showing only the branches that can actually be traversed when the encoder starts and ends in the all-zero state.]
  • 18. PART II -- Decoding. How is decoding performed for convolutional codes? What is a maximum likelihood decoder? How does the Viterbi algorithm work?
  • 19. Block diagram of the DCS. Information source → rate 1/n convolutional encoder → modulator → channel → demodulator → rate 1/n convolutional decoder → information sink. Input sequence: m = (m1, m2, ..., mi, ...). Codeword sequence: U = G(m) = (U1, U2, U3, ..., Ui, ...), where Ui = (u1i, ..., uji, ..., uni) is the branch word (n coded bits). Received sequence: Z = (Z1, Z2, Z3, ..., Zi, ...), where Zi = (z1i, ..., zji, ..., zni) collects the n demodulator outputs for branch word i. The decoder delivers m̂ = (m̂1, m̂2, ..., m̂i, ...) to the information sink.
  • 20. Optimum decoding. If the input message sequences are equally likely, the optimum decoder, which minimizes the probability of error, is the maximum likelihood (ML) decoder. The ML decoder selects, among all possible codewords, the codeword that maximizes the likelihood function p(Z | U^(m)), where Z is the received sequence and U^(m) is one of the possible codewords: 2^L codewords to search! ML decoding rule: choose U^(m) if p(Z | U^(m)) = max over all U^(m') of p(Z | U^(m')).
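    A short bridge to the metric used later (the channel model here is an assumption not stated on this slide): for a memoryless binary symmetric channel with crossover probability p < 1/2, a codeword U^(m) that differs from Z in d(Z, U^(m)) of the N transmitted bits has log p(Z | U^(m)) = N log(1 - p) - d(Z, U^(m)) log((1 - p)/p). Since log((1 - p)/p) > 0, maximizing the likelihood is equivalent to minimizing the Hamming distance d(Z, U^(m)), which is why the Viterbi example below uses Hamming distances as branch metrics.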
  • 21. The Viterbi algorithm. The Viterbi algorithm performs maximum likelihood decoding. It finds a path through the trellis with the largest metric (maximum correlation or minimum distance). It processes the demodulator outputs in an iterative manner. At each step in the trellis, it compares the metrics of all paths entering each state, and keeps only the path with the largest metric, called the survivor, together with its metric. It proceeds through the trellis by eliminating the least likely paths. It reduces the decoding complexity to L · 2^(K-1)!
  • 22. The Viterbi algorithm – cont'd. Viterbi algorithm: A. Do the following set-up: for a data block of L bits, form the trellis; the trellis has L + K - 1 sections (levels), starting at time t1 and ending at time t_(L+K). Label all the branches in the trellis with their corresponding branch metrics. For each state in the trellis at time ti, denoted S(ti) ∈ {0, 1, ..., 2^(K-1) - 1}, define a parameter Γ(S(ti), ti). B. Then, do the following:
  • 23. The Viterbi algorithm – cont'd. 1. Set Γ(0, t1) = 0 and i = 2. 2. At time ti, compute the partial path metrics for all the paths entering each state. 3. Set Γ(S(ti), ti) equal to the best partial path metric entering each state at time ti; keep the survivor path and delete the dead paths from the trellis. 4. If i < L + K, increase i by 1 and return to step 2. C. Start at state zero at time t_(L+K). Follow the surviving branches backwards through the trellis. The path thus defined is unique and corresponds to the ML codeword.
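    A minimal hard-decision Python sketch of these steps for the example rate ½, K = 3 code, using the Hamming distance as the branch metric and the TRANSITIONS table from the state-diagram sketch above. Names and structure are illustrative, not from the slides; it assumes the encoder is driven back to the all-zero state by K - 1 tail bits.

        from math import inf

        # (current_state, input_bit) -> (next_state, branch_word) for the rate-1/2, K=3 code
        TRANSITIONS = {
            ("00", 0): ("00", "00"), ("00", 1): ("10", "11"),
            ("01", 0): ("00", "11"), ("01", 1): ("10", "00"),
            ("10", 0): ("01", "10"), ("10", 1): ("11", "01"),
            ("11", 0): ("01", "01"), ("11", 1): ("11", "10"),
        }
        STATES = ["00", "01", "10", "11"]

        def hamming(a, b):
            # Branch metric: Hamming distance between two equal-length bit strings.
            return sum(x != y for x, y in zip(a, b))

        def viterbi_decode(received, L):
            # `received` is a list of 2-bit strings (one per branch word);
            # L is the number of data bits; the last K-1 = 2 branches carry tail bits.
            metric = {s: (0 if s == "00" else inf) for s in STATES}   # Gamma(S(t1), t1)
            history = []                    # survivor (previous_state, input_bit) per section
            for step, z in enumerate(received):
                new_metric = {s: inf for s in STATES}
                survivors = {}
                inputs = [0] if step >= L else [0, 1]   # tail sections: input forced to 0
                for state in STATES:
                    if metric[state] == inf:
                        continue
                    for bit in inputs:
                        nxt, out = TRANSITIONS[(state, bit)]
                        m = metric[state] + hamming(z, out)
                        if m < new_metric[nxt]:         # keep only the best path into nxt
                            new_metric[nxt] = m
                            survivors[nxt] = (state, bit)
                metric = new_metric
                history.append(survivors)
            # Trace back from the all-zero state at the end of the trellis.
            state, bits = "00", []
            for survivors in reversed(history):
                prev_state, bit = survivors[state]
                bits.append(bit)
                state = prev_state
            bits.reverse()
            return bits[:L]                 # drop the tail bits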
  • 24. Example of Viterbi decoding. m = (101): source message. U = (11 10 00 10 11): codeword to be transmitted. Z = (11 10 11 10 01): received codeword. [Figure: the pruned trellis of slide 17, times t1 to t6, with branches labelled input/output.]
  • 25. Example of Viterbi decoding – cont'd. Label all the branches with the branch metric (the Hamming distance between the received branch word and the branch output), then compute the state metrics Γ(S(ti), ti). [Figure: the trellis with branch metrics attached.]
  • 26. Example of Viterbi decoding – cont'd. i = 2: the partial path metrics of the paths entering the states at t2 are Γ(S0, t2) = 2 and Γ(S2, t2) = 0.
  • 27. Example of Viterbi decoding – cont'd. i = 3: the surviving path metrics at t3 are Γ(S0, t3) = 3, Γ(S1, t3) = 0, Γ(S2, t3) = 3 and Γ(S3, t3) = 2.
  • 28. Example of Viterbi decoding – cont'd. i = 4: the surviving path metrics at t4 are Γ(S0, t4) = 0, Γ(S1, t4) = 3, Γ(S2, t4) = 2 and Γ(S3, t4) = 3.
  • 29. Example of Viterbi decoding – cont'd. i = 5: because of the tail bits only S0 and S1 can be reached; Γ(S0, t5) = 1 and Γ(S1, t5) = 2.
  • 30. Example of Viterbi decoding – cont'd. i = 6: only S0 remains; the final path metric is Γ(S0, t6) = 2.
  • 31. Example of Viterbi decoding – cont'd. Trace back along the survivor path: m̂ = (100) and Û = (11 10 11 00 00). There are decoding errors: m̂ differs from the transmitted m = (101) in the third data bit.
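    Running the decoding sketch given after slide 23 on this received sequence reproduces the slide's result (illustrative usage only):

        Z = ["11", "10", "11", "10", "01"]   # received codeword from the example
        print(viterbi_decode(Z, L=3))        # -> [1, 0, 0], i.e. m_hat = (100)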