IT Lecture 13a

  1. Convolutional Codes: Representation and Encoding
     - Many known codes can be modified by adding an extra code symbol or by deleting a symbol.
       * This can create codes of almost any desired rate.
       * This can create codes with slightly improved performance.
     - The resulting code can usually be decoded with only a slight modification to the decoder algorithm.
     - Sometimes the modification process can be applied multiple times in succession.
     (Error Control Coding, © Brian D. Woerner; reproduced by Erhan A. INCE)
  2. Modifications to Known Codes
     1. Puncturing: delete a parity symbol
        (n,k) code → (n-1,k) code
     2. Shortening: delete a message symbol
        (n,k) code → (n-1,k-1) code
     3. Expurgating: delete some subset of codewords
        (n,k) code → (n,k-1) code
     4. Extending: add an additional parity symbol
        (n,k) code → (n+1,k) code
  3. Modifications to Known Codes (continued)
     5. Lengthening: add an additional message symbol
        (n,k) code → (n+1,k+1) code
     6. Augmenting: add a subset of additional codewords
        (n,k) code → (n,k+1) code
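The parameter bookkeeping for these six modifications is easy to mechanize. The following minimal Python sketch (an illustration added to this transcript, not part of the original slides) tabulates how each modification changes (n, k), using the (7,4) Hamming code only as an example starting point.

```python
# How each of the six modifications changes the (n, k) parameters of a code.
MODIFICATIONS = {
    "puncture":  lambda n, k: (n - 1, k),      # delete a parity symbol
    "shorten":   lambda n, k: (n - 1, k - 1),  # delete a message symbol
    "expurgate": lambda n, k: (n, k - 1),      # delete a subset of codewords
    "extend":    lambda n, k: (n + 1, k),      # add a parity symbol
    "lengthen":  lambda n, k: (n + 1, k + 1),  # add a message symbol
    "augment":   lambda n, k: (n, k + 1),      # add a subset of codewords
}

n, k = 7, 4  # the (7,4) Hamming code, used only as an example
for name, modify in MODIFICATIONS.items():
    print(f"{name:9s}: ({n},{k}) -> {modify(n, k)}")
# e.g. extending the (7,4) Hamming code gives the well-known (8,4) extended Hamming code
```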
  4. Interleaving
     - So far we have assumed that bit errors are independent from one bit to the next.
     - In mobile radio, fading makes bursts of errors likely.
     - Interleaving is used to try to make these errors independent again (see the sketch below).
     [Figure: interleaver of a given depth and length, showing the order in which bits are transmitted versus the order in which they are received]
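A block interleaver writes the coded bits into an array by rows and transmits them by columns, so a burst of channel errors is spread across many rows after de-interleaving. Here is a minimal sketch, assuming a simple row/column block interleaver; the exact depth and length used in the slide's figure are not reproduced here.

```python
def interleave(bits, depth, length):
    """Write bits row by row into a depth x length array, read them out column by column."""
    assert len(bits) == depth * length
    rows = [bits[i * length:(i + 1) * length] for i in range(depth)]
    return [rows[r][c] for c in range(length) for r in range(depth)]

def deinterleave(bits, depth, length):
    """Invert interleave(): write column by column, read row by row."""
    assert len(bits) == depth * length
    cols = [bits[i * depth:(i + 1) * depth] for i in range(length)]
    return [cols[c][r] for r in range(depth) for c in range(length)]

data = list(range(20))                       # 20 "bits", interleaved with depth 4 and length 5
tx = interleave(data, depth=4, length=5)
assert deinterleave(tx, depth=4, length=5) == data
# A burst of 4 consecutive channel errors in `tx` hits 4 different rows,
# i.e. at most one error per row (per codeword) after de-interleaving.
```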
  5. Concatenated Codes
     - Two levels of coding.
     - Achieves the performance of very long codes while maintaining shorter decoding complexity.
     - The overall rate is the product of the individual code rates.
     - A codeword error occurs only if both codes fail.
     - The error probability is found by first evaluating the error probability of the "inner" decoder and then evaluating the error probability of the "outer" decoder.
     - Interleaving is always used with concatenated coding.
  6. Block Diagram of a Concatenated Coding System
     Data Bits → Outer Encoder → Interleaver → Inner Encoder → Modulator → Channel →
     Demodulator → Inner Decoder → De-interleaver → Outer Decoder → Data Out
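To make the chain concrete, here is a toy end-to-end sketch added to this transcript. It uses deliberately simple codes (a (3,2) single-parity-check outer code and a (3,1) repetition inner code) rather than the RS and convolutional codes of the real systems on the next slides, and omits the interleaver for brevity; it illustrates that the overall rate is the product of the two code rates.

```python
def outer_encode(bits):                        # toy (3,2) code: append one parity bit per 2 data bits
    out = []
    for i in range(0, len(bits), 2):
        a, b = bits[i], bits[i + 1]
        out += [a, b, a ^ b]
    return out

def inner_encode(bits):                        # toy (3,1) code: repeat every bit three times
    return [b for bit in bits for b in (bit,) * 3]

data = [1, 0, 1, 1]                            # 4 data bits
coded = inner_encode(outer_encode(data))       # interleaver between the two encoders omitted
print(len(data) / len(coded))                  # overall rate = (2/3) * (1/3) = 2/9 ≈ 0.222
```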
  7. Practical Application: Coding for CDs
     - Each audio channel is sampled at 44,100 samples/second.
     - Each sample is quantized with 16 bits.
     - Uses a concatenated Reed-Solomon (RS) code:
       * Both codes are constructed over GF(256) (8 bits/symbol).
       * The outer code is a (28,24) shortened RS code.
       * The inner code is a (32,28) extended RS code.
       * Between the two coders is a (28,4) cross-interleaver.
       * The overall code rate is r = 0.75.
     - Most commercial CD players do not exploit the full power of the error-correction code.
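A quick check (added here) of the overall rate quoted above: both RS codes use the same 8-bit symbols, so the rates multiply directly.

```python
outer_rate = 24 / 28             # (28,24) shortened RS code
inner_rate = 28 / 32             # (32,28) extended RS code
print(outer_rate * inner_rate)   # 0.75, matching the slide
```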
  8. Practical Application: Galileo Deep-Space Probe
     - Uses concatenated coding:
       * The inner code is a rate-1/2, constraint-length-7 convolutional code.
       * The outer code is a (255,223) RS code over GF(256), which corrects the burst errors left by the convolutional decoder.
       * The overall code rate is r ≈ 0.437.
       * A block interleaver holds 2 RS codewords.
       * The deep-space channel is severely energy limited but not bandwidth limited.
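The same kind of check (added here) works for the Galileo system: the rate-1/2 inner convolutional code and the (255,223) outer RS code multiply to the rate quoted on the slide.

```python
print((223 / 255) * (1 / 2))   # ≈ 0.4373, i.e. r ≈ 0.437
```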
  9. IS-95 CDMA
     - The IS-95 standard employs the (64,6) orthogonal (Walsh) code on the reverse link.
     - The inner Walsh code is concatenated with a rate-1/3, constraint-length-9 convolutional code.
     Data Transmission in 3rd-Generation PCS
     - The proposed ETSI standard employs RS codes concatenated with convolutional codes for data communication.
     - Requirements:
       * BER on the order of 10^-6
       * Moderate latency is acceptable
     - CDMA2000 uses turbo codes for data transmission.
       * ETSI has optional provisions for turbo coding.
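For illustration (added here, not from the slides), the 64 codewords of a (64,6) orthogonal Walsh code can be generated as rows of a 64 x 64 Hadamard matrix built by the Sylvester construction; each 6-bit message selects one row.

```python
def hadamard_rows(order):
    """Rows of a 2^order x 2^order Sylvester Hadamard matrix, with +1 mapped to 0 and -1 to 1."""
    rows = [[0]]
    for _ in range(order):
        rows = [r + r for r in rows] + [r + [b ^ 1 for b in r] for r in rows]
    return rows

walsh = hadamard_rows(6)                 # 64 codewords, 64 bits each
codeword = walsh[0b101101]               # the 6 message bits select one of the 64 rows
# Every nonzero codeword has Hamming weight 32, so each one differs from the
# all-zero codeword (row 0) in exactly half of its positions.
assert all(sum(w) == 32 for w in walsh[1:])
```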
  10. A Common Theme from Coding Theory
     - The real issue is the complexity of the decoder.
       * For a binary code, we must match 2^n possible received sequences with codewords.
     - Only a few practical decoding algorithms have been found:
       * the Berlekamp-Massey algorithm for block codes (BCH and Reed-Solomon codes)
       * the Viterbi algorithm (and similar techniques) for convolutional codes
     - Code designers have focused on finding new codes that work with known algorithms.
  11. Block versus Convolutional Codes
     - Block codes take k input bits and produce n output bits, where k and n are large.
       * There is no data dependency between blocks.
       * Useful for data communications.
     - Convolutional codes take a small number of input bits and produce a small number of output bits in each time period.
       * Data passes through a convolutional code in a continuous stream.
       * Useful for low-latency communications.
  12. Convolutional Codes
     - k bits are input, n bits are output.
     - Now k and n are very small (usually k = 1-3, n = 2-6).
     - The output depends not only on the current set of k input bits, but also on past inputs.
     - The number of input bits on which the output depends is called the "constraint length" K.
     - Frequently we will see that k = 1.
  13. Example of a Convolutional Code
     [Figure: encoder block diagram of a k=1, n=2, K=3 convolutional code]
  14. Example of a Convolutional Code
     [Figure: encoder block diagram of a k=2, n=3, K=2 convolutional code]
  15. Representations of Convolutional Codes
     - Encoder block diagram (shown above)
     - Generator representation
     - Trellis representation
     - State diagram representation
  16. Convolutional Code Generators
     - There is one generator vector for each of the n output bits.
     - The length of the generator vector for a rate r = k/n code with constraint length K is K.
     - The bits in the generator, read from left to right, represent the connections in the encoder circuit: a "1" represents a tap from the shift register, a "0" represents no tap.
     - Generator vectors are often given in octal representation (see the sketch below).
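As a small sketch added here, an octal generator can be expanded into its tap pattern. The particular K=3 generators 7 and 5 (octal) are an assumption used only as an example, since the slide's figures are not reproduced in this transcript.

```python
def taps(generator_octal, K):
    """Return the K tap bits of a generator, current input first, oldest register stage last."""
    return [(generator_octal >> (K - 1 - i)) & 1 for i in range(K)]

print(taps(0o7, 3))   # [1, 1, 1] -> connect the current input and both register stages
print(taps(0o5, 3))   # [1, 0, 1] -> connect the current input and the oldest register stage
```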
  17. Example of a Convolutional Code
     [Figure: k=1, n=2, K=3 convolutional code]
  18. Example of a Convolutional Code
     [Figure: k=2, n=3, K=2 convolutional code]
  19. State Diagram Representation
     - The contents of the shift registers make up the "state" of the code:
       * The most recent input is the most significant bit of the state.
       * The oldest input is the least significant bit of the state.
       * (This convention is sometimes reversed.)
     - Arcs connecting states represent allowable transitions.
       * Arcs are labeled with the output bits transmitted during the transition.
  20. Example of the State Diagram Representation of a Convolutional Code
     [Figure: state diagram of a k=1, n=2, K=3 convolutional code]
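The state diagram can also be listed as a transition table. The sketch below (added here) does this for a k=1, n=2, K=3 encoder; the generators 5 and 7 (octal) are an assumption, since the slide's diagram is not reproduced.

```python
K = 3
GENS = (0o5, 0o7)          # assumed generators for this illustration

def step(state, u):
    """One encoder step: `state` holds the K-1 previous inputs, most recent first."""
    window = [u] + list(state)
    out = []
    for g in GENS:
        taps = [(g >> (K - 1 - i)) & 1 for i in range(K)]
        out.append(sum(t & b for t, b in zip(taps, window)) % 2)
    return (u, state[0]), out              # (next state, output bits)

for state in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    for u in (0, 1):
        nxt, out = step(state, u)
        print(f"state {state}, input {u} -> next state {nxt}, output {out}")
```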
  21. Trellis Representation of a Convolutional Code
     - The state diagram is "unfolded" as a function of time.
     - Time is indicated by movement towards the right.
     - The contents of the shift registers make up the "state" of the code:
       * The most recent input is the most significant bit of the state.
       * The oldest input is the least significant bit of the state.
     - Allowable transitions are denoted by connections between states.
       * Transitions may be labeled with the transmitted bits.
  22. Example of a Trellis Diagram
     [Figure: trellis diagram of a k=1, n=2, K=3 convolutional code]
  23. Encoding Example Using the Trellis Representation
     k=1, n=2, K=3 convolutional code
     - We begin in state 00.
     - Input data: 0 1 0 1 1 0 0
     - Output:     00 11 01 00 10 10 11
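The sketch below (added here) reproduces this encoding example in code. The generator polynomials and their output order, 5 then 7 in octal, are an assumption chosen because they reproduce the output sequence shown on the slide.

```python
def conv_encode(bits, gens=(0o5, 0o7), K=3):
    """Rate-1/n feed-forward convolutional encoder, starting from the all-zero state."""
    state = [0] * (K - 1)                    # the K-1 previous inputs, most recent first
    out = []
    for u in bits:
        window = [u] + state
        for g in gens:
            taps = [(g >> (K - 1 - i)) & 1 for i in range(K)]
            out.append(sum(t & b for t, b in zip(taps, window)) % 2)
        state = [u] + state[:-1]             # shift the register
    return out

coded = conv_encode([0, 1, 0, 1, 1, 0, 0])
print(" ".join(f"{a}{b}" for a, b in zip(coded[::2], coded[1::2])))
# -> 00 11 01 00 10 10 11   (the output sequence on the slide)
```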
  24. Distance Structure of a Convolutional Code
     - The Hamming distance between any two distinct code sequences is the number of bits in which they differ.
     - The minimum free Hamming distance d_free of a convolutional code is the smallest Hamming distance separating any two distinct code sequences.
  25. Search for Good Codes
     - We would like convolutional codes with a large free distance.
       * We must avoid "catastrophic" codes.
     - Generators for the best convolutional codes are generally found via computer search:
       * The search is constrained to codes with a regular structure.
       * The search is simplified because any permutation of identical generators gives an equivalent code.
       * The search is simplified by linearity: d_free is the minimum weight of any nonzero code sequence (a brute-force sketch follows below).
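As a rough illustration of such a search (added here), d_free of a k=1, n=2, K=3 code can be estimated by brute force: by linearity it is the minimum weight over nonzero terminated code sequences, so we enumerate short inputs that start with a 1 and end with K-1 zero tail bits. The generators 5 and 7 (octal) are again an assumption.

```python
from itertools import product

def conv_encode(bits, gens=(0o5, 0o7), K=3):
    """Same encoder as in the earlier sketch, repeated so this example is self-contained."""
    state = [0] * (K - 1)
    out = []
    for u in bits:
        window = [u] + state
        for g in gens:
            taps = [(g >> (K - 1 - i)) & 1 for i in range(K)]
            out.append(sum(t & b for t, b in zip(taps, window)) % 2)
        state = [u] + state[:-1]
    return out

d_free = min(
    sum(conv_encode([1, *body, 0, 0]))       # weight of one nonzero, terminated code sequence
    for length in range(8)                   # inputs of modest length are enough here
    for body in product((0, 1), repeat=length)
)
print(d_free)   # 5, the known free distance of the (5,7) code
```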
  26. Best Rate-1/2 Codes
     [Table not reproduced in this transcript]
  27. Best Rate-1/3 Codes
     [Table not reproduced in this transcript]
  28. Best Rate-2/3 Codes
     [Table not reproduced in this transcript]
  29. Summary of Convolutional Codes
     - Convolutional codes are useful for real-time applications because they can be continuously encoded and decoded.
     - We can represent convolutional codes with generators, block diagrams, state diagrams, and trellis diagrams.
     - We want to design convolutional codes that maximize the free distance while remaining non-catastrophic.
