IT Lecture 13a

Transcript

  • 1. Convolutional Codes: Representation and Encoding
    • Many known codes can be modified by adding an extra code symbol or by deleting a symbol
      • Can create codes of almost any desired rate
      • Can create codes with slightly improved performance
    • The resulting code can usually be decoded with only a slight modification to the decoder algorithm.
    • Sometimes the modification process can be applied multiple times in succession
    Error Control Coding , © Brian D. Woerner , reproduced by: Erhan A. INCE
  • 2. Modifications to Known Codes
    • 1. Puncturing: delete a parity symbol
      • (n,k) code → (n-1,k) code
    • 2. Shortening: delete a message symbol
      • (n,k) code → (n-1,k-1) code
    • 3. Expurgating: delete some subset of codewords
      • (n,k) code → (n,k-1) code
    • 4. Extending: add an additional parity symbol
      • (n,k) code → (n+1,k) code
  • 3. Modifications to Known Codes…
    • 5. Lengthening: add an additional message symbol
      • (n,k) code → (n+1,k+1) code
    • 6. Augmenting: add a subset of additional code words
      • (n,k) code → (n,k+1) code
    • (A short sketch of how each of the six modifications changes the code parameters follows this list.)
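    A minimal Python sketch of how each of the six modifications changes the code parameters and rate, applied to a hypothetical (7,4) code (the starting code is only an illustrative assumption, not taken from the slides):

    ```python
    # Effect of each modification on an (n, k) code and on its rate k/n.
    # The (7, 4) starting point below is only an illustrative assumption.
    def modify(n, k, operation):
        """Return the (n, k) parameters after one modification."""
        effects = {
            "puncture":  (n - 1, k),      # delete a parity symbol
            "shorten":   (n - 1, k - 1),  # delete a message symbol
            "expurgate": (n,     k - 1),  # delete a subset of codewords
            "extend":    (n + 1, k),      # add a parity symbol
            "lengthen":  (n + 1, k + 1),  # add a message symbol
            "augment":   (n,     k + 1),  # add a subset of codewords
        }
        return effects[operation]

    for op in ("puncture", "shorten", "expurgate", "extend", "lengthen", "augment"):
        n2, k2 = modify(7, 4, op)
        print(f"{op:9s}: (7,4) -> ({n2},{k2}), rate = {k2}/{n2} = {k2 / n2:.3f}")
    ```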
  • 4. Interleaving
    • We have assumed so far that bit errors are independent from one bit to the next
    • In mobile radio, fading makes bursts of errors likely.
    • Interleaving is used to try to make these errors independent again (see the interleaver sketch below)
    [Figure: Depth of Interleaving. A block interleaver; consecutive input bits 1, 2, 3, … are transmitted in the order 1, 6, 11, 16, 21, 26, 31, …; labels: Length, Order Bits Transmitted, Order Bits Received]
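    A minimal block-interleaver sketch in Python: bits are written into a matrix row by row and read out column by column, so a burst of consecutive channel errors is spread across the de-interleaved stream. The 7×5 dimensions are an assumption chosen so the transmitted order matches the figure (1, 6, 11, 16, 21, 26, 31, …).

    ```python
    # Block interleaver: write row by row, read column by column.
    # depth = 7 rows, length = 5 columns is an illustrative assumption.
    def interleave(bits, depth, length):
        assert len(bits) == depth * length
        rows = [bits[r * length:(r + 1) * length] for r in range(depth)]
        return [rows[r][c] for c in range(length) for r in range(depth)]

    def deinterleave(bits, depth, length):
        assert len(bits) == depth * length
        cols = [bits[c * depth:(c + 1) * depth] for c in range(length)]
        return [cols[c][r] for r in range(depth) for c in range(length)]

    data = list(range(1, 36))                  # 35 "bits" labelled 1..35
    tx = interleave(data, depth=7, length=5)
    print(tx[:7])                              # [1, 6, 11, 16, 21, 26, 31]
    assert deinterleave(tx, depth=7, length=5) == data
    ```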
  • 5. Concatenated Codes
    • Two levels of coding
    • Achieves the performance of very long codes while keeping decoding complexity manageable
    • Overall rate is the product of the individual code rates
    • A codeword error occurs only if both codes fail.
    • The error probability is found by first evaluating the error probability of the “inner” decoder and then evaluating the error probability of the “outer” decoder.
    • Interleaving is always used with concatenated coding
  • 6. Block Diagram of a Concatenated Coding System: Data Bits → Outer Encoder → Interleaver → Inner Encoder → Modulator → Channel → Demodulator → Inner Decoder → De-interleaver → Outer Decoder → Data Out
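    A minimal sketch of the transmit side of the diagram above as a composition of functions; outer_encode, interleave and inner_encode are hypothetical placeholders standing in for whatever concrete codes are used:

    ```python
    # Transmit chain of a concatenated coding system.
    # The three stage functions are hypothetical placeholders; only the
    # composition order (outer code, then interleaver, then inner code) matters.
    def concatenated_encode(data_bits, outer_encode, interleave, inner_encode):
        outer_codeword = outer_encode(data_bits)   # outer code first
        shuffled = interleave(outer_codeword)      # interleaver between the codes
        return inner_encode(shuffled)              # inner code seen by the channel
    ```

    The receiver applies the inverse stages in reverse order: inner decoder, de-interleaver, outer decoder.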
  • 7. Practical Application: Coding for CD
    • Each channel is sampled at 44,100 samples/second
    • Each sample is quantized with 16 bits
    • Uses a concatenated RS code
      • Both codes are constructed over GF(256) (8 bits/symbol)
      • Outer code is a (28,24) shortened RS code
      • Inner code is a (32,28) extended RS code
      • Between the two coders is a (28,4) cross-interleaver
      • Overall code rate is r = 0.75 (see the check below)
    • Most commercial CD players don’t exploit the full power of the error-correction code
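    As a quick check on the quoted rate: the overall rate is the product of the two RS code rates, (24/28) × (28/32) = 24/32 = 3/4 = 0.75.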
  • 8. Practical Application: Galileo Deep Space Probe
    • Uses concatenated coding
      • Inner code is a rate 1/2, constraint length 7 convolutional encoder
      • Outer code is a (255,223) RS code over GF(256) – it corrects the burst errors left by the convolutional (inner) decoder
      • Overall code rate is r = 0.437 (checked below)
      • A block interleaver holds 2 RS codewords
      • The deep-space channel is severely energy limited but not bandwidth limited
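    The quoted overall rate is again the product of the two code rates: (1/2) × (223/255) ≈ 0.437.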
  • 9. IS-95 CDMA
    • The IS-95 standard employs a (64,6) orthogonal (Walsh) code on the reverse link
    • The inner Walsh code is concatenated with a rate 1/3, constraint length 9 convolutional code
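    Treating the Walsh code as a (64,6) block code, a back-of-the-envelope overall rate for this reverse-link chain is the product of the two rates, (1/3) × (6/64) = 1/32; this figure is inferred from the parameters above rather than quoted on the slide.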
    Data Transmission in a 3rd Generation PCS
    • The proposed ETSI standard employs RS codes concatenated with convolutional codes for data communication
    • Requirements:
      • BER of the order of 10^-6
      • Moderate latency is acceptable
    • CDMA2000 uses turbo codes for data transmission
      • ETSI has optional provisions for turbo coding
  • 10. A Common Theme from Coding Theory
    • The real issue is the complexity of the decoder.
      • For a binary code, we must match 2^n possible received sequences with codewords (a numerical example follows)
    • Only a few practical decoding algorithms have been found:
      • Berlekamp-Massey algorithm for block codes
      • Viterbi algorithm (and similar techniques) for convolutional codes
    • Code designers have focused on finding new codes that work with known algorithms
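    To see why brute-force matching is hopeless: even for a modest block length of n = 100 there are 2^100 ≈ 1.3 × 10^30 possible received sequences to compare against the codebook.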
  • 11. Block Versus Convolutional Codes
    • Block codes take k input bits and produce n output bits, where k and n are large
      • there is no data dependency between blocks
      • useful for data communications
    • Convolutional codes take a small number of input bits and produce a small number of output bits each time period
      • data passes through convolutional codes in a continuous stream
      • useful for low-latency communications
  • 12. Convolutional Codes
    • k bits are input, n bits are output
    • Now k and n are very small (usually k = 1-3, n = 2-6)
    • The output depends not only on the current set of k input bits, but also on past inputs.
    • The number of input bits that the output depends on is called the "constraint length" K.
    • Frequently, we will see that k = 1
  • 13. Example of Convolutional Code k=1, n=2, K=3 convolutional code
  • 14. Example of Convolutional Code k=2, n=3, K=2 convolutional code
  • 15. Representations of Convolutional Codes
    • Encoder Block Diagram (shown above)
    • Generator Representation
    • Trellis Representation
    • State Diagram Representation
  • 16. Convolutional Code Generators
    • There is one generator vector for each of the n output bits
    • The length of the generator vector for a rate r = k/n code with constraint length K is K
    • The bits in the generator, read from left to right, represent the connections in the encoder circuit: a "1" represents a link from the shift register, a "0" represents no link.
    • Generator vectors are often given in octal representation (a minimal encoder sketch follows)
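    A minimal Python sketch of a generator-driven convolutional encoder for the k=1, n=2, K=3 example. The generator pair (5, 7) in octal, i.e. 101 and 111, is an assumption chosen so the output order matches the encoding example on slide 23; the slides' own figures may label the two generators in the opposite order.

    ```python
    # Rate-1/2, constraint-length-3 convolutional encoder driven by generator
    # vectors. Generators are given in octal; (0o5, 0o7) = (101, 111) is an
    # assumed labelling that reproduces the slide-23 encoding example.
    def conv_encode(bits, generators=(0o5, 0o7), K=3):
        state = [0] * (K - 1)                  # shift-register contents (past inputs)
        out = []
        for u in bits:
            window = [u] + state               # current input plus past K-1 inputs
            for g in generators:
                taps = [(g >> (K - 1 - i)) & 1 for i in range(K)]
                out.append(sum(t * b for t, b in zip(taps, window)) % 2)
            state = [u] + state[:-1]           # shift the new input in
        return out

    print(conv_encode([0, 1, 0, 1, 1, 0, 0]))
    # -> [0,0, 1,1, 0,1, 0,0, 1,0, 1,0, 1,1]
    ```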
  • 17. Example of Convolutional Code k=1, n=2, K=3 convolutional code
  • 18. Example of Convolutional Code k=2, n=3, K=2 convolutional code
  • 19. State Diagram Representation
    • Contents of the shift registers make up the "state" of the code:
      • The most recent input is the most significant bit of the state.
      • The oldest input is the least significant bit of the state.
      • (this convention is sometimes reversed)
    • Arcs connecting states represent allowable transitions
      • Arcs are labeled with the output bits transmitted during the transition (see the transition-table sketch below)
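    A minimal sketch that lists the state diagram of the k=1, n=2, K=3 example as a transition table, one row per arc, using the same assumed (5, 7) octal generators as the encoder sketch above (most recent input = most significant bit of the state, as on this slide):

    ```python
    # State diagram of a rate-1/2, K=3 code, enumerated as a table of arcs.
    # Generators (0o5, 0o7) are the same assumption as in the encoder sketch.
    GENS, K = (0o5, 0o7), 3

    def step(state_bits, u):
        """One transition: returns (next_state_bits, output_bits)."""
        window = [u] + state_bits              # [current, most recent, oldest]
        out = [sum(((g >> (K - 1 - i)) & 1) * b for i, b in enumerate(window)) % 2
               for g in GENS]
        return [u] + state_bits[:-1], out

    for s1 in (0, 1):                          # most significant state bit
        for s0 in (0, 1):                      # least significant state bit
            for u in (0, 1):
                nxt, out = step([s1, s0], u)
                print(f"state {s1}{s0}, input {u} -> "
                      f"state {nxt[0]}{nxt[1]}, output {out[0]}{out[1]}")
    ```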
  • 20. Example of State Diagram Representation of Convolutional Codes k=1, n=2, K=3 convolutional code
  • 21. Trellis Representation of Convolutional Codes
    • The state diagram is "unfolded" as a function of time
    • Time is indicated by movement towards the right
    • Contents of the shift registers make up the "state" of the code:
      • The most recent input is the most significant bit of the state.
      • The oldest input is the least significant bit of the state.
    • Allowable transitions are denoted by connections between states
      • transitions may be labeled with the transmitted bits
  • 22. Example of Trellis Diagram k=1, n=2, K=3 convolutional code
  • 23. Encoding Example Using the Trellis Representation: k=1, n=2, K=3 convolutional code
    • We begin in state 00:
    • Input data: 0 1 0 1 1 0 0
    • Output: 00 11 01 00 10 10 11 (the corresponding state path is listed below)
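    For reference, the state path traced through the trellis for this input (most recent input as the most significant state bit, as on slide 19) is 00 → 00 → 10 → 01 → 10 → 11 → 01 → 00; feeding the same input to the conv_encode sketch given after slide 16 reproduces the output sequence above.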
  • 24. Distance Structure of a Convolutional Code
    • The Hamming distance d(c1, c2) between any two distinct code sequences c1 and c2 is the number of bits in which they differ.
    • The minimum free Hamming distance d_free of a convolutional code is the smallest Hamming distance separating any two distinct code sequences: d_free = min over all c1 ≠ c2 of d(c1, c2) (a brute-force sketch follows).
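    Because the code is linear, d_free equals the minimum Hamming weight over all nonzero code sequences, so a short bounded search gives it directly. A minimal sketch for the k=1, n=2, K=3 example, again assuming the (5, 7) octal generators; it reports d_free = 5 for that code.

    ```python
    from itertools import product

    # Brute-force estimate of d_free for a rate-1/2, K=3 code. By linearity,
    # d_free is the minimum weight of any nonzero terminated code sequence.
    # Generators (0o5, 0o7) are the same assumption as in the earlier sketches.
    GENS, K = (0o5, 0o7), 3

    def encode(bits):
        state, out = [0] * (K - 1), []
        for u in list(bits) + [0] * (K - 1):       # append K-1 zeros to terminate
            window = [u] + state
            out += [sum(((g >> (K - 1 - i)) & 1) * b
                        for i, b in enumerate(window)) % 2 for g in GENS]
            state = [u] + state[:-1]
        return out

    d_free = min(sum(encode(msg))                  # Hamming weight of the codeword
                 for L in range(1, 8)
                 for msg in product((0, 1), repeat=L) if any(msg))
    print("d_free =", d_free)                      # 5 for this code
    ```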
  • 25. Search for Good Codes
    • We would like convolutional codes with large free distance
      • must avoid "catastrophic" codes
    • Generators for the best convolutional codes are generally found via computer search
      • the search is constrained to codes with regular structure
      • the search is simplified because any permutation of identical generators is equivalent
      • the search is simplified because of linearity.
  • 26. Best Rate 1/2 Codes
  • 27. Best Rate 1/3 Codes
  • 28. Best Rate 2/3 Codes
  • 29. Summary of Convolutional Codes
    • Convolutional codes are useful for real-time applications because they can be continuously encoded and decoded
    • We can represent convolutional codes as generators, block diagrams, state diagrams, and trellis diagrams
    • We want to design convolutional codes to maximize free distance while remaining non-catastrophic
    Error Control Coding , © Brian D. Woerner , reproduced by: Erhan A. INCE