Correlative-Level Coding


  1. Baseband Pulse Transmission: Correlative-Level Coding, Baseband M-ary PAM Transmission, Tapped-Delay-Line Equalization, Eye Pattern. Hyeong-Seok Yu, Vada Lab., gargoyle@vada1.skku.ac.kr
  2. Correlative-Level Coding. Correlative-level coding (partial-response signaling) adds ISI to the transmitted signal in a controlled manner. Since the ISI introduced into the transmitted signal is known, its effect can be interpreted at the receiver. It is a practical method of achieving the theoretical maximum signaling rate of 2W symbols per second in a bandwidth of W hertz, using realizable and perturbation-tolerant filters.
  3. Correlative-Level Coding: Duobinary Signaling. "Duo" signifies the doubling of the transmission capacity of a straight binary system. The binary input sequence $\{b_k\}$ consists of uncorrelated binary symbols 1 and 0. The two-level sequence and the duobinary coder output are
     $a_k = \begin{cases} +1, & \text{if symbol } b_k \text{ is } 1 \\ -1, & \text{if symbol } b_k \text{ is } 0 \end{cases} \qquad c_k = a_k + a_{k-1}$
  4. Correlative-Level Coding: Duobinary Signaling. Over an ideal Nyquist channel of bandwidth $W = 1/2T_b$,
     $H_I(f) = H_{\mathrm{Nyquist}}(f)\,[1 + \exp(-j2\pi f T_b)] = H_{\mathrm{Nyquist}}(f)\,[\exp(j\pi f T_b) + \exp(-j\pi f T_b)]\exp(-j\pi f T_b) = 2 H_{\mathrm{Nyquist}}(f)\cos(\pi f T_b)\exp(-j\pi f T_b)$
     $H_{\mathrm{Nyquist}}(f) = \begin{cases} 1, & |f| \le 1/2T_b \\ 0, & \text{otherwise} \end{cases} \qquad H_I(f) = \begin{cases} 2\cos(\pi f T_b)\exp(-j\pi f T_b), & |f| \le 1/2T_b \\ 0, & \text{otherwise} \end{cases}$
     $h_I(t) = \dfrac{\sin(\pi t/T_b)}{\pi t/T_b} + \dfrac{\sin[\pi(t - T_b)/T_b]}{\pi(t - T_b)/T_b} = \dfrac{T_b^2 \sin(\pi t/T_b)}{\pi t\,(T_b - t)}$
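     A minimal numerical check of the duobinary impulse response above (a sketch assuming normalized $T_b = 1$; note that np.sinc(x) computes sin(pi x)/(pi x)):

        import numpy as np

        Tb = 1.0                                      # bit duration (normalized, assumed)
        t = np.linspace(-10 * Tb, 10 * Tb, 2001)

        # Sum-of-two-sincs form: h_I(t) = sinc(t/Tb) + sinc((t - Tb)/Tb)
        h_sum = np.sinc(t / Tb) + np.sinc((t - Tb) / Tb)

        # Closed form: h_I(t) = Tb^2 sin(pi t / Tb) / (pi t (Tb - t))
        with np.errstate(divide="ignore", invalid="ignore"):
            h_closed = Tb**2 * np.sin(np.pi * t / Tb) / (np.pi * t * (Tb - t))
        h_closed = np.where(np.isfinite(h_closed), h_closed, h_sum)  # patch t = 0, Tb

        assert np.allclose(h_sum, h_closed)
        # Tails decay as 1/t^2, faster than the 1/t decay of a single sinc pulse.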
  5. Correlative-Level Coding: Duobinary Signaling. The tails of $h_I(t)$ decay as $1/|t|^2$, a faster rate of decay than the $1/|t|$ encountered in the ideal Nyquist channel. Let $\hat{a}_k$ represent the estimate of the original pulse $a_k$ as conceived by the receiver at time $t = kT_b$; then $\hat{a}_k = c_k - \hat{a}_{k-1}$. Decision feedback: the technique of using a stored estimate of the previous symbol. Its drawback is error propagation: once errors are made, they tend to propagate through the output. Precoding, applied before the duobinary coding, is a practical means of avoiding the error-propagation phenomenon.
  6. Correlative-Level Coding: Duobinary Signaling. Precoding: $d_k = b_k \oplus d_{k-1}$, i.e. $d_k$ is symbol 1 if either symbol $b_k$ or symbol $d_{k-1}$ (but not both) is 1, and symbol 0 otherwise. The sequence $\{d_k\}$ is applied to a pulse-amplitude modulator, producing a corresponding two-level sequence of short pulses $\{a_k\}$, where $a_k = +1$ or $-1$ as before. The coder output $c_k = a_k + a_{k-1}$ then satisfies $c_k = 0$ if data symbol $b_k$ is 1 and $c_k = \pm 2$ if data symbol $b_k$ is 0.
  7. Correlative-Level Coding: Duobinary Signaling. Receiver decision rule: if $|c_k| < 1$, say symbol $b_k$ is 1; if $|c_k| > 1$, say symbol $b_k$ is 0; if $|c_k| = 1$, make a random guess in favor of symbol 1 or 0.
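     A minimal end-to-end sketch of the precoded duobinary chain of slides 6 and 7 (symbol level only, no pulse shaping or noise; the initial reference bit $d_{-1} = 0$ and the sequence length are assumptions):

        import numpy as np

        rng = np.random.default_rng(0)
        b = rng.integers(0, 2, size=20)               # binary input sequence {b_k}

        # Precoding: d_k = b_k XOR d_{k-1}  (avoids error propagation)
        d = np.zeros(len(b), dtype=int)
        prev = 0                                      # assumed initial bit d_{-1}
        for k, bk in enumerate(b):
            d[k] = bk ^ prev
            prev = d[k]

        a = 2 * d - 1                                 # PAM levels: 0 -> -1, 1 -> +1
        c = a + np.concatenate(([-1], a[:-1]))        # duobinary coder c_k = a_k + a_{k-1}

        # Decision rule: |c_k| < 1 -> symbol 1, |c_k| > 1 -> symbol 0
        b_hat = (np.abs(c) < 1).astype(int)
        assert np.array_equal(b_hat, b)               # recovered without memory of past symbols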
  8. Correlative-Level Coding: Modified Duobinary Signaling. The duobinary spectrum is nonzero at the origin, which is undesirable. Modified duobinary subtracts amplitude-modulated pulses spaced $2T_b$ seconds apart: $c_k = a_k - a_{k-2}$.
     $H_{IV}(f) = H_{\mathrm{Nyquist}}(f)\,[1 - \exp(-j4\pi f T_b)] = 2j H_{\mathrm{Nyquist}}(f)\sin(2\pi f T_b)\exp(-j2\pi f T_b)$
     $H_{IV}(f) = \begin{cases} 2j\sin(2\pi f T_b)\exp(-j2\pi f T_b), & |f| \le 1/2T_b \\ 0, & \text{elsewhere} \end{cases}$
     $h_{IV}(t) = \dfrac{\sin(\pi t/T_b)}{\pi t/T_b} - \dfrac{\sin[\pi(t - 2T_b)/T_b]}{\pi(t - 2T_b)/T_b} = \dfrac{2T_b^2 \sin(\pi t/T_b)}{\pi t\,(2T_b - t)}$
  9. Correlative-Level Coding: Modified Duobinary Signaling. Precoding: $d_k = b_k \oplus d_{k-2}$, i.e. $d_k$ is symbol 1 if either symbol $b_k$ or symbol $d_{k-2}$ (but not both) is 1, and symbol 0 otherwise.
  10. Correlative-Level Coding: Modified Duobinary Signaling. Receiver decision rule: if $|c_k| > 1$, say symbol $b_k$ is 1; if $|c_k| < 1$, say symbol $b_k$ is 0; if $|c_k| = 1$, make a random guess in favor of symbol 1 or 0.
  11. Correlative-Level Coding: Generalized Form of Correlative-Level Coding. The partial-response classes are characterized by their tap weights:

     Class | N | w0 | w1 | w2 | w3 | w4 | Comments
     I     | 2 |  1 |  1 |    |    |    | Duobinary
     II    | 3 |  1 |  2 |  1 |    |    |
     III   | 3 |  2 |  1 | -1 |    |    |
     IV    | 3 |  1 |  0 | -1 |    |    | Modified duobinary
     V     | 5 | -1 |  0 |  2 |  0 | -1 |

     The overall impulse response is $h(t) = \sum_{n=0}^{N-1} w_n\,\mathrm{sinc}\!\left(\dfrac{t}{T_b} - n\right)$.
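     A small sketch (assuming normalized $T_b = 1$) that builds the generalized partial-response pulse from the tap weights in the table above and checks that its samples at $t = nT_b$ reproduce the weights, i.e. the controlled ISI:

        import numpy as np

        Tb = 1.0
        classes = {                                   # tap weights w_0 ... w_{N-1}
            "I (duobinary)":           [1, 1],
            "II":                      [1, 2, 1],
            "III":                     [2, 1, -1],
            "IV (modified duobinary)": [1, 0, -1],
            "V":                       [-1, 0, 2, 0, -1],
        }

        t = np.arange(-5, 10, 1.0) * Tb               # integer multiples of Tb; index 5 is t = 0

        for name, w in classes.items():
            # h(t) = sum_n w_n * sinc(t/Tb - n)
            h = sum(wn * np.sinc(t / Tb - n) for n, wn in enumerate(w))
            assert np.allclose(h[5:5 + len(w)], w), name   # h(nTb) = w_n, zero elsewhere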
  12. Baseband M-ary PAM Transmission. The transmitter produces one of M possible amplitude levels. T: symbol duration; 1/T: signaling rate in symbols per second (bauds). Each symbol carries $\log_2 M$ bits, so with $T_b$ the bit duration of the equivalent binary PAM, $T = T_b \log_2 M$. To realize the same average probability of symbol error, the transmitted power must be increased by a factor of $M^2/\log_2 M$ compared to binary PAM.
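     A worked example of these relations (the values M = 8 and $T_b$ = 1 microsecond are assumed for illustration):

        import math

        M, Tb = 8, 1e-6                          # 8-ary PAM, 1 us bit duration (assumed)
        bits_per_symbol = math.log2(M)           # 3 bits per symbol
        T = Tb * bits_per_symbol                 # symbol duration T = Tb * log2(M) = 3 us
        baud = 1 / T                             # ~333 kbaud for the same 1 Mb/s bit rate
        power_penalty = M**2 / bits_per_symbol   # ~21.3x (about 13.3 dB) versus binary PAM
        print(T, baud, 10 * math.log10(power_penalty))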
  13. Tapped-Delay-Line Equalization. The approach to high-speed transmission combines two basic signal-processing operations: discrete PAM and a linear modulation scheme. The number of detectable amplitude levels is often limited by ISI, and residual distortion due to ISI is the limiting factor on the data rate of the system.
  14. Tapped-Delay-Line Equalization. Equalization compensates for the residual distortion; the equalizer is a filter. A device well suited to the design of a linear equalizer is the tapped-delay-line filter. With the total number of taps chosen to be $(2N+1)$, its impulse response is $h(t) = \sum_{k=-N}^{N} w_k\,\delta(t - kT)$.
  15. Tapped-Delay-Line Equalization. $p(t)$ is the convolution of $c(t)$ and $h(t)$:
     $p(t) = c(t) * h(t) = c(t) * \sum_{k=-N}^{N} w_k\,\delta(t - kT) = \sum_{k=-N}^{N} w_k\,c(t)*\delta(t - kT) = \sum_{k=-N}^{N} w_k\,c(t - kT)$
     Sampling at $t = nT$ gives the discrete convolution sum $p(nT) = \sum_{k=-N}^{N} w_k\,c((n-k)T)$.
  16. Tapped-Delay-Line Equalization. The Nyquist criterion for distortionless transmission, with $T$ used in place of $T_b$ and the normalized condition $p(0) = 1$, requires
     $p(nT) = \begin{cases} 1, & n = 0 \\ 0, & n = \pm 1, \pm 2, \dots, \pm N \end{cases}$
     Zero-forcing equalizer: optimum in the sense that it minimizes the peak distortion (ISI), i.e. the worst case; simple to implement; and the longer the equalizer, the more closely it approximates the ideal condition for distortionless transmission.
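     A minimal zero-forcing sketch under the conditions above: the $(2N+1)$ requirements $p(nT) = \delta_{n0}$ for $|n| \le N$ form a linear system in the tap weights. The sampled channel values below are an assumed illustrative example, not from the slides:

        import numpy as np

        # Assumed sampled channel response c(nT) around its main tap
        c = {-2: 0.02, -1: -0.1, 0: 1.0, 1: 0.2, 2: -0.05}
        N = 2                                          # equalizer has 2N + 1 = 5 taps

        ks = range(-N, N + 1)
        # A[n, k] = c((n - k)T); solve A w = e_0 so that p(nT) = delta_{n,0} for |n| <= N
        A = np.array([[c.get(n - k, 0.0) for k in ks] for n in ks])
        e0 = np.zeros(2 * N + 1)
        e0[N] = 1.0
        w = np.linalg.solve(A, e0)                     # zero-forcing tap weights

        p = A @ w                                      # equalized samples p(nT), |n| <= N
        assert np.allclose(p, e0)                      # ISI forced to zero at these sample points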
  17. Adaptive Equalizer. The channel is usually time-varying, owing to differences in the transmission characteristics of the individual links that may be switched together and to differences in the number of links in a connection. Adaptive equalization: the equalizer adjusts itself by operating on the input signal. Training sequence: precall equalization, assuming the channel changes little during an average data call. Prechannel equalization requires a feedback channel; postchannel equalization is performed at the receiver. Synchronous: the tap spacing is the same as the symbol duration of the transmitted signal.
  18. Adaptive Equalizer: Least-Mean-Square Algorithm. Adaptation may be achieved by observing the error between the desired pulse shape and the actual pulse shape, and using this error to estimate the direction in which the tap weights should be changed. The mean-square error criterion is more general in application and less sensitive to timing perturbations. With $a_n$ the desired response, $y_n$ the actual response, and $e_n$ the error signal, the mean-square error is defined by the cost function $\varepsilon = E[e_n^2]$.
  19. Adaptive Equalizer: Least-Mean-Square Algorithm. The gradient of the cost function is an ensemble-averaged cross-correlation:
     $\dfrac{\partial \varepsilon}{\partial w_k} = 2E\!\left[e_n \dfrac{\partial e_n}{\partial w_k}\right] = -2E\!\left[e_n \dfrac{\partial y_n}{\partial w_k}\right] = -2E[e_n x_{n-k}] = -2R_{ex}(k), \qquad R_{ex}(k) = E[e_n x_{n-k}]$
     The optimality condition for minimum mean-square error is $\dfrac{\partial \varepsilon}{\partial w_k} = 0$ for $k = 0, \pm 1, \dots, \pm N$.
  20. Adaptive Equalizer: Least-Mean-Square Algorithm. The mean-square error is a second-order, parabolic function of the tap weights: a multidimensional bowl-shaped surface. The adaptive process makes successive adjustments of the tap weights, seeking the bottom of the bowl (the minimum value $\varepsilon_{\min}$). Steepest-descent algorithm: the successive adjustments to the tap weights are in the direction opposite to the gradient vector $\partial\varepsilon/\partial w_k$. The recursive formula ($\mu$: step-size parameter) is
     $w_k(n+1) = w_k(n) - \dfrac{1}{2}\mu \dfrac{\partial \varepsilon}{\partial w_k} = w_k(n) + \mu R_{ex}(k), \qquad k = 0, \pm 1, \dots, \pm N$
  21. Adaptive Equalizer: Least-Mean-Square Algorithm. The steepest-descent algorithm is not available in an unknown environment, so the LMS algorithm approximates it using instantaneous estimates:
     $\hat{R}_{ex}(k) = e_n x_{n-k}, \qquad \hat{w}_k(n+1) = \hat{w}_k(n) + \mu\, e_n x_{n-k}$
     LMS is a feedback system; in the case of small $\mu$, its behavior is roughly similar to that of the steepest-descent algorithm.
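     A compact LMS training-mode sketch in the spirit of slides 18 through 21 (the channel taps, step size, equalizer length, and decision delay below are all assumptions chosen for illustration):

        import numpy as np

        rng = np.random.default_rng(1)
        a = rng.choice([-1.0, 1.0], size=5000)        # known training symbols (desired response)
        channel = np.array([0.1, 1.0, 0.25, -0.1])    # assumed dispersive channel
        x = np.convolve(a, channel)[:len(a)]          # received (ISI-corrupted) samples

        N = 5                                         # 2N + 1 = 11 taps
        w = np.zeros(2 * N + 1)
        mu = 0.01                                     # step-size parameter
        delay = N + 1                                 # assumed overall decision delay

        for n in range(2 * N, len(a)):
            x_vec = x[n - 2 * N : n + 1][::-1]        # tap inputs x_n, ..., x_{n-2N}
            y_n = w @ x_vec                           # equalizer output
            e_n = a[n - delay] - y_n                  # error vs delayed training symbol
            w += mu * e_n * x_vec                     # LMS update: w_k += mu * e_n * x_{n-k}

        # After training, hard decisions sign(w @ x_vec) should track the transmitted symbols.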
  22. Adaptive Equalizer: Operation of the Equalizer. Training mode: a known sequence is transmitted and a synchronized version of it is generated in the receiver; the training sequence used is a so-called pseudo-noise (PN) sequence. Decision-directed mode: entered after the training sequence is completed; it tracks relatively slow variations in the channel characteristics. A large $\mu$ gives fast tracking but excess mean-square error.
  23. Adaptive Equalizer: Implementation Approaches. Analog: CCD-based; tap weights are stored in digital memory, with analog sampling and multiplication; used when the symbol rate is too high for digital processing. Digital: samples are quantized and stored in a shift register; tap weights are stored in shift registers, with digital multiplication. Programmable digital: microprocessor-based; offers flexibility, and the same hardware may be time-shared.
  24. Adaptive Equalizer: Decision-Feedback Equalization. With baseband channel impulse response $\{h_n\}$ and input $\{x_n\}$,
     $y_n = \sum_k h_k x_{n-k} = h_0 x_n + \sum_{k<0} h_k x_{n-k} + \sum_{k>0} h_k x_{n-k}$
     The idea is to use data decisions made on the basis of the precursors to take care of the postcursors; for this to work, the decisions would obviously have to be correct.
  25. Adaptive Equalizer: Decision-Feedback Equalization. Feedforward section: a tapped-delay-line equalizer. Feedback section: its decisions are made on previously detected symbols of the input sequence; the decision device makes the feedback loop nonlinear. With
     $c_n = \begin{bmatrix} \hat{w}_n^{(1)} \\ \hat{w}_n^{(2)} \end{bmatrix}, \qquad v_n = \begin{bmatrix} x_n \\ \hat{a}_n \end{bmatrix}, \qquad e_n = a_n - c_n^T v_n$
     the LMS updates of the two sections are $\hat{w}_{n+1}^{(1)} = \hat{w}_n^{(1)} + \mu_1 e_n x_n$ and $\hat{w}_{n+1}^{(2)} = \hat{w}_n^{(2)} + \mu_1 e_n \hat{a}_n$.
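     A symbol-spaced DFE sketch following the structure above: feedforward taps over received samples, feedback taps over past decisions, joint LMS update. The channel, tap counts, and step size are assumed for illustration, and training symbols stand in for the past decisions:

        import numpy as np

        rng = np.random.default_rng(2)
        a = rng.choice([-1.0, 1.0], size=5000)        # transmitted symbols
        channel = np.array([1.0, 0.5, -0.3])          # assumed channel: postcursors at k > 0
        x = np.convolve(a, channel)[:len(a)]          # received samples

        Nf, Nb = 6, 2                                 # feedforward / feedback tap counts
        w1 = np.zeros(Nf)                             # feedforward weights
        w2 = np.zeros(Nb)                             # feedback weights
        mu1 = 0.01

        for n in range(Nf, len(a)):
            x_vec = x[n - Nf + 1 : n + 1][::-1]       # current and past received samples
            a_fb = a[n - 1 : n - 1 - Nb : -1]         # past (assumed correct) decisions
            y = w1 @ x_vec + w2 @ a_fb                # DFE output c_n^T v_n
            e = a[n] - y                              # error e_n = a_n - c_n^T v_n
            w1 += mu1 * e * x_vec                     # feedforward LMS update
            w2 += mu1 * e * a_fb                      # feedback LMS update cancels postcursors

        # In decision-directed operation, sign(y) would replace the training symbols in a_fb.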
  26. Eye Pattern. An experimental tool for evaluating ISI in an insightful manner: the synchronized superposition of all realizations of the signal of interest viewed within a particular signaling interval. The eye opening is the interior region of the eye pattern. In the case of an M-ary system, the eye pattern contains (M - 1) eye openings, where M is the number of discrete amplitude levels.
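     A minimal sketch of how an eye pattern is formed: segments of the shaped waveform, two symbol intervals long, are overlaid with symbol synchronization. The two-level PAM source and the Hanning window used as a stand-in for a Nyquist pulse are assumptions for illustration:

        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(3)
        sps = 16                                      # samples per symbol (assumed)
        a = rng.choice([-1.0, 1.0], size=400)         # binary PAM symbols

        # Simple pulse shaping: upsample and filter with a window (stand-in Nyquist pulse)
        upsampled = np.zeros(len(a) * sps)
        upsampled[::sps] = a
        s = np.convolve(upsampled, np.hanning(4 * sps), mode="same")

        # Overlay segments spanning two symbol intervals -> eye pattern
        span = 2 * sps
        for start in range(10 * sps, len(s) - span, sps):
            plt.plot(np.arange(span) / sps, s[start:start + span], color="b", alpha=0.1)
        plt.xlabel("time (symbol intervals)")
        plt.title("Eye pattern (M = 2: one eye opening)")
        plt.show()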
  27. Eye Pattern (figure only).
