# Serially concatenated chaos-based coded modulations

### Transcript

• 1. Serial Concatenation of Channel and Chaotic Encoders. F.J. Escribano, L. López and M.A.F. Sanjuán, Universidad Rey Juan Carlos, Spain. E-mail: francisco.escribano@urjc.es. Dijon, 6-9th June 2006.
• 2. General Setup: Transmitter. The setup is similar to the serial concatenation of channel encoders: an outer encoder, convolutional (rate 1/2), and an inner encoder, chaotic (rate 1). The parameters are: the convolutional encoder, the kind of chaotic encoder (map), Q (the quantization level of the chaotic encoder), N (the size of the interleaver π), and the structure of the permutations in π.
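The serial structure above can be sketched as a composition of three stages. This is only a wiring diagram in code: the three callables are placeholders for the blocks detailed on the following slides, and the function names are ours, not the paper's.

```python
def transmit(bits, outer_encode, interleave, inner_encode):
    """Serial concatenation sketch: outer (convolutional, rate 1/2) encoder,
    interleaver pi, then inner (chaotic, rate 1) encoder.
    The three callables stand in for the blocks described on later slides."""
    coded = outer_encode(bits)    # rate 1/2: the bit stream doubles in length
    permuted = interleave(coded)  # S-random permutation of the coded bits
    return inner_encode(permuted)  # rate 1: one channel sample per coded bit
```

Any concrete encoder/interleaver pair with these shapes can be plugged in; the composition itself is the point of the slide.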
• 3. General Setup: Chaotic Encoder (I). Tent map encoder. Map view: x_n = f(x_{n-1}) = 2 x_{n-1} for 0 ≤ x_{n-1} ≤ 1/2, and x_n = 2 - 2 x_{n-1} for 1/2 ≤ x_{n-1} ≤ 1. Convolutional encoder view: a quantized version driven by small perturbations, x_n = f(x_{n-1}) + g(u_n, x_{n-1}) / 2^{Q+1}, with x_n ∈ {1/2^{Q+1}, ..., (2^{Q+1} - 1)/2^{Q+1}}.
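A minimal sketch of the tent map and one quantized, perturbation-driven step. We assume u_n ∈ {0, 1} and take the complemented perturbation on the second branch of the map, which is our reading of the slides; the helper names are illustrative.

```python
def tent_map(x):
    """Tent map f(x) on [0, 1]."""
    return 2.0 * x if x <= 0.5 else 2.0 - 2.0 * x

def quantized_tent_step(x_prev, u, Q):
    """One encoding step: x_n = f(x_{n-1}) + g(u_n, x_{n-1}) / 2^(Q+1).
    The branch-dependent perturbation g (u on the rising branch, its
    complement on the falling one) is our reading of the slides."""
    g = u if x_prev < 0.5 else (1 - u)
    return tent_map(x_prev) + g / 2.0 ** (Q + 1)
```

With Q = 5 the grid spacing is 1/64, so the perturbation nudges the trajectory by at most one grid step per iteration.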
• 4. General Setup: Chaotic Encoder (II). The chaotic encoding can be seen as the following recursion (convolutional encoding view): r_i = r_{Q+1} ⊕ r_{i-1} for i = Q+1, ..., 2; r_1 = u_n ⊕ r_{Q+1}; x_n = Σ_{i=1}^{Q+1} 2^{-(Q+2-i)} r_i; with g(u_n, x_{n-1}) = u_n for 0 ≤ x_{n-1} < 1/2 and the complement of u_n for 1/2 ≤ x_{n-1} < 1. There are well defined transitions from a starting state to an ending state, driven by an input bit and generating a well defined output symbol. The theory of SISO blocks can thus be applied at the receiver.
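The recursion can be sketched as a shift-register update. From the output weights 2^{-(Q+2-i)} we read r_{Q+1} as the most significant bit of the state; that interpretation, and the function names, are ours.

```python
def chaotic_encode_step(r, u):
    """One step of the tent-map encoder in its shift-register view.
    r is [r_1, ..., r_{Q+1}] with r_{Q+1} the MSB; u is the input bit.
    New bits: r_i = r_{Q+1} XOR r_{i-1} (i = Q+1, ..., 2), r_1 = u XOR r_{Q+1}."""
    msb = r[-1]
    return [u ^ msb] + [msb ^ b for b in r[:-1]]

def state_to_sample(r):
    """Output sample x_n = sum_i 2^{-(Q+2-i)} r_i (r_{Q+1} carries weight 1/2)."""
    Q1 = len(r)  # Q + 1 register bits
    return sum(b * 2.0 ** (-(Q1 + 1 - i)) for i, b in enumerate(r, start=1))
```

When the MSB is 0 this reduces to a left shift with the input bit entering the LSB (i.e. x_n = 2 x_{n-1} + u_n / 2^{Q+1}), matching the rising branch of the tent map.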
• 5. General Setup: Interleaver. An S-random interleaver is a good choice in many cases, and we use it here. It ensures that two adjacent bits in the input word are separated by at least S positions in the output word. Permutation {π_1, ..., π_N}: input bit c_i is written in position π_i, so u_{π_i} = c_i. The permutation of indexes performed by the interleaver is chosen as follows: choose a destination index for the input at position i at random, and keep it if it lies at a distance greater than S from all of the S preceding indexes. To ensure fast convergence: S < √(N/2).
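A sketch of the S-random construction described above. The restart cap is our own safeguard for when the greedy draw gets stuck, not part of the slide.

```python
import random

def s_random_permutation(N, S, rng=random):
    """S-random permutation of size N: each chosen index must differ by more
    than S from the indices placed in the S preceding positions.
    Rule of thumb for fast construction: S < sqrt(N/2)."""
    for _ in range(1000):  # restart cap: our own safeguard
        perm, pool = [], list(range(N))
        rng.shuffle(pool)
        stuck = False
        while pool:
            for k, cand in enumerate(pool):
                if all(abs(cand - p) > S for p in perm[-S:]):
                    perm.append(pool.pop(k))
                    break
            else:
                stuck = True
                break
        if not stuck:
            return perm
    raise ValueError("could not build an S-random permutation; lower S")
```

For the paper's N = 10000, S = 23 satisfies the S < √(N/2) ≈ 70.7 guideline comfortably.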
• 6. General Setup: Decoder. It consists of two SISO (soft input soft output) decoding blocks (working iteratively), adapted according to the 'edges' defined by the respective encoders. The decoders interchange soft information in the form of log-likelihood ratios (LLRs): π(b_n; O) = log [p(b_n = 1 | c_1 c_2 ... c_N) / p(b_n = 0 | c_1 c_2 ... c_N)].
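The LLR and its inverse mapping can be written as a small helper pair. This is a generic sketch of the quantity the SISO blocks exchange, not the decoder itself.

```python
import math

def llr(p1):
    """Log-likelihood ratio of a bit with posterior P(b = 1) = p1:
    L = log(p1 / (1 - p1))."""
    return math.log(p1 / (1.0 - p1))

def prob_from_llr(L):
    """Inverse mapping: P(b = 1) = 1 / (1 + e^{-L})."""
    return 1.0 / (1.0 + math.exp(-L))
```

A positive LLR favours b = 1, a negative one favours b = 0, and L = 0 means total uncertainty; this is why the iterative exchange of LLRs between the two SISOs can be tracked with mutual-information tools such as EXIT charts.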
• 7. General Setup: Channel Model. In the general case, the channel has at least two sources of perturbation: white noise, in the form of additive white Gaussian noise (AWGN), and bandlimited transmission, normally due to the presence of filters in the transmitter (to comply with transmission masks) or in the receiver (to reject undesired out-of-band signals). The signal at the receiver will thus be affected by intersymbol interference (ISI, given by a lowpass filter, LPF) and Gaussian noise.
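A toy version of this channel, modelling the bandlimitation as an FIR lowpass filter followed by AWGN. The tap values passed in are illustrative; the paper does not specify them here.

```python
import random

def channel(samples, taps, sigma):
    """Bandlimited AWGN channel sketch: an FIR lowpass filter (which
    introduces ISI by mixing neighbouring samples) followed by additive
    white Gaussian noise of standard deviation sigma."""
    out = []
    for n in range(len(samples)):
        y = sum(taps[k] * samples[n - k] for k in range(len(taps)) if n - k >= 0)
        out.append(y + random.gauss(0.0, sigma))
    return out
```

With taps = [1.0] and sigma > 0 this reduces to the pure AWGN channel; longer taps smear each chaotic sample over its successors, which is the ISI the receiver must cope with.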
• 8. Analysis: Convergence (I). Convergence of the decoding algorithm can be analysed using EXIT (EXtrinsic Information Transfer) charts, which compare the mutual information of the LLRs at the input and output of each SISO: I = (1/2) Σ_{b=0,1} ∫_{-∞}^{∞} p(x | b) log_2 [2 p(x | b) / (p(x | b = 0) + p(x | b = 1))] dx.
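The mutual information integral can be evaluated numerically. The sketch below assumes the usual consistent-Gaussian model for the LLR densities (mean ±σ²/2, variance σ²), which is a common EXIT-chart convention rather than something stated on the slide.

```python
import math

def gauss_pdf(x, mu, var):
    """Gaussian density with mean mu and variance var."""
    return math.exp(-(x - mu) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def mutual_information(var, lo=-100.0, hi=100.0, steps=20001):
    """Evaluate I = 1/2 * sum_b int p(x|b) log2(2 p(x|b)/(p(x|0)+p(x|1))) dx
    on a grid, under the consistent-Gaussian LLR model:
    p(x|b=1) ~ N(+var/2, var), p(x|b=0) ~ N(-var/2, var)."""
    dx = (hi - lo) / (steps - 1)
    I = 0.0
    for mu in (-var / 2.0, var / 2.0):  # b = 0, then b = 1
        for i in range(steps):
            x = lo + i * dx
            p_b = gauss_pdf(x, mu, var)
            p0 = gauss_pdf(x, -var / 2.0, var)
            p1 = gauss_pdf(x, var / 2.0, var)
            if p_b > 0.0 and (p0 + p1) > 0.0:  # skip underflowed tails
                I += 0.5 * p_b * math.log2(2.0 * p_b / (p0 + p1)) * dx
    return I
```

As the LLR variance grows the densities for b = 0 and b = 1 separate, so I climbs from 0 towards 1; an EXIT chart plots this quantity at the input and output of each SISO across iterations.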
• 9. Analysis: Convergence (II). Inner encoder: tent map chaotic encoder with Q = 5. Outer encoder: non-recursive, non-systematic convolutional encoder, R = 1/2, memory 3, with generator polynomials 1001 and 1101.
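A sketch of this outer encoder, reading the generator polynomials 1001 and 1101 as binary taps with the most significant bit on the current input (our reading of the slide).

```python
def conv_encode(bits, g1=0b1001, g2=0b1101, memory=3):
    """Rate-1/2 non-recursive, non-systematic convolutional encoder,
    memory 3, generator polynomials 1001 and 1101 (binary taps, MSB on the
    current input bit): y1 = u_n XOR u_{n-3}, y2 = u_n XOR u_{n-1} XOR u_{n-3}."""
    state = 0  # the last `memory` input bits, most recent in the top bit
    out = []
    for u in bits:
        reg = (u << memory) | state
        out.append(bin(reg & g1).count("1") % 2)  # first output bit
        out.append(bin(reg & g2).count("1") % 2)  # second output bit
        state = (state >> 1) | (u << (memory - 1))
    return out
```

Feeding in a single 1 followed by zeros reads off the impulse responses of the two generators directly.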
• 10. Simulation results. Same encoders + S-random interleaver, N = 10000, S = 23, 20 iterations. Previous work with g(u_n, x_{n-1}) = u_n (Proceedings of ISSPA, Sydney, 2005, vol. 1, pp. 275-278) showed that no better BER could be expected than with the convolutional encoder alone.
• 11. Conclusions. The additional feedback connection g(u_n, x_{n-1}) determines a dramatic enhancement in BER performance. Standard tools like EXIT charts can be used to analyse the system. There is around a 0.5 dB penalty in the threshold SNR for decoding convergence, as shown by the EXIT chart analysis, due to the finite-length interleaver. The performance does not degrade much in the presence of ISI due to a 10% reduction of the allowed bandwidth. The concatenated convolutional and chaotic system is nonlinear and sends chaotic-like samples to the channel, which is the main difference with respect to using only concatenated convolutional or block encoders, or trellis coded modulation. This could be an advantage in specific environments.
• 12. Future work. Make an exhaustive study of other possible encoding structures, based upon other chaotic maps, and try to find general properties and design criteria. Verify the influence of the design parameters (S, N, Q…). Include other kinds of channels (e.g. Rayleigh fading channels) and verify the robustness and suitability of the system both theoretically and by simulation. Look for possible applications of this nonlinear, high-rate, well-performing scheme. Try to exploit the chaotic characteristics of the system in analysis and performance.