Most algebraic hard-decision decoders have similar complexity, roughly O(d^2)
Recording channels are perhaps the most important application!
The two forms of representation are equivalent!
Basic properties
The motivation for an efficient SISO decoder for RS codes
Outline
Generalized Minimum Distance (GMD) decoding (Forney 1966); Chase decoding (Chase 1972); related works
Algebraic beyond-half-dmin decoding, which is the kernel of the KV algorithm
Outline: 1) curve fitting! 2) multiplicity assignment: let Q(x, y) pass through (alpha, beta) m times 3) the larger m, the larger the decoding radius 4) when beta falls within the decoding radius, (y - f(x)) divides Q(x, y) 5) smart ways to construct Q(x, y) and factorize it 6) when soft information is available, we can do weighted multiplicity assignment
KV: 1) performance analysis becomes interesting 2) no longer a fixed decoding radius 3) only a sufficient condition
KV 1) failure is interesting, can we do better?
Life becomes much easier when we go to bit level
When turbo codes arrived, people realized that convolutional codes are inherently bad codes; but what about algebraic codes?
Asymptotically optimal, but practically they suffer a loss!
More significant difference
Motivation for bit level decomposition
Maximum-likelihood SISO decoding and variations: trellis-based decoding using the binary image expansions of RS codes over GF(2^m) (Vardy & Be'ery 1991); reduced-complexity version (Ponnampalam & Vucetic 2002); SISO (Ponnampalam & Grant 2003)
1) BCH subcodes and glue vectors 2) Useful for constructing sparse parity check matrices for short codes
Efficient for general linear block codes
The nice property of LDPC codes is that the parity check matrix is sparse; thus, the probability that two erased bits participate in another check diminishes.
Even in the optimistic case, the erasure channel, it won't give good results.
It is impossible to have a sparse representation of RS codes
Iterative decoding for general linear block codes (tough problem!)
Since the parity check matrix is of full rank, we are guaranteed to be able to sparsify the columns of the (n-k) least reliable bits (LRBs)
Define the syndrome as the binary sum of all participating bits. Define the soft syndrome as the product of the channel reliabilities. J is minimized iff the decoding converges to a valid codeword.
A variation of standard gradient descent
Think about the erasure channel. This does happen: J is not minimized, but we get a zero gradient!
Not going deep into the details!
KV: 0.65 dB; BMA: 1 dB. With extremely high complexity, we gain 1.6 dB at FER = 10^-4.
In practice, there is a huge difference!
Case study!
Coding gain may shrink in practical systems
There is no really successful bit-level soft decoder that can handle codes with long constraint lengths
Acknowledgements
Soft Decision Decoding Algorithms of Reed-Solomon Codes
Historical Review of Reed Solomon Codes
Date of birth: 40 years ago (Reed and Solomon 1960)
Related to non-binary BCH codes (Gorenstein and Zierler 1961)
Efficient decoder: not until 6 years later (Berlekamp 1967)
Linear feedback shift register (LFSR) interpretation (Massey 1969)
Other algebraic hard decision decoders:
  Euclid's algorithm (Sugiyama et al. 1975)
  Frequency-domain decoding (Gore 1973 and Blahut 1979)
Wide Range of Applications of Reed Solomon Codes
NASA Deep Space: CC + RS(255, 223, 32)
Multimedia Storage:
  CD: RS(32, 28, 4), RS(28, 24, 4) with interleaving
  DVD: RS(208, 192, 16), RS(182, 172, 10) product code
Digital Video Broadcasting: DVB-T CC + RS(204, 188)
Magnetic Recording: RS(255, 239) etc. (nested RS code)
Basic Properties of Reed Solomon Codes
RS(N, K) defined over GF(q), q = 2^m, where N = 2^m - 1
(1) Polynomial evaluation form: c = (x_1, x_2, ..., x_N) = (f(α_0), f(α_1), ..., f(α_{N-1})), where f(x) = f_0 + f_1 x + ... + f_{K-1} x^{K-1} and the α_i are distinct nonzero elements in GF(q)
(2) Generator polynomial form: c(x) = g(x) m(x), where g(x) = ∏_{i=b}^{b+2t-1} (x - α^i)
The parity check matrix:
  H = [ 1  α^b          ...  α^{(N-1)b}
        ...
        1  α^{b+2t-1}   ...  α^{(N-1)(b+2t-1)} ]
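The polynomial-evaluation form can be sketched in a few lines of Python. This is a toy RS(7, 5) encoder over GF(8); the choice of primitive polynomial (x^3 + x + 1) and the log/antilog-table field arithmetic are illustrative assumptions, not part of the slides.

```python
# Toy RS(7,5) encoder over GF(2^3) via polynomial evaluation.
# Assumption: primitive polynomial x^3 + x + 1 for GF(8).
PRIM = 0b1011
exp = [0] * 14
log = [0] * 8
x = 1
for i in range(7):              # fill exp/log tables for the 7 nonzero elements
    exp[i] = x
    log[x] = i
    x <<= 1
    if x & 0b1000:
        x ^= PRIM
for i in range(7, 14):          # extend exp so log sums never overflow
    exp[i] = exp[i - 7]

def gf_mul(a, b):
    return 0 if a == 0 or b == 0 else exp[log[a] + log[b]]

def rs_encode(msg):
    """Evaluate f(x) = msg[0] + msg[1] x + ... at the 7 nonzero field elements."""
    cw = []
    for i in range(7):
        pt = exp[i]             # evaluation point alpha^i
        acc, p = 0, 1
        for m in msg:
            acc ^= gf_mul(m, p)
            p = gf_mul(p, pt)
        cw.append(acc)
    return cw

c1 = rs_encode([1, 2, 3, 4, 5])
c2 = rs_encode([1, 2, 3, 4, 6])
dist = sum(a != b for a, b in zip(c1, c2))
print(dist)  # MDS property: distinct codewords differ in >= N-K+1 = 3 places
```

Since the two messages differ only in one coefficient, the difference codeword is the evaluation of a nonzero monomial, which never vanishes at a nonzero point, so these two codewords in fact differ everywhere.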
Motivation for RS Soft Decision Decoder
Hard decision decoding does not fully exploit the decoding capability
Efficient soft decision decoding of RS codes remains an open problem
RS Coded Turbo Equalization System:
[Block diagram: source → RS encoder → interleaver Π → PR encoder → AWGN channel; at the receiver, the channel equalizer and the RS decoder exchange a priori/extrinsic information through the interleaver Π and de-interleaver Π^{-1}; hard decisions go to the sink]
A soft input soft output (SISO) algorithm is favorable
Presentation Outline
Iterative decoding for RS codes
Symbol-level algebraic soft decision decoding
Simulation results
Binary expansion of RS codes and soft decoding algorithms
Applications and future works
Reliability Assisted Hard Decision Decoding
Generalized Minimum Distance (GMD) Decoding (Forney 1966):
  New distance measure: generalized minimum distance
  Successively erase the least reliable symbols and run the hard decision decoder
  GMD is shown to be asymptotically optimal
Chase Type-II decoding (Chase 1972):
  Exhaustively flip the least reliable symbols and run the hard decision decoder
  The Chase algorithm is also shown to be asymptotically optimal
Related works:
  Fast GMD (Koetter 1996)
  Efficient Chase (Kamiya 2001)
  Combined Chase and GMD for RS codes (Tang et al. 2001)
Performance analysis of these algorithms for RS codes seems still open
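The Chase type-II test-pattern generation above can be sketched directly: pick the t least reliable positions and enumerate all 2^t flip patterns, each of which would be handed to a hard decision decoder. The function name and the choice of |LLR| as the reliability measure are illustrative assumptions.

```python
from itertools import combinations

def chase2_patterns(reliabilities, t):
    """Chase type-II: find the t least reliable positions and return all
    2^t subsets of them; each subset is a set of bits to flip before HDD."""
    lrp = sorted(range(len(reliabilities)),
                 key=lambda i: abs(reliabilities[i]))[:t]
    patterns = []
    for r in range(t + 1):
        patterns.extend(combinations(lrp, r))
    return patterns

pats = chase2_patterns([0.9, -0.1, 1.2, 0.05, -0.7], t=2)
print(len(pats))  # 2^2 = 4 test patterns over the two least reliable bits
```

The exponential growth in t is exactly why Chase decoding is run only over a handful of least reliable positions.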
Algebraic Beyond Half dmin List Decoding
Bounded distance + 1 decoding (Berlekamp 1996)
Beyond dmin/2 decoding for low rate RS codes (Sudan 1997)
Decoding up to N - √(NK) errors (Guruswami and Sudan 1999)
A good tutorial paper (JPL Report, McEliece 2003)
Outline of Algebraic Beyond Half Distance Decoding
Decoding:
  Transmitted codeword: (f(α_1), f(α_2), ..., f(α_N)); received vector: (β_1, β_2, ..., β_N)
  Basic idea: find f(x) which fits as many pairs (α_i, β_i) as possible
Interpolation step:
  Construct a bivariate polynomial Q(x, y) of minimum (1, K-1)-weighted degree, which has a zero of order m at each (α_l, β_l), l = 1, ..., N, i.e., Q(x + α_l, y + β_l) involves no term of degree less than m
Factorization step:
  Generate a list of y-roots, i.e., L = {f(x) ∈ F[x] : (y - f(x)) | Q(x, y), deg(f(x)) < K}
  Pick the most likely codeword f̂(x) from the list L
Complexity:
  Interpolation (Koetter's fast algorithm): O(N^2 m^4)
  Factorization (Roth and Ruckenstein's algorithm): O(NK)
Algebraic Soft Interpolation Based List Decoding
Koetter and Vardy algorithm (Koetter & Vardy 2003):
  Based on Guruswami and Sudan's algebraic list decoding
  Uses the reliability information to assign multiplicities
  KV is optimal in multiplicity assignment for long RS codes
Reduced complexity KV (Gross et al. submitted 2003):
  Re-encoding technique: largely reduces the cost for high rate codes
VLSI architecture (Ahmed et al. submitted 2003)
Soft Interpolation Based Decoding
Basic idea: interpolate more symbols using the soft information
Definitions:
  Reliability matrix: Π (q × N)
  Multiplicity matrix: M = g(Π), with entries m_{i,j}
  Score: S_M(c) = ⟨M, c⟩, the sum of the multiplicities at the codeword positions
  Cost: C(M) = (1/2) Σ_{i=1}^{q} Σ_{j=1}^{N} m_{i,j}(m_{i,j} + 1)
The interpolation and factorization are the same as in the GS algorithm
Sufficient condition for successful decoding: S_M(c) ≥ √(2(K-1) C(M))
The complexity increases with the maximum multiplicity
Recent Works and Remarks
The ultimate gain of algebraic soft decoding (ASD) over the AWGN channel is about 1 dB
Complexity is scalable but prohibitively huge for large multiplicity
The failure pattern of the ASD algorithm and the optimal multiplicity assignment scheme are of interest
Recent works on performance analysis and multiplicity assignment:
  Gaussian approximation (Parvaresh and Vardy 2003)
  Exponential bound (Ratnakar and Koetter 2004)
  Chernoff bound (El-Khamy and McEliece 2004)
  Performance analysis over BEC and BSC (Jiang and Narayanan 2005)
Performance Analysis of ASD over Discrete Alphabet Channels
Performance analysis over BEC and BSC (Jiang and Narayanan, accepted by ISIT 2005)
The analysis gives some intuition about the decoding radius of ASD
We investigate the bit-level decoding radius for high rate codes
For the BEC, the bit-level radius is twice as large as that of the BM algorithm
For the BSC, the bit-level radius is slightly larger than that of the BM algorithm
In conclusion, ASD is limited by its algebraic engine
Binary Image Expansion of RS Codes and Soft Decision Decoding
Binary Image Expansion of RS Codes over GF(2^m)
It is known that every x ∈ GF(2^m) can be expressed as an m-dimensional binary vector.
Consequently, RS(N, K) over GF(2^m) has a binary expansion RS_b(Nm, Km):
  C_{1×N} = [c_0, c_1, ..., c_{N-1}]  →  C_b = [c_0^(0), c_0^(1), ..., c_0^(m-1), ..., c_{N-1}^(0), ..., c_{N-1}^(m-1)]
Likewise, each entry H_{i,j} of the (N-K) × N parity check matrix expands into an m × m binary block, giving the (N-K)m × Nm binary parity check matrix H_b with entries h_{i,j}.
Bit-level Weight Enumerator
"The major drawback with RS codes (for satellite use) is that the present generation of decoders do not make full use of bit-based soft decision information" (Berlekamp)
How does the binary expansion of RS codes perform under ML decoding?
Performance analysis using its weight enumerator
Averaged ensemble weight enumerator of RS codes (Retter 1991)
It gives some idea about how RS codes perform under ML decoding
Remarks
RS codes themselves are good codes
However, ML decoding is NP-hard (Guruswami and Vardy 2004)
Are there sub-optimal decoding algorithms using the binary expansions?
Trellis based Decoding using BCH Subcode Expansion
Maximum-likelihood decoding and variations:
  Partition RS codes into BCH subcodes and glue vectors (Vardy and Be'ery 1991)
  Reduced complexity version (Ponnampalam and Vucetic 2002)
  Soft input soft output version (Ponnampalam and Grant 2003)
Subfield Subcode Decomposition
The generator matrix decomposes into BCH subcodes and glue vectors:
  G̃_b = [ G_B    0      0      0
          0      G_B    0      0
          0      0      G_B    0
          0      0      0      G_B
          G_glue1 G_glue2 G_glue3 G_glue4 ]
Remarks:
  The decomposition greatly reduces the trellis size for short codes
  Impractical for long codes, since the size of the glue vectors is very large
Related work:
  Construct sparse representations for iterative decoding (Milenkovic and Vasic 2004)
  Subspace subcodes of Reed Solomon codes (Hattori et al. 1998)
Reliability based Ordered Statistic Decoding
Reliability based decoding:
  Ordered Statistic Decoding (OSD) (Fossorier and Lin 1995)
  Box and Match Algorithm (BMA) (Valembois and Fossorier 2004)
  Ordered Statistic Decoding using preprocessing (Wu et al. 2004)
Basic ideas:
  Order the received bits according to their reliabilities
  Perform hard decision reprocessing based on the most reliable basis (MRB)
Remarks:
  The reliability based scheme is efficient for short to medium length codes
  The complexity increases exponentially with the reprocessing order
  The BMA algorithm trades memory for time complexity
A Quick Question
How does the panacea of modern communication, the iterative decoding algorithm, work for RS codes?
Note that all the codes in the literature for which we can use soft decoding algorithms are sparse-graph codes with small constraint length.
How does the standard message passing algorithm work?
[Figure: bipartite graph of bit nodes and check nodes, with erased bits highlighted]
If two or more of the incoming messages are erasures, the check is erased
For the AWGN channel, two or more unreliable messages invalidate the check
A Few Unreliable Bits "Saturate" the Non-sparse Parity Check Matrix
Consider RS(7, 5) over GF(2^3):
  Transmitted codeword: the all-zero codeword c_b = [0 0 ... 0] (21 bits)
  Received vector r: 21 soft values, only a few of which are unreliable
  H_b: the dense 6 × 21 binary image expansion of the parity check matrix of RS(7, 5) over GF(2^3)
Iterative decoding gets stuck: only a few unreliable bits "saturate" the whole non-sparse parity check matrix
Sparse Parity Check Matrices for RS Codes
Can we find an equivalent binary parity check matrix that is sparse?
For RS codes, this is not possible!
The H matrix is the G matrix of the dual code
The dual of an RS code is also an MDS code
Each row has weight at least (K+1); typically, the row weight is much higher
Iterative Decoding for RS Codes
Recent progress on RS codes:
  Sub-trellis based iterative decoding (Ungerboeck 2003)
  Stochastic shifting based iterative decoding (Jiang and Narayanan, 2004)
  Sparse representation of RS codes using GFFT (Yedidia, 2004)
Iterative decoding for general linear block codes:
  Iterative decoding for general linear block codes (Hagenauer et al. 1996)
  APP decoding using minimum weight parity checks (Lucas et al. 1998)
  Generalized belief propagation (Yedidia et al. 2000)
Recent Iterative Techniques
Sub-trellis based iterative decoding (Ungerboeck 2003):
  Self-concatenation using sub-trellises constructed from the parity check matrix (the dense binary image expansion of the parity check matrix of RS(7, 5) over GF(2^3))
Remarks:
  Performance deteriorates due to the large number of short cycles
  Works for short codes with small minimum distances
Recent Iterative Techniques (cont'd)
Stochastic shifting based iterative decoding (Jiang and Narayanan, 2004):
  Due to the irregularity in the H matrix, iterative decoding favors some bits
  Take advantage of the cyclic structure of RS codes: shifting by 2, [r_0, r_1, r_2, r_3, r_4, r_5, r_6] becomes [r_5, r_6, r_0, r_1, r_2, r_3, r_4]
  Stochastic shifts prevent the iterative procedure from getting stuck
  Best result: RS(63, 55), about 0.5 dB gain over HDD
  However, for long codes, the performance deteriorates
Iterative Decoding Based on Adaptive Parity Check Matrix
Idea: reduce the sub-matrix corresponding to the unreliable bits to a sparse nature using Gaussian elimination
For example, consider the (7, 4) Hamming code:
  A transmitted codeword c and its received vector r, in which three positions have low reliability (|r| ≈ 0.1)
  Gaussian elimination on the 3 × 7 parity check matrix H turns the three columns of the least reliable positions into unit vectors
We can make the (n-k) less reliable positions sparse!
Gradient Descent and Adaptive Potential Function
Define the tanh-domain transform: T_j = ν(L_j) = tanh(L_j / 2)
The syndrome of a parity check: s_i = ⊕_{j: H_{ij} = 1} r_j
Define the soft syndrome: S_i = ∏_{j: H_{ij} = 1} ν(L_j) = ∏_{j: H_{ij} = 1} T_j
Define the cost function: J(H, T) = - Σ_{i=1}^{n-k} S_i
The decoding problem is relaxed as minimizing J using gradient descent with the initial value T observed from the channel
J is also a function of H. H is adapted such that the unreliable bits are separated, in order to avoid getting stuck at zero-gradient points: H ← ψ(H, T^(0))
Geometric interpretation (suggested by Ralf Koetter)
Two Stage Optimization Procedure
The proposed algorithm is a generalization of the iterative decoding scheme proposed by Lucas et al. (1998); it is a two-stage optimization procedure:
  (1) Gradient step: T_m^(t+1) ← ν( ν^{-1}(T_m^(t)) + α Σ_{i=1}^{n-k} ∏_{j ∈ {j | H_{ij} = 1}, j ≠ m} T_j^(t) )
  (2) Matrix adaptation: H^(t) ← ψ(H^(0), T^(t))
The damping coefficient α serves to control the convergence dynamics
Avoid Zero Gradient Points
[Figure: cost surface with a zero-gradient point; the adaptive scheme changes the gradient and prevents the decoder from getting stuck at zero-gradient points]
Variations of the Generic Algorithm
Connect the unreliable bits as degree-2 variable nodes
Combine this algorithm with a hard decision decoder
Adapt the parity check matrix at the symbol level
Exchange bits between the reliable and unreliable parts and run the decoder multiple times
Reduced-complexity partial updating scheme
Potential Problems in Applications
Respective problems of the various decoding schemes:
  Reliability assisted HDD: gain is marginal at practical SNRs
  Algebraic soft decoding: performance is limited by the algebraic nature
  Reliability based decoding: huge memory, not scalable with SNR
  Sub-code decomposition: only possible for very short codes
  Iterative decoding: adapting H_b at each iteration is a huge cost
General problems:
  Coding gain may shrink in practical systems
  Concatenated with CC: difficult to generate the soft information
  Performance at practical SNRs should be analyzed
"In theory, there is no difference between theory and practice. But, in practice, there is..." (Jan L. A. van de Snepscheut)
A Case Study (System Setups)
Forward error control of a digital television transmission standard:
  Modulation format: 64 or 16 QAM (semi-set-partitioning mapping)
  Inner code: convolutional code, rate 2/3 or 8/9
  Bit-interleaved coded modulation (BICM)
  Iterative demodulation and decoding (BICM-ID)
  The decoded bytes from the inner decoder are interleaved and fed to the outer decoder
  Outer code: RS(208, 188) using hard decision decoding
Will a soft decoding algorithm significantly improve the overall performance?
Future Works
How to incorporate the proposed ADP with other soft decoding schemes?
Take advantage of the inherent structure of RS codes at the bit level
More powerful decoding tools, e.g., the trellis
Extend the idea of adaptive algorithms to demodulation and equalization
Apply the ADP algorithm to quantization or to solving K-SAT problems