7 Shannon Capacity
Every communication channel is characterized by a single number C, called the channel capacity.
It is possible to transmit information over this channel reliably (with probability of error → 0) if and only if the transmission rate R satisfies R < C.
5/ 31/ 07

10 Coding
We use a code to communicate over the noisy channel:
Source → Encoder → Channel → Decoder → Sink
Code rate: R = k/n, where each block of n code bits carries k data bits.

11 Shannon's Coding Theorems
If C is a code with rate R > C, then the probability of error in decoding this code is bounded away from 0. (In other words, at any rate R > C, reliable communication is not possible.)
For any information rate R < C and any δ > 0, there exists a code C of length n_δ and rate R such that the probability of error in maximum-likelihood decoding of this code is at most δ.
Proof: non-constructive!

21 Parity-Check Equations
The parity-check matrix implies a system of linear equations.
The parity-check matrix is not unique: any set of vectors that spans the row space generated by H can serve as the rows of a parity-check matrix (including sets with more than 3 vectors).

25 Review of Gallager's Paper
Another pioneering work:
Gallager, R. G., Low-Density Parity-Check Codes, M.I.T. Press, Cambridge, Mass.: 1963.
A more enlightened review:
Horstein, M., IEEE Trans. Inform. Theory, vol. 10, no. 2, p. 172, April 1964:
"This book is an extremely lucid and circumspect exposition of an important piece of research. A comparison with other coding and decoding procedures designed for high-reliability transmission ... is difficult... Furthermore, many hours of computer simulation are needed to evaluate a probabilistic decoding scheme... It appears, however, that LDPC codes have a sufficient number of desirable features to make them highly competitive with ... other schemes ..."

39 Encoding LDPC Codes
Set c_{n-k+1}, …, c_n equal to the data bits x_1, …, x_k.
Solve for the parities c_ℓ, ℓ = 1, …, n-k, in reverse order; i.e., starting with ℓ = n-k, compute c_ℓ from the ℓ-th parity-check equation (complexity ~O(n²)).
Another general encoding technique, based upon "approximate lower triangulation," has complexity no more than O(n²), with the constant coefficient small enough to allow practical encoding for block lengths on the order of n = 10^5.
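The back-substitution step above can be sketched in code. This is a minimal illustration, assuming (hypothetically) a parity-check matrix whose first n−k columns form an upper-triangular parity part with a unit diagonal, so that each check equation determines one parity bit from already-known bits; the function name is ours, not from the slides.

```python
import numpy as np

def encode_back_substitution(H, data):
    """Encode by back-substitution. Assumes the first m = n-k columns of H
    (the parity part) are upper triangular with a unit diagonal."""
    m, n = H.shape              # m = n - k parity checks
    c = np.zeros(n, dtype=int)
    c[m:] = data                # the last k positions carry the data bits
    # Solve check equation l for parity bit c_l, for l = m-1 down to 0.
    for l in range(m - 1, -1, -1):
        # c[l] is still 0 here, so the row product gives the XOR of the
        # other bits participating in check l, which is the value of c[l].
        c[l] = (H[l] @ c) % 2
    return c
```

Encoding a length-6, rate-1/2 toy code this way costs one row product per parity bit, which is the ~O(n²) behavior the slide mentions for dense rows.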

40 Linear Encoding Complexity
It has been shown that "optimized" ensembles of irregular LDPC codes can be encoded with preprocessing complexity at most O(n^{3/2}) and subsequent complexity ~O(n).
It has been shown that a necessary condition for the ensemble of (λ, ρ)-irregular LDPC codes to be linear-time encodable is:
Alternatively, LDPC code ensembles with additional "structure," such as "irregular repeat-accumulate (IRA)" codes, have linear encoding complexity.

41 Decoding of LDPC Codes
Gallager introduced the idea of iterative, message-passing decoding of LDPC codes.
The idea is to iteratively share the results of local node decoding by passing them along edges of the Tanner graph.
We will first demonstrate this decoding method for the binary erasure channel BEC(ε).
The performance and optimization of LDPC codes for the BEC will tell us a lot about other channels, too.

45 MAP Decoding Complexity
Let E ⊆ {1, …, n} denote the positions of erasures in y, and let F denote its complement in {1, …, n}.
Let w_E and w_F denote the corresponding sub-words of a word w, and let H_E and H_F denote the corresponding submatrices of the parity-check matrix H.
Then X(y), the set of codewords compatible with y, satisfies H_E x_E^T = H_F y_F^T (over GF(2)).
So optimal (MAP) decoding can be done by solving a set of linear equations, requiring complexity at most O(n³).
For large block length n, this can be prohibitive!
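The linear-algebra view above can be sketched directly: solve for the erased positions by Gaussian elimination over GF(2), declaring failure when a free variable remains (the solution is then not unique). This is an illustrative sketch, not an optimized decoder; the function name is ours.

```python
import numpy as np

def map_decode_bec(H, y):
    """MAP decoding on the BEC: solve H_E x_E = H_F y_F over GF(2).
    y uses None for erasures. Returns the decoded word as a list,
    or None if the erased positions are not uniquely determined."""
    H = np.asarray(H) % 2
    n = H.shape[1]
    E = [i for i in range(n) if y[i] is None]
    F = [i for i in range(n) if y[i] is not None]
    # Right-hand side: contribution of the known (non-erased) positions.
    b = (H[:, F] @ np.array([y[i] for i in F], dtype=int)) % 2
    A = H[:, E].copy()
    m, row, piv = A.shape[0], 0, []
    for col in range(len(E)):
        r = next((r for r in range(row, m) if A[r, col]), None)
        if r is None:
            return None          # free variable: MAP decoding is ambiguous
        A[[row, r]] = A[[r, row]]
        b[[row, r]] = b[[r, row]]
        for rr in range(m):      # eliminate this column from other rows
            if rr != row and A[rr, col]:
                A[rr] = (A[rr] + A[row]) % 2
                b[rr] = (b[rr] + b[row]) % 2
        piv.append(col)
        row += 1
    out = [y[i] if y[i] is not None else 0 for i in range(n)]
    for i, col in enumerate(piv):
        out[E[col]] = int(b[i])
    return out
```

The elimination is the O(n³) bottleneck the slide refers to.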

46 Simpler Decoding
We now describe an alternative decoding procedure that can be implemented very simply.
It is a "local" decoding technique that tries to fill in erasures "one parity-check equation at a time."
We will illustrate it using a very simple and familiar linear code, the (7,4) Hamming code.
We'll compare its performance to that of optimal bit-wise decoding.
Then, we'll reformulate it as a "message-passing" decoding algorithm and apply it to LDPC codes.

47 Local Decoding of Erasures
d_min = 3, so any two erasures can be uniquely filled to get a codeword.
Decoding can be done locally: given any pattern of one or two erasures, there will always be a parity check (a circle in the Venn-diagram picture of the code) involving exactly one erasure.
The parity check represented by that circle can be used to fill in the erased bit.
This leaves at most one more erasure; any parity check (circle) involving it can be used to fill it in.

48 Local Decoding - Example
The all-0's codeword is transmitted, with two erasures as shown.
Start with either the red parity circle or the green parity circle.
The red parity circle requires that the erased symbol inside it be 0.

49 Local Decoding - Example
Next, the green parity circle or the blue parity circle can be selected.
Either one requires that the remaining erased symbol be 0.

50 Local Decoding - Example
Estimated codeword: [0 0 0 0 0 0 0].
Decoding successful!
This procedure would have worked no matter which codeword was transmitted.

51 Decoding with the Tanner Graph: a Peeling Decoder
Initialization:
- Forward known variable-node values along outgoing edges.
- Accumulate forwarded values at check nodes and "record" the parity.
- Delete known variable nodes and all outgoing edges.

54 Decoding with the Tanner Graph: a Peeling Decoder
Decoding step:
- Select, if possible, a check node with one edge remaining; forward its parity, thereby determining the connected variable node.
- Delete the check node and its outgoing edge.
- Follow the initialization procedure at the newly known variable node.
Termination:
- If the remaining graph is empty, the codeword is determined.
- If the decoding step gets stuck, declare a decoding failure.
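The peeling steps above can be sketched as a few lines of code: repeatedly find a check with exactly one erased neighbor, fill that erasure from the check's parity, and stop when no such check remains. A minimal sketch (the function name is ours); H is given as a 0/1 list of rows.

```python
def peel_decode(H, y):
    """Peeling decoder for the BEC. y uses None for erasures.
    Returns (word, success)."""
    y = list(y)
    m, n = len(H), len(H[0])
    progress = True
    while progress and any(v is None for v in y):
        progress = False
        for r in range(m):
            erased = [j for j in range(n) if H[r][j] and y[j] is None]
            if len(erased) == 1:
                # The parity of the known bits in this check fills the erasure.
                y[erased[0]] = sum(y[j] for j in range(n)
                                   if H[r][j] and y[j] is not None) % 2
                progress = True
    return y, all(v is not None for v in y)
```

On the (7,4) Hamming code this reproduces the local Venn-diagram decoding of the earlier slides: two erasures succeed, while an erasure pattern in which every check sees at least two erasures gets stuck.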

60 Message-Passing Decoding
The local decoding procedure can be described in terms of an iterative, "message-passing" algorithm in which all variable nodes and all check nodes in parallel iteratively pass messages along their adjacent edges.
The values of the code bits are updated accordingly.
The algorithm continues until all erasures are filled in, or until the completion of a specified number of iterations.

61 Variable-to-Check Node Message
Variable-to-check message v on edge e, given the channel value u and the incoming check-to-variable messages on the other edges:
- If all other incoming messages are ?, send the message v = ?.
- If any other incoming message u is 0 or 1, send v = u and, if the bit was an erasure, fill it with u, too.
(Note that there are no errors on the BEC, so a message that is 0 or 1 must be correct. Messages cannot be inconsistent.)
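The variable-node rule above is small enough to state as code; a sketch, using None for the erasure symbol ? and a function name of our choosing:

```python
def var_to_check(channel_value, other_incoming):
    """Variable-to-check message on the BEC: forward the first known
    value among the channel value and the other incoming check
    messages; otherwise forward an erasure (None)."""
    for u in [channel_value] + list(other_incoming):
        if u is not None:
            return u   # on the BEC any 0/1 message is correct
    return None
```

Because the BEC introduces no errors, it does not matter which known value is forwarded; they all agree.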

67 Sub-optimality of Message-Passing Decoder
Hamming code: decoding of 3 erasures.
There are 7 patterns of 3 erasures that correspond to the support of a weight-3 codeword. These cannot be decoded by any decoder!
The other 28 patterns of 3 erasures can be uniquely filled in by the optimal decoder.
We just saw a pattern of 3 erasures that was corrected by the local decoder. Are there any that it cannot correct?
Test: ? ? ?

68 Sub-optimality of Message-Passing Decoder
Test: ? ? ?
There is a unique way to fill the erasures and get a codeword; the optimal decoder would find it.
But every parity check has at least 2 erasures, so local decoding will not work!

69 Stopping Sets
A stopping set is a subset S of the variable nodes such that every check node connected to S is connected to S at least twice.
The empty set is a stopping set (trivially).
The support set (i.e., the positions of 1's) of any codeword is a stopping set (parity condition).
A stopping set need not be the support of a codeword.
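The definition above translates directly into a membership test. A sketch (function name ours), using the (7,4) Hamming parity-check matrix as the running example:

```python
def is_stopping_set(H, S):
    """True iff S (a set of variable-node indices) is a stopping set of H:
    no check node is connected to S exactly once."""
    S = set(S)
    for row in H:
        deg = sum(1 for j in S if row[j])
        if deg == 1:
            return False
    return True
```

This checks the defining property literally: a check seeing exactly one member of S could resolve that erasure, so such a check must not exist.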

73 Stopping Set Properties
Every set of variable nodes contains a largest stopping set (since the union of stopping sets is also a stopping set).
The message-passing decoder needs a check node with at most one edge connected to an erasure in order to proceed.
So if the remaining erasures form a stopping set, the decoder must stop.
Let E be the initial set of erasures. When the message-passing decoder stops, the remaining set of erasures is the largest stopping set S contained in E.
If S is empty, the codeword has been recovered; if not, the decoder has failed.

74 Suboptimality of Message-Passing Decoder
An optimal (MAP) decoder for a code C on the BEC fails if and only if the set of erased variables includes the support set of a nonzero codeword.
The message-passing decoder fails if and only if the set of erased variables includes a non-empty stopping set.
Conclusion: message-passing may fail where optimal decoding succeeds! Message-passing is suboptimal!

75 Comments on Message-Passing Decoding
Bad news: message-passing decoding on a Tanner graph is not always optimal...
Good news: for any code C, there is a parity-check matrix on whose Tanner graph message-passing is optimal, e.g., the matrix whose rows are the codewords of the dual code.
However, that Tanner graph may be very dense, so even message-passing decoding is too complex.

77 Comments on Message-Passing Decoding
Good news: if a Tanner graph is cycle-free, the message-passing decoder is optimal!
Bad news: binary linear codes with cycle-free Tanner graphs are necessarily weak...
But the Tanner graph of a long LDPC code behaves almost like a cycle-free graph!

78 Analysis of LDPC Codes on the BEC
In the spirit of Shannon, we can analyze the performance of message-passing decoding on ensembles of LDPC codes with specified degree distributions (λ, ρ).
The results of the analysis allow us to design LDPC codes that transmit reliably with MP decoding at rates approaching the Shannon capacity of the BEC.
In fact, sequences of LDPC codes have been designed that actually achieve the Shannon capacity.
The analysis can assume that the all-0's codeword is sent.

79 Key Results - 1
Concentration: with high probability, the performance of ℓ rounds of MP decoding on a randomly selected (n, λ, ρ) code converges to the ensemble-average performance as the length n → ∞.
Convergence to cycle-free performance: the average performance of ℓ rounds of MP decoding on the (n, λ, ρ) ensemble converges to the performance on a graph with no cycles of length ≤ 2ℓ as the length n → ∞.

81 Asymptotic Performance Analysis
We assume a cycle-free (λ, ρ) Tanner graph.
Let p_0 = ε, the channel erasure probability.
We find a recursion formula for p_ℓ, the probability that a randomly chosen edge carries a variable-to-check erasure message in round ℓ.
We then find the largest ε such that p_ℓ converges to 0 as ℓ → ∞. This value is called the threshold.
This procedure is called "density evolution" analysis.
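The threshold search above can be sketched numerically using the standard BEC density-evolution recursion p_ℓ = ε·λ(1 − ρ(1 − p_{ℓ−1})) (the recursion itself did not survive the slide extraction, so it is restated here from the standard analysis). A sketch with a function name of our choosing, illustrated on the (3,6)-regular ensemble, where λ(x) = x² and ρ(x) = x⁵:

```python
def bec_threshold(lam, rho, max_iter=5000, tol=1e-10):
    """Bisect for the BEC density-evolution threshold: the largest
    erasure rate eps for which p <- eps * lam(1 - rho(1 - p)),
    started at p = eps, converges to 0."""
    def converges(eps):
        p = eps
        for _ in range(max_iter):
            p = eps * lam(1.0 - rho(1.0 - p))
            if p < tol:
                return True
        return False
    lo, hi = 0.0, 1.0
    for _ in range(40):
        mid = 0.5 * (lo + hi)
        if converges(mid):
            lo = mid    # still converges: threshold is above mid
        else:
            hi = mid
    return lo

# (3,6)-regular ensemble: lambda(x) = x^2, rho(x) = x^5
eps_star = bec_threshold(lambda x: x**2, lambda x: x**5)
```

Convergence slows near the threshold (the recursion's fixed point becomes tangent), so the iteration cap slightly biases the estimate low; for design purposes that bias is negligible.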

85 Threshold Interpretation
Operationally, this means that using a code drawn from the ensemble of length-n LDPC codes with degree distribution pair (λ, ρ), we can transmit as reliably as desired over the BEC(ε) channel if ε < ε*(λ, ρ), for sufficiently large block length n.

94 EXIT Chart Analysis
Recall the MP convergence condition.
Since λ(x) is invertible, the condition becomes c(x) < v_p^{-1}(x).
Graphically, this says that the curve for c(x) must lie below the curve for v_p^{-1}(x) for all p < p*.

97 EXIT Charts and Density Evolution
EXIT charts can be used to visualize density evolution.
Assume an initial fraction of erasure messages p_0 = p.
The fractions of erasures emitted successively by the check nodes, q_i, and by the variable nodes, p_i, are obtained by successively applying c(x) and v_p(x).

98 EXIT Charts and Density Evolution
Graphically, this computation describes a staircase function.
If p < p*, there is a "tunnel" between v_p^{-1}(x) and c(x) through which the staircase descends to ground level, i.e., no erasures.
If p > p*, the tunnel closes, stopping the staircase descent at a positive fraction of errors.

109 Matching Condition
For capacity-achieving sequences of LDPC codes for the BEC, the EXIT chart curves must match. This is called the matching condition.
Such sequences have been developed:
- Tornado codes
- Right-regular LDPC codes
- Accumulate-Repeat-Accumulate codes

110 Decoding for Other Channels
We now consider analysis and design of LDPC codes for the BSC(p) and BiAWGN(σ) channels. We call p and σ the "channel parameters" for these two channels, respectively.
Many concepts, results, and design methods have natural (but non-trivial) extensions to these channels.
The messages are probability mass functions or log-likelihood ratios.
The message-passing paradigm at variable and check nodes will again be applied.
The decoding method is called "belief propagation," or BP for short.

112 Belief Propagation
For codes with cycle-free Tanner graphs, there is a message-passing approach to bit-wise MAP decoding.
The messages are essentially conditional bit distributions, denoted u = [u(1), u(-1)].
The initial messages presented by the channel to the variable nodes are the channel posterior distributions of the corresponding bits.
The variable-to-check and check-to-variable message updates are determined by the "sum-product" update rule.
The BEC decoder can be formulated as a BP decoder.

119 Log-Likelihood Formulation - Check Node
To see this, consider the special case of a degree-3 check node with incoming LLR messages v_1, v_2 and outgoing message u.
It is easy to verify that tanh(u/2) = tanh(v_1/2) · tanh(v_2/2), i.e., u = 2 tanh^{-1}(tanh(v_1/2) tanh(v_2/2)).
This can be generalized to a check node of any degree by a simple inductive argument.
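The degree-3 identity above can be verified numerically by computing the check-node output two ways: via the tanh rule, and directly from the fact that the constrained bit is the XOR of the other two. A sketch, using the LLR convention L = log(P(bit = 0)/P(bit = 1)) and helper names of our choosing:

```python
import math

def check_node_llr(v1, v2):
    """Degree-3 check node: u = 2 * atanh(tanh(v1/2) * tanh(v2/2))."""
    return 2.0 * math.atanh(math.tanh(v1 / 2.0) * math.tanh(v2 / 2.0))

def llr_to_p1(L):
    """P(bit = 1) under the convention L = log(P(0)/P(1))."""
    return 1.0 / (1.0 + math.exp(L))

# Direct computation: the check forces bit0 = bit1 XOR bit2.
v1, v2 = 1.3, -0.4
p1, p2 = llr_to_p1(v1), llr_to_p1(v2)
p_xor = p1 * (1 - p2) + (1 - p1) * p2      # P(bit1 XOR bit2 = 1)
direct = math.log((1 - p_xor) / p_xor)     # LLR of the XOR
```

The agreement is exact because tanh(L/2) = P(0) − P(1), and the difference P(0) − P(1) multiplies under XOR of independent bits.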

121 Key Results - 1
Concentration: with high probability, the performance of ℓ rounds of BP decoding on a randomly selected (n, λ, ρ) code converges to the ensemble-average performance as the length n → ∞.
Convergence to cycle-free performance: the average performance of ℓ rounds of BP decoding on the (n, λ, ρ) ensemble converges to the performance on a graph with no cycles of length ≤ 2ℓ as the length n → ∞.

122 Key Results - 2
Computing the cycle-free performance: the cycle-free performance can be computed by a somewhat more complex, but still tractable, algorithm, density evolution.
Threshold calculation: there is a threshold channel parameter p*(λ, ρ) such that, for any "better" channel parameter p, the cycle-free error probability approaches 0 as the number of iterations ℓ → ∞.

125 Threshold
The threshold σ* is the maximum σ such that the density-evolution error probability converges to 0 as ℓ → ∞.
Operationally, this represents the minimum SNR such that a code drawn from the (λ, ρ) ensemble will ensure reliable transmission as the block length approaches infinity.

126 Degree Distribution Optimization
For a given rate, the objective is to optimize λ(x) and ρ(x) for the best threshold p*.
The maximum left and right degrees are fixed.
For some channels, the optimization procedure is not trivial, but there are techniques that can be applied in practice.

145 Concluding Remarks
LDPC codes are very powerful codes with enormous practical potential, founded upon a deep and rich theory.
There continue to be important advances in all of the key aspects of LDPC code design, analysis, and implementation.
LDPC codes are now finding their way into many applications:
- Satellite broadcast
- Cellular wireless
- Data storage
- And many more…
