Message-passing iterative decoders for low-density parity-check (LDPC) block codes are known to be subject to decoding failures due to so-called pseudocodewords. These failures can cause the high signal-to-noise ratio (SNR) performance of message-passing iterative decoding to be worse than that predicted by the maximum-likelihood (ML) decoding union bound. In this paper, we address the pseudocodeword problem from the convolutional code perspective. In particular, we compare the performance of LDPC convolutional codes with that of their "wrapped" quasi-cyclic block versions, and we show that the minimum pseudoweight of an LDPC convolutional code is at least as large as the minimum pseudoweight of an underlying quasi-cyclic code. This result, which parallels a well-known relationship between the minimum Hamming weight of convolutional codes and the minimum Hamming weight of their quasi-cyclic counterparts, follows from the fact that every pseudocodeword in the convolutional code induces a pseudocodeword in the block code with pseudoweight no larger than that of the convolutional code's pseudocodeword. This difference in the weight spectra leads to improved performance at low-to-moderate SNRs for the convolutional code, a conclusion supported by simulation results.
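To make the pseudoweight comparison concrete, here is a short sketch of the underlying argument using the standard AWGN-channel pseudoweight; the wrapping map and the block length $N$ are assumptions based on the usual quasi-cyclic construction, not details given in the abstract. For a nonnegative pseudocodeword $\omega = (\omega_1, \omega_2, \ldots)$, the AWGN pseudoweight is

\[
w_p(\omega) = \frac{\left(\sum_i \omega_i\right)^2}{\sum_i \omega_i^2}, \qquad \omega_i \ge 0 .
\]

Wrapping a convolutional pseudocodeword onto the quasi-cyclic code of block length $N$ yields $\omega'$ with components $\omega'_j = \sum_k \omega_{j+kN}$. The numerator is unchanged, since $\sum_j \omega'_j = \sum_i \omega_i$, while nonnegativity of the $\omega_i$ gives

\[
\sum_j \left(\sum_k \omega_{j+kN}\right)^2 \;\ge\; \sum_j \sum_k \omega_{j+kN}^2 \;=\; \sum_i \omega_i^2 ,
\]

so $w_p(\omega') \le w_p(\omega)$. Hence every convolutional pseudocodeword induces a block pseudocodeword of no larger pseudoweight, which is exactly why the minimum pseudoweight of the LDPC convolutional code is at least that of the underlying quasi-cyclic code.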