Capacity-achieving codes for finite-state channels with maximum-likelihood decoding
IEEE Journal on Selected Areas in Communications - Special issue on capacity-approaching codes
IEEE Transactions on Signal Processing
Sharp bounds for optimal decoding of low-density parity-check codes
IEEE Transactions on Information Theory
On universal properties of capacity-approaching LDPC code ensembles
IEEE Transactions on Information Theory
On the fundamental system of cycles in the bipartite graphs of LDPC code ensembles
Proceedings of the 2009 IEEE International Symposium on Information Theory (ISIT'09), Volume 1
Lower bounds on the graphical complexity of finite-length LDPC codes
Proceedings of the 2009 IEEE International Symposium on Information Theory (ISIT'09), Volume 1
Linear programming bounds on the degree distributions of LDPC code ensembles
Proceedings of the 2009 IEEE International Symposium on Information Theory (ISIT'09), Volume 1
Capacity-achieving codes for channels with memory with maximum-likelihood decoding
Proceedings of the 2009 IEEE International Symposium on Information Theory (ISIT'09), Volume 1
Low-density graph codes that are optimal for binning and coding with side information
IEEE Transactions on Information Theory
Iterative decoding threshold analysis for LDPC convolutional codes
IEEE Transactions on Information Theory
The moderate complexity of low-density parity-check (LDPC) codes under iterative decoding is attributed to the sparseness of their parity-check matrices. It is therefore of interest to consider how sparse the parity-check matrices of binary linear block codes can be as a function of the gap between their achievable rates and the channel capacity. This issue was addressed by Sason and Urbanke, and it is revisited in this paper. The remarkable performance of LDPC codes under practical and suboptimal decoding algorithms motivates separating the inherent performance loss attributed to the structure of the code or ensemble under maximum-likelihood (ML) decoding from the additional loss imposed by the suboptimality of the decoder. These issues are addressed by obtaining upper bounds on the achievable rates of binary linear block codes, and lower bounds on the asymptotic density of their parity-check matrices, as a function of the gap between their achievable rates and the channel capacity; these bounds are valid under ML decoding, and hence they also hold under any suboptimal decoding algorithm. The new bounds improve on previously reported results by Burshtein and by Sason and Urbanke, and they hold for transmission over an arbitrary memoryless binary-input output-symmetric (MBIOS) channel. The significance of these information-theoretic bounds lies in assessing the tradeoff between the asymptotic performance of LDPC codes and their decoding complexity (per iteration) under message-passing decoding. They are also helpful in studying the potential achievable rates of ensembles of LDPC codes under optimal decoding; by comparing these thresholds with those calculated by the density evolution technique, one obtains a measure of the asymptotic suboptimality of iterative decoding algorithms.
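The comparison of iterative-decoding thresholds with capacity that the abstract describes can be illustrated with a minimal sketch. The example below (not from the paper; the function names and numerical tolerances are my own) runs the standard density evolution recursion for a regular (3,6) LDPC ensemble on the binary erasure channel, x_{t+1} = p * (1 - (1 - x_t)^5)^2, and bisects for the largest erasure probability p at which the recursion converges to zero.

```python
# Density evolution for the regular (3,6) LDPC ensemble on the BEC.
# Edge-perspective degree polynomials: lambda(x) = x^2, rho(x) = x^5.
# The erasure probability of a variable-to-check message evolves as
#     x_{t+1} = p * lambda(1 - rho(1 - x_t)) = p * (1 - (1 - x_t)**5)**2,
# where p is the channel erasure probability.

def de_converges(p, iters=3000, tol=1e-6):
    """True if density evolution drives the message erasure rate to ~0."""
    x = p
    for _ in range(iters):
        x = p * (1.0 - (1.0 - x) ** 5) ** 2
    return x < tol

def iterative_threshold(lo=0.0, hi=1.0, steps=40):
    """Bisect for the largest p at which density evolution converges."""
    for _ in range(steps):
        mid = 0.5 * (lo + hi)
        if de_converges(mid):
            lo = mid
        else:
            hi = mid
    return lo

p_star = iterative_threshold()
# The (3,6) ensemble has design rate 1/2, so the Shannon limit on the
# BEC is an erasure probability of 0.5; the iterative threshold is
# known to be about 0.4294.  The gap between the two combines the loss
# due to the ensemble structure (measurable under ML decoding) and the
# extra loss due to the suboptimality of iterative decoding.
print(f"iterative threshold: {p_star:.4f}  (Shannon limit: 0.5000)")
```

The gap between the computed threshold and the Shannon limit is exactly the kind of quantity the paper's bounds help to decompose: how much of it is forced by the sparseness of the parity-check matrix, and how much is added by the decoder.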