Capacity-achieving codes for channels with memory with maximum-likelihood decoding
Proceedings of the 2009 IEEE International Symposium on Information Theory (ISIT '09)
Codes on sparse graphs have been shown to achieve remarkable performance over point-to-point channels with low decoding complexity. Most results in this area, however, rest on experimental evidence and/or approximate analysis. Whether codes on sparse graphs can achieve the capacity of noisy channels under iterative decoding remains open, and has been answered conclusively, and in the affirmative, only for the binary erasure channel. On the other hand, codes on sparse graphs have been proven to achieve the capacity of memoryless, binary-input, output-symmetric channels with finite graphical complexity per information bit when maximum-likelihood (ML) decoding is performed. In this paper, we consider transmission over finite-state channels (FSCs). We derive upper bounds on the average error probability of code ensembles under ML decoding. Based on these bounds, we show that codes on sparse graphs can achieve the symmetric information rate (SIR) of FSCs, which is the maximum rate achievable with independently and uniformly distributed (i.u.d.) input sequences. To achieve rates beyond the SIR, we consider a simple quantization scheme that, when applied to ensembles of codes on sparse graphs, induces a Markov distribution on the transmitted sequence. By deriving average error probability bounds for these quantized code ensembles, we prove that they achieve the information rates of the induced Markov distribution and can thus approach the FSC capacity.
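To make the symmetric information rate concrete, the sketch below estimates the SIR of a simple two-state FSC, a Gilbert-Elliott channel with hypothetical parameters, by simulating its hidden-Markov noise process and running a forward (sum-product) recursion over the state. The channel model, the parameter values, and the Monte Carlo estimator are illustrative assumptions for this example only and are not taken from the paper.

```python
import numpy as np

# Illustrative Gilbert-Elliott channel (assumed parameters): two hidden states,
# each acting as a BSC with its own crossover probability.
g, b = 0.01, 0.10          # P(Good -> Bad), P(Bad -> Good)
pG, pB = 0.01, 0.30        # crossover probability in the Good / Bad state

P = np.array([[1.0 - g, g],
              [b, 1.0 - b]])       # state-transition matrix; state 0 = Good, 1 = Bad
pe = np.array([pG, pB])            # per-state error probability

rng = np.random.default_rng(0)
n = 200_000                        # length of the simulated sequence

# Simulate the state sequence and the additive noise z_t = x_t XOR y_t.
states = np.empty(n, dtype=int)
states[0] = 0
for t in range(1, n):
    states[t] = int(rng.random() < P[states[t - 1], 1])
z = (rng.random(n) < pe[states]).astype(int)

# Forward recursion over the state to accumulate -(1/n) log2 P(z^n),
# a consistent estimate of the noise entropy rate H(Z).
alpha = np.array([b, g]) / (g + b)           # stationary state distribution
log_prob = 0.0
for t in range(n):
    like = pe if z[t] == 1 else 1.0 - pe     # P(z_t | state)
    alpha = alpha * like
    scale = alpha.sum()
    log_prob += np.log2(scale)
    alpha = (alpha / scale) @ P              # predict the next state
H_Z = -log_prob / n

# With i.u.d. inputs the output of this channel is itself i.u.d., so
# H(Y) = 1 bit per channel use and the SIR reduces to 1 - H(Z).
print(f"estimated H(Z) ~ {H_Z:.4f} bits/channel use")
print(f"estimated SIR  ~ {1.0 - H_Z:.4f} bits/channel use")
```

The Gilbert-Elliott case is chosen here only because its state evolves independently of the input, so H(Y) collapses to 1 bit under i.u.d. inputs. For general FSCs, both H(Y) and H(Y|X) would be estimated by analogous forward recursions, in the spirit of simulation-based computation of information rates for channels with memory.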