Information Theory and Reliable Communication
Information Theory: Coding Theorems for Discrete Memoryless Systems
Information Theory, Inference & Learning Algorithms
Secret Key Agreement from Correlated Source Outputs Using Low Density Parity Check Matrices
IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences
Hash property and coding theorems for sparse matrices and maximum-likelihood coding
IEEE Transactions on Information Theory
Good error-correcting codes based on very sparse matrices
IEEE Transactions on Information Theory
Bounds on the maximum-likelihood decoding error probability of low-density parity-check codes
IEEE Transactions on Information Theory
On the application of LDPC codes to arbitrary discrete-memoryless channels
IEEE Transactions on Information Theory
Using linear programming to decode binary linear codes
IEEE Transactions on Information Theory
The ML decoding performance of LDPC ensembles over Zq
IEEE Transactions on Information Theory
Low-density parity-check matrices for coding of correlated sources
IEEE Transactions on Information Theory
Source Coding Using Families of Universal Hash Functions
IEEE Transactions on Information Theory
The aim of this paper is to prove the achievability of the general (asymmetric) channel coding problem based on the hash property introduced in [14][15]. Since an ensemble of q-ary sparse matrices (whose maximum column weight grows logarithmically in the block length) satisfies the hash property, it is proved that codes constructed from sparse matrices can achieve the channel capacity.
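As an illustration of the kind of ensemble the abstract describes, the following sketch draws a random q-ary matrix whose columns each have a small number of nonzero entries (growing logarithmically in the block length) and uses it to hash a source word via its syndrome. All names and parameters here are hypothetical; this is a minimal sketch of the general construction, not the exact ensemble or coding scheme from the paper.

```python
import numpy as np

def sparse_qary_matrix(rows, cols, q, col_weight, rng):
    """Random rows x cols matrix over Z_q where each column has exactly
    col_weight nonzero entries. A sparse ensemble in the spirit of the
    abstract (the paper's ensemble may differ in detail)."""
    A = np.zeros((rows, cols), dtype=np.int64)
    for j in range(cols):
        # Pick col_weight distinct row positions for the nonzeros.
        idx = rng.choice(rows, size=col_weight, replace=False)
        # Nonzero entries drawn uniformly from {1, ..., q-1}.
        A[idx, j] = rng.integers(1, q, size=col_weight)
    return A

def syndrome(A, x, q):
    """Syndrome A x mod q, playing the role of the hash of x."""
    return (A @ x) % q

rng = np.random.default_rng(0)
n = 64                        # block length
m = 16                        # number of parity checks
q = 5                         # alphabet size (prime, so Z_q is a field)
w = max(1, int(np.log2(n)))   # column weight growing logarithmically in n
A = sparse_qary_matrix(m, n, q, w, rng)
x = rng.integers(0, q, size=n)
s = syndrome(A, x, q)
```

With m/n fixed, the matrix stays sparse as n grows because each column carries only O(log n) nonzeros, which is the regime the abstract says suffices for the hash property.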