The aim of this paper is to prove the achievability of rate regions for several coding problems by using sparse matrices (with logarithmic column degree) and maximum-likelihood (ML) coding. These problems are the Gel'fand-Pinsker problem, the Wyner-Ziv problem, and the one-helps-one problem (source coding with partial side information at the decoder). To this end, the notion of a hash property for an ensemble of functions is introduced, and it is proved that an ensemble of q-ary sparse matrices satisfies this hash property. Based on this property, it is shown that codes constructed from sparse matrices with ML coding achieve the optimal rate for each of these problems.
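To make the underlying binning idea concrete, the following is a minimal Python sketch, not the paper's construction: a sparse matrix A over GF(2) acts as a hash, mapping each source word x to its syndrome Ax, so that words sharing a syndrome fall into the same bin. The function names, the fixed row degree, and the restriction to the binary field are illustrative assumptions; the ensembles analyzed in the paper are q-ary with logarithmic column degree.

```python
import random

def sparse_parity_matrix(m, n, row_degree, seed=0):
    """Random m x n binary matrix with row_degree ones per row.
    Illustrative only: the paper's ensembles are q-ary sparse
    matrices with logarithmic column degree."""
    rng = random.Random(seed)
    A = [[0] * n for _ in range(m)]
    for i in range(m):
        # place row_degree ones in distinct random columns
        for j in rng.sample(range(n), row_degree):
            A[i][j] = 1
    return A

def syndrome(A, x):
    """Hash (bin) a binary word x by its syndrome Ax over GF(2)."""
    return [sum(a * xi for a, xi in zip(row, x)) % 2 for row in A]

# Example: bin an 8-bit source word with a 4 x 8 sparse matrix,
# i.e. compress 8 bits down to a 4-bit bin index.
A = sparse_parity_matrix(4, 8, 3)
x = [1, 0, 1, 1, 0, 0, 1, 0]
s = syndrome(A, x)
```

Because the hash is linear over GF(2), syndromes add componentwise: the syndrome of x XOR y equals the XOR of the two syndromes. It is this algebraic structure, combined with the hash property of the ensemble, that lets an ML decoder recover x from its bin index and side information.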