In this paper, we describe and analyze the source and channel coding properties of a class of sparse graphical codes based on compounding a low-density generator matrix (LDGM) code with a low-density parity-check (LDPC) code. Our first pair of theorems establishes that there exist codes from this ensemble, with all degrees remaining bounded independently of block length, that are simultaneously optimal for both channel coding and source coding with binary data when encoding and decoding are performed optimally. More precisely, in the context of lossy compression, we prove that finite-degree constructions can achieve any pair (R, D) on the rate-distortion curve of the binary symmetric source. In the context of channel coding, we prove that the same finite-degree codes can achieve any pair (C, p) on the capacity-noise curve of the binary symmetric channel (BSC). Next, we show that our compound construction has a nested structure that can be exploited to achieve the Wyner-Ziv bound for source coding with side information (SCSI), as well as the Gelfand-Pinsker bound for channel coding with side information (CCSI). Although the results described here are based on optimal encoding and decoding, the proposed graphical codes have sparse structure and high girth that render them well suited to message passing and other efficient decoding procedures.
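The two curves referenced above have well-known closed forms: the rate-distortion function of the symmetric Bernoulli source is R(D) = 1 - h(D) for 0 &lt;= D &lt;= 1/2, and the BSC capacity is C = 1 - h(p), where h is the binary entropy function. A minimal sketch computing these bounds (function names are illustrative, not from the paper):

```python
import math

def h2(x):
    """Binary entropy function, in bits."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log2(x) - (1.0 - x) * math.log2(1.0 - x)

def rate_distortion_bss(D):
    """R(D) = 1 - h(D) for the Bernoulli(1/2) source under Hamming distortion."""
    return max(0.0, 1.0 - h2(D))

def bsc_capacity(p):
    """C = 1 - h(p) for the binary symmetric channel with crossover probability p."""
    return 1.0 - h2(p)
```

Note the duality visible in the formulas: a pair (R, D) on the rate-distortion curve and a pair (C, p) on the capacity-noise curve coincide when D = p, which is why a single compound construction can target both problems.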