We consider lossy source compression of a binary symmetric source using polar codes and a low-complexity successive encoding algorithm. It was recently shown by Arikan that polar codes achieve the capacity of arbitrary symmetric binary-input discrete memoryless channels under a successive decoding strategy. We show the equivalent result for lossy source compression: polar codes with successive encoding achieve the rate-distortion bound for a binary symmetric source. We further show the optimality of polar codes for various multiterminal problems, including the binary Wyner-Ziv and the binary Gelfand-Pinsker problems. Our results extend to general versions of these problems.
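The rate-distortion bound referenced above is the classical Shannon limit for a binary symmetric source under Hamming distortion, R(D) = 1 - h(D) for 0 <= D <= 1/2, where h is the binary entropy function. A minimal sketch of this benchmark (function names are illustrative, not from the paper):

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy function h(p), in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_distortion_bss(d: float) -> float:
    """Rate-distortion function R(D) = 1 - h(D) for a binary symmetric
    source under Hamming distortion; R(D) = 0 for D >= 1/2."""
    if d >= 0.5:
        return 0.0
    return 1.0 - binary_entropy(d)

# Lossless compression of a uniform binary source needs rate 1:
print(rate_distortion_bss(0.0))   # 1.0
# Allowing average distortion D = 0.11 lowers the required rate:
print(rate_distortion_bss(0.11))
```

A code achieving this bound would compress n source bits to roughly n * R(D) bits while keeping the average Hamming distortion near D; the paper's result is that polar codes with successive encoding approach this limit.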