In this Part II of a two-part paper, we present a new achievable rate-distortion region for the symmetric n-channel multiple-descriptions coding problem (n ≥ 2), in which every description has the same rate and the reconstruction distortion depends only on the number of descriptions received. Using a new approach to the random coding constructions, together with a generalization to arbitrary n of the technique underlying the two-channel El Gamal and Cover region, the rate region presented here achieves points not previously known in the literature. This rate region is derived from a concatenation of the source-channel erasure codes developed in Part I of this work, deployed within the framework of source coding with side information ("random binning"). The key idea is that, within this framework, multiple statistically identical realizations representing the coarse version of a source can be simultaneously refined by a single encoding. We point out an important conceptual difference in the random coding construction for the multiple-descriptions coding problem between the cases n = 2 and n > 2. To illustrate the framework, we also present the important case of the Gaussian source in detail.
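The Gaussian case can be made concrete with a small numerical sketch. Assuming a unit-variance memoryless Gaussian source with distortion D(R) = σ²·2^(−2R), the snippet below computes the distortion profile of a single idealized (n, k) source-channel erasure-coded layer: any k of the n equal-rate descriptions recover a rate-kR codebook, while fewer give no improvement. This is only the trivial single-layer baseline suggested by the erasure-code concatenation described above, not the rate region of the paper itself, and the function names are illustrative.

```python
def gaussian_rd_distortion(sigma2: float, rate: float) -> float:
    """Distortion of an ideal rate-distortion-optimal code for a
    memoryless Gaussian source: D(R) = sigma^2 * 2^(-2R)."""
    return sigma2 * 2 ** (-2 * rate)

def mds_layer_profile(n: int, k: int, per_description_rate: float,
                      sigma2: float = 1.0) -> list:
    """Distortion profile of one idealized (n, k) erasure-coded layer:
    any k of the n descriptions (each of rate R) recover a rate-k*R
    Gaussian codebook; fewer leave the source unreconstructed.
    Returns D(m) for m = 0..n received descriptions."""
    profile = []
    for m in range(n + 1):
        if m >= k:
            profile.append(gaussian_rd_distortion(sigma2, k * per_description_rate))
        else:
            profile.append(sigma2)  # no useful reconstruction yet
    return profile

# Example: n = 3 descriptions at rate R = 1 bit each, unit-variance source.
# A k = 2 layer reaches distortion 2^(-4) once any two descriptions arrive.
print(mds_layer_profile(3, 2, 1.0))
```

Stacking several such layers with different k, and binning them jointly so that one encoding refines all coarse realizations at once, is the direction the construction in the paper takes beyond this baseline.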