The rate-distortion function for source coding with side information at the decoder (the “Wyner-Ziv problem”) is given in terms of an auxiliary random variable that forms a Markov chain with the source and the side information. This Markov-chain structure, typical of the solutions of multiterminal source coding problems, corresponds to a loss in coding rate relative to the conditional rate-distortion function, i.e., relative to the case where the encoder is fully informed. We show that for difference (or balanced) distortion measures, this loss is bounded by a universal constant, which is the minimax capacity of a suitable additive-noise channel. Furthermore, in the worst case, this loss equals the maximin redundancy over the rate-distortion function of the additive-noise “test” channel. For example, the loss in the Wyner-Ziv problem is less than 0.5 bit/sample in the squared-error distortion case, and less than 0.22 bit for a binary source with Hamming distortion. These results also have implications for universal quantization with side information and for more general multiterminal source coding problems.
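For reference, the quantities behind the stated bounds can be sketched as follows. These are the standard definitions from the Wyner-Ziv formulation, written in notation not taken from the abstract itself:

```latex
% Conditional (encoder-informed) rate-distortion function:
R_{X|Y}(D) \;=\; \min_{p(\hat{x}\mid x,y)\;:\;\mathbb{E}\,d(X,\hat{X})\le D}
               I(X;\hat{X}\mid Y)

% Wyner-Ziv rate-distortion function: U is the auxiliary random variable,
% with U -- X -- Y forming a Markov chain, and the decoder reconstructing
% via a function \hat{X} = f(U,Y):
R_{WZ}(D) \;=\; \min_{p(u\mid x),\,f\;:\;\mathbb{E}\,d(X,f(U,Y))\le D}
               I(X;U\mid Y)

% The rate loss discussed above, bounded by a universal constant:
L(D) \;=\; R_{WZ}(D) \,-\, R_{X|Y}(D) \;\le\; C^{*}
% with C^* = 0.5 bit/sample for squared-error distortion, and
% C^* \approx 0.22 bit for a binary source with Hamming distortion.
```

Because of the Markov chain $U$ -- $X$ -- $Y$, the objective $I(X;U\mid Y)$ equals $I(X;U)-I(Y;U)$, which is the form in which the Wyner-Ziv function is often stated.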