The rate loss in the Wyner-Ziv problem

  • Authors:
  • R. Zamir

  • Affiliations:
  • Sch. of Electr. Eng., Cornell Univ., Ithaca, NY

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 1996

Abstract

The rate-distortion function for source coding with side information at the decoder (the “Wyner-Ziv problem”) is given in terms of an auxiliary random variable, which forms a Markov chain with the source and the side information. This Markov chain structure, typical of the solutions of multiterminal source coding problems, corresponds to a loss in coding rate with respect to the conditional rate-distortion function, i.e., to the case where the encoder is fully informed. We show that for difference (or balanced) distortion measures, this loss is bounded by a universal constant, which is the minimax capacity of a suitable additive-noise channel. Furthermore, in the worst case, this loss is equal to the maximin redundancy over the rate-distortion function of the additive noise “test” channel. For example, the loss in the Wyner-Ziv problem is less than 0.5 bit/sample in the squared-error distortion case, and it is less than 0.22 bit for a binary source with Hamming distance. These results also have implications for universal quantization with side information and for more general multiterminal source coding problems.
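For concreteness, here is a minimal sketch of the quantities the abstract compares. The definitions of the two rate-distortion functions are the standard Wyner-Ziv and conditional formulations and are assumptions of this sketch (the symbols X, Y, Z, d, and D are not taken from the page); only the bounds in the last line are stated in the abstract.

\[
R_{\mathrm{WZ}}(D) \;=\; \min_{p(z\mid x),\, f}\; \bigl[\, I(X;Z) - I(Y;Z) \,\bigr]
\quad \text{s.t.}\; Z \to X \to Y \;\text{is a Markov chain and}\; \mathbb{E}\,d\bigl(X, f(Y,Z)\bigr) \le D,
\]
\[
R_{X\mid Y}(D) \;=\; \min_{p(\hat{x}\mid x,y):\, \mathbb{E}\,d(X,\hat{X}) \le D}\; I(X;\hat{X}\mid Y),
\]
\[
L(D) \;=\; R_{\mathrm{WZ}}(D) - R_{X\mid Y}(D) \;\le\; C^{*},
\]

where, per the abstract, \(C^{*}\) is the minimax capacity of a suitable additive-noise channel; in particular, \(L(D) < 0.5\) bit/sample under squared-error distortion and \(L(D) < 0.22\) bit for a binary source with Hamming distortion.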