Pointwise redundancy in lossy data compression and universal lossy data compression

  • Authors: I. Kontoyiannis
  • Affiliation: Dept. of Statistics, Purdue University, West Lafayette, IN
  • Venue: IEEE Transactions on Information Theory
  • Year: 2000


Abstract

We characterize the achievable pointwise redundancy rates for lossy data compression at a fixed distortion level. “Pointwise redundancy” refers to the difference between the description length achieved by an nth-order block code and the optimal nR(D) bits. For memoryless sources, we show that the best achievable redundancy rate is of order O(√n) in probability. This follows from a second-order refinement of the classical source coding theorem, in the form of a “one-sided central limit theorem”. Moreover, we show that, along (almost) any source realization, the description lengths of any sequence of block codes operating at distortion level D exceed nR(D) by at least C√(n log log n), infinitely often. Corresponding direct coding theorems are also given, showing that these rates are essentially achievable. The above rates are in sharp contrast to the expected redundancy rates of order O(log n) reported by various authors. Our approach is based on showing that the compression performance of an arbitrary sequence of codes is essentially bounded below by the performance of Shannon's random code. We obtain partial generalizations of the above results for arbitrary sources with memory, and we prove lossy analogs of “Barron's Lemma” (Barron 1985).
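The two pointwise bounds described in the abstract can be sketched as follows; here ℓ_n denotes the description length (in bits) of an nth-order block code operating at distortion level D, and the notation is an assumption made for illustration, not taken verbatim from the paper.

```latex
% Converse (pointwise lower bound, along almost every source realization):
% the redundancy exceeds C\sqrt{n \log\log n} infinitely often, for some C > 0.
\ell_n(X_1^n) - nR(D) \;\ge\; C\sqrt{n \log\log n} \qquad \text{infinitely often, a.s.}

% Achievability (memoryless sources): there exist block codes whose
% pointwise redundancy grows no faster than order \sqrt{n} in probability.
\ell_n(X_1^n) - nR(D) \;=\; O(\sqrt{n}) \qquad \text{in probability.}
```

Both statements refine the classical source coding theorem, which only asserts ℓ_n(X_1^n)/n → R(D); they quantify the second-order fluctuation around nR(D), in contrast to the O(log n) behavior of the expected redundancy.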