Complexity-compression tradeoffs in lossy compression via efficient random codebooks and databases

  • Authors:
  • C. Gioran; I. Kontoyiannis

  • Affiliations:
  • Department of Informatics, Athens University of Economics and Business, Athens, Greece (both authors)

  • Venue:
  • Problems of Information Transmission
  • Year:
  • 2012

Abstract

The compression-complexity trade-off of lossy compression algorithms that are based on a random codebook or a random database is examined. Motivated, in part, by recent results of Gupta-Verdú-Weissman (GVW) and their underlying connections with the pattern-matching scheme of Kontoyiannis' lossy Lempel-Ziv algorithm, we introduce a nonuniversal version of the lossy Lempel-Ziv method (termed LLZ). The optimality of LLZ for memoryless sources is established, and its performance is compared to that of the GVW divide-and-conquer approach. Experimental results indicate that the GVW approach often yields better compression than LLZ, but at the price of much higher memory requirements. To combine the advantages of both, we introduce a hybrid algorithm (HYB) that utilizes both the divide-and-conquer idea of GVW and the single-database structure of LLZ. It is proved that HYB has exactly the same rate-distortion performance and implementation complexity as GVW, while, like LLZ, it requires less memory, by a factor that may become unbounded, depending on the choice of the relevant design parameters. Experimental results are also presented, illustrating the performance of all three methods on data generated by simple discrete memoryless sources. In particular, the HYB algorithm is shown to outperform existing schemes for the compression of some simple discrete sources with respect to the Hamming distortion criterion.
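
To make the random-codebook/random-database idea concrete, the sketch below shows a minimal, textbook-style random-codebook encoder for a binary source under Hamming distortion. It is illustrative only and is not the paper's LLZ, GVW, or HYB algorithm; the function name random_codebook_encode, the Bernoulli(1/2) codebook distribution, and the parameter choices are assumptions made for the example.

    import numpy as np

    def random_codebook_encode(x, rate, rng=None):
        """Toy random-codebook lossy encoder for a binary block x under
        per-symbol Hamming distortion. Draws roughly 2^(n*rate) i.i.d.
        Bernoulli(1/2) codewords and returns the index of the codeword
        closest to x, the codeword itself, and the distortion achieved.
        Illustrative sketch only; not the LLZ/GVW/HYB schemes."""
        rng = np.random.default_rng() if rng is None else rng
        x = np.asarray(x)
        n = len(x)
        num_codewords = int(round(2 ** (n * rate)))       # codebook size grows exponentially in n
        codebook = rng.integers(0, 2, size=(num_codewords, n))
        dists = np.mean(codebook != x, axis=1)            # per-symbol Hamming distortion to each codeword
        idx = int(np.argmin(dists))                       # best match in the random database
        return idx, codebook[idx], float(dists[idx])

    # Usage: compress a length-16 block from a Bernoulli(0.3) source at rate 0.5 bits/symbol.
    rng = np.random.default_rng(0)
    x = (rng.random(16) < 0.3).astype(int)
    idx, x_hat, d = random_codebook_encode(x, rate=0.5, rng=rng)
    print(f"index={idx} (~8 bits), per-symbol Hamming distortion={d:.3f}")

The encoder's output is just the index, about n*rate bits, but the search is over an exponentially large random database; this storage/search cost is the kind of complexity that the divide-and-conquer and single-database constructions discussed in the abstract are designed to reduce.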