Probability estimation in arithmetic and adaptive-Huffman entropy coders

  • Authors:
  • D. L. Duttweiler; C. Chamzas

  • Affiliations:
  • AT&T Bell Labs., Holmdel, NJ

  • Venue:
  • IEEE Transactions on Image Processing
  • Year:
  • 1995

Abstract

Entropy coders, such as Huffman and arithmetic coders, achieve compression by exploiting nonuniformity in the probabilities under which a random variable to be coded takes on its possible values. Practical realizations generally require running adaptive estimates of these probabilities. An analysis of the relationship between estimation quality and the resulting coding efficiency suggests a particular scheme, dubbed scaled-count, for obtaining such estimates. It can optimally balance estimation accuracy against the need for rapid response to changing underlying statistics. When the symbols being coded are from a binary alphabet, simple hardware and software implementations requiring almost no computation are possible. A scaled-count adaptive probability estimator of the type described in this paper is used in the arithmetic coder of the JBIG and JPEG image coding standards.
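
The following is a minimal sketch of the scaled-count idea for a binary alphabet as the abstract describes it: keep counts of observed symbols, estimate the probability from their ratio, and periodically halve the counts so old observations are down-weighted and the estimate can track changing statistics. The threshold value, the pseudo-count initialization, and the class interface below are illustrative assumptions, not the actual parameters or state machine of the JBIG/JPEG arithmetic coders.

```python
class ScaledCountEstimator:
    """Illustrative scaled-count probability estimator for binary symbols."""

    def __init__(self, threshold=64):
        # One pseudo-count per symbol so neither probability is ever zero.
        self.c0 = 1
        self.c1 = 1
        self.threshold = threshold  # total count at which counts are rescaled

    def prob_one(self):
        # Current estimate of P(symbol == 1).
        return self.c1 / (self.c0 + self.c1)

    def update(self, symbol):
        # Count the observed symbol.
        if symbol:
            self.c1 += 1
        else:
            self.c0 += 1
        # Rescaling: halving both counts keeps the ratio (and hence the
        # estimate) roughly unchanged while reducing the weight of older
        # observations, trading estimation accuracy for responsiveness to
        # nonstationary statistics.
        if self.c0 + self.c1 >= self.threshold:
            self.c0 = max(1, self.c0 // 2)
            self.c1 = max(1, self.c1 // 2)


# Usage sketch: the estimate for each symbol is read before coding it,
# then the estimator is updated with the actual outcome.
est = ScaledCountEstimator()
for bit in [1, 1, 0, 1, 1, 1, 0, 1]:
    p = est.prob_one()   # probability the entropy coder would use for this bit
    est.update(bit)
```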