An application of informational divergence to Huffman codes

  • Authors:
  • G. Longo; G. Galasso

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2006

Abstract

A classification of all probability distributions over the finite alphabet of an information source is given, where the classes are the sets of distributions sharing the same binary Huffman code. Such a classification is useful in noiseless coding when the distribution of the finite memoryless source varies over time or becomes known only gradually: instead of applying the Huffman algorithm to each new estimate of the distribution, one runs a simple test based on the classification, and if the test is passed, the previously used Huffman code is also optimal for the new distribution.
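
The abstract does not spell out the authors' divergence-based test, so the sketch below uses a generic stand-in rather than their construction: after reweighting the existing code tree with the new probability estimates, it checks Gallager's sibling property, a standard necessary and sufficient condition for a binary code tree to be a Huffman tree. All function names and the example alphabet are hypothetical.

```python
import heapq
from itertools import count

def build_huffman_tree(probs):
    """Build a binary Huffman tree for a dict of symbol probabilities.
    Each node is a dict with 'weight' and either 'symbol' (leaf) or 'children'."""
    tick = count()  # tie-breaker so the heap never compares dict nodes
    heap = [(p, next(tick), {"weight": p, "symbol": s}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, next(tick),
                              {"weight": w1 + w2, "children": (left, right)}))
    return heap[0][2]

def reweight(node, new_probs):
    """Recompute every node weight of an existing code tree under a new distribution."""
    if "symbol" in node:
        node["weight"] = new_probs[node["symbol"]]
    else:
        node["weight"] = sum(reweight(child, new_probs) for child in node["children"])
    return node["weight"]

def sibling_pairs(node, pairs):
    """Collect the (smaller, larger) weights of every sibling pair in the tree."""
    if "children" in node:
        a, b = node["children"]
        pairs.append((min(a["weight"], b["weight"]), max(a["weight"], b["weight"])))
        sibling_pairs(a, pairs)
        sibling_pairs(b, pairs)
    return pairs

def still_optimal(root, new_probs):
    """Return True if the existing tree keeps the sibling property under new_probs,
    i.e. the old Huffman code is still optimal and no rebuild is needed."""
    reweight(root, new_probs)
    pairs = sorted(sibling_pairs(root, []), key=lambda p: (p[1], p[0]), reverse=True)
    # Sibling pairs must be listable in nonincreasing weight order, pair by pair.
    return all(pairs[i][0] >= pairs[i + 1][1] for i in range(len(pairs) - 1))

# Hypothetical example: a small perturbation of the estimate keeps the old code optimal,
# while a drastically different distribution fails the test.
tree = build_huffman_tree({"a": 0.40, "b": 0.30, "c": 0.20, "d": 0.10})
print(still_optimal(tree, {"a": 0.38, "b": 0.32, "c": 0.19, "d": 0.11}))  # True
print(still_optimal(tree, {"a": 0.25, "b": 0.25, "c": 0.25, "d": 0.25}))  # False
```

The test touches each tree node once, so it is linear in the alphabet size and avoids re-running the Huffman construction; a production version would compare the floating-point weights with a small tolerance rather than exactly, as done here for brevity.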