Zipf's law and entropy (Corresp.)

  • Authors: D. Yavuz
  • Venue: IEEE Transactions on Information Theory
  • Year: 1974

Abstract

The estimate of the entropy of a language obtained by assuming that the word probabilities follow Zipf's law is discussed briefly. Previous numerical results [3] on the vocabulary size implied by Zipf's law and on the entropy per word are corrected: the vocabulary size should be 12 366 words (not 8727), and the entropy per word 9.72 bits (not 11.82 bits).
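
Both corrected figures follow from the normalization of Zipf's law alone: with word probabilities p_k = A/k and Zipf's constant A = 0.1 (the form used in Shannon's original estimate), the vocabulary must be cut off at the largest N for which the probabilities still sum to at most one, and the entropy is then evaluated over that truncated vocabulary. The following is a minimal sketch of the arithmetic under that assumption, not code from the correspondence itself:

```python
import math

A = 0.1  # Zipf's constant as in Shannon's estimate: p_k = A / k

# Vocabulary size: the largest N for which the Zipf probabilities
# still sum to at most 1, i.e. A * H_N <= 1 (H_N = N-th harmonic number).
total, N = 0.0, 0
while total + A / (N + 1) <= 1.0:
    N += 1
    total += A / N

# Entropy per word in bits over the truncated vocabulary:
# H = -sum_{k=1}^{N} p_k * log2(p_k)
H = -sum(A / k * math.log2(A / k) for k in range(1, N + 1))

print(N)            # 12366
print(round(H, 2))  # 9.72
```

Run as written, this prints a vocabulary size of 12 366 words and an entropy of about 9.72 bits per word, matching the corrected values above.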