Entropy computations via analytic depoissonization

  • Authors:
  • P. Jacquet; W. Szpankowski

  • Affiliations:
  • Inst. Nat. de Recherche en Inf. et Autom., Le Chesnay; -

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2006

Abstract

We investigate a basic question of information theory, namely, the evaluation of the Shannon entropy, and of the more general Rényi (1961) entropy, for some discrete distributions (e.g., binomial, negative binomial, etc.). We aim at establishing analytic methods (i.e., those in which complex analysis plays a pivotal role) for such computations, which often yield estimates of unparalleled precision. The main analytic tool used here is analytic poissonization and depoissonization. We illustrate our approach with the entropy evaluation of the binomial distribution; that is, we prove that for the binomial($n$, $p$) distribution the Shannon entropy $h_n$ satisfies $h_n \approx \tfrac{1}{2}\ln n + \tfrac{1}{2} + \ln\sqrt{2\pi p(1-p)} + \sum_{k\ge 1} a_k n^{-k}$, where the $a_k$ are explicitly computable constants. Moreover, we argue that analytic methods (e.g., complex asymptotics such as Rice's method and singularity analysis, Mellin transforms, poissonization, and depoissonization) can offer new tools for information theory, especially for studying second-order asymptotics (e.g., redundancy). In fact, there has been a resurgence of interest in, and a few successful applications of, analytic methods for a variety of problems of information theory; we therefore propose to name such investigations analytic information theory.
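
As a quick illustration of the expansion quoted in the abstract, the sketch below (not from the paper; a minimal Python check, with entropy measured in nats) compares the exact Shannon entropy of Binomial(n, p) with the leading terms ½ ln n + ½ + ln √(2πp(1−p)); the remaining gap corresponds to the correction series and should shrink roughly like 1/n.

```python
import math

def binomial_entropy_exact(n: int, p: float) -> float:
    """Exact Shannon entropy (in nats) of Binomial(n, p), computed in log-space
    via lgamma to avoid overflow for large n. Assumes 0 < p < 1."""
    h = 0.0
    for k in range(n + 1):
        log_pk = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
                  + k * math.log(p) + (n - k) * math.log(1.0 - p))
        h -= math.exp(log_pk) * log_pk
    return h

def binomial_entropy_leading(n: int, p: float) -> float:
    """Leading terms of the expansion: (1/2) ln n + 1/2 + ln sqrt(2*pi*p*(1-p))."""
    return 0.5 * math.log(n) + 0.5 + math.log(math.sqrt(2.0 * math.pi * p * (1.0 - p)))

if __name__ == "__main__":
    p = 0.3
    for n in (10, 100, 1000, 10000):
        exact = binomial_entropy_exact(n, p)
        approx = binomial_entropy_leading(n, p)
        # The gap stands in for the sum_{k>=1} a_k n^{-k} tail of the expansion,
        # so it should decay roughly like 1/n as n grows.
        print(f"n={n:6d}  exact={exact:.6f}  leading={approx:.6f}  gap={exact - approx:.3e}")
```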