Monotonic convergence in an information-theoretic law of small numbers

  • Author: Yaming Yu
  • Affiliation: Department of Statistics, University of California, Irvine, CA
  • Venue: IEEE Transactions on Information Theory
  • Year: 2009

Abstract

An "entropy increasing to the maximum" result analogous to the entropic central limit theorem (Barron 1986; Artstein et al. 2004) is obtained in the discrete setting. This involves the thinning operation and a Poisson limit. Monotonic convergence in relative entropy is established for general discrete distributions, while monotonic increase of Shannon entropy is proved for the special class of ultra-log-concave distributions. Overall we extend the parallel between the information-theoretic central limit theorem and law of small numbers explored by Kontoyiannis et al. (2005) and Harremoës et al. (2007, 2008, 2009). Ingredients in the proofs include convexity, majorization, and stochastic orders.