Entropy and the law of small numbers

  • Authors:
  • I. Kontoyiannis; P. Harremoës; O. Johnson

  • Affiliations:
  • Dept. of Comput. Sci., Brown Univ., Providence, RI, USA

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2005

Abstract

Two new information-theoretic methods are introduced for establishing Poisson approximation inequalities. First, using only elementary information-theoretic techniques, it is shown that when $S_n = \sum_{i=1}^{n} X_i$ is the sum of the (possibly dependent) binary random variables $X_1, X_2, \ldots, X_n$, with $E(X_i) = p_i$ and $E(S_n) = \lambda$, then

$$D(P_{S_n} \,\|\, \mathrm{Po}(\lambda)) \;\le\; \sum_{i=1}^{n} p_i^2 \;+\; \Big[\sum_{i=1}^{n} H(X_i) - H(X_1, X_2, \ldots, X_n)\Big],$$

where $D(P_{S_n} \,\|\, \mathrm{Po}(\lambda))$ is the relative entropy between the distribution of $S_n$ and the Poisson($\lambda$) distribution. The first term in this bound measures the individual smallness of the $X_i$, and the second term measures their dependence. A general method is outlined for obtaining corresponding bounds when the distribution of a sum of general discrete random variables is approximated by an infinitely divisible distribution. Second, in the particular case when the $X_i$ are independent, the following sharper bound is established:

$$D(P_{S_n} \,\|\, \mathrm{Po}(\lambda)) \;\le\; \frac{1}{\lambda} \sum_{i=1}^{n} \frac{p_i^3}{1 - p_i},$$

and it is also generalized to the case when the $X_i$ are general integer-valued random variables. Its proof is based on the derivation of a subadditivity property for a new discrete version of the Fisher information, and it uses a recent logarithmic Sobolev inequality for the Poisson distribution.
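As a quick numerical illustration (not from the paper itself), the sketch below computes the exact distribution of $S_n$ for a few independent Bernoulli($p_i$) variables, evaluates $D(P_{S_n} \,\|\, \mathrm{Po}(\lambda))$ in nats, and compares it with both bounds. Under independence the entropy term in the first bound vanishes, so that bound reduces to $\sum_i p_i^2$. The specific $p_i$ values are arbitrary illustrative choices.

```python
import math

def poisson_pmf(k, lam):
    """Probability mass of Po(lam) at k."""
    return math.exp(-lam) * lam**k / math.factorial(k)

def sum_of_bernoullis_pmf(ps):
    """Exact pmf of S_n = sum of independent Bernoulli(p_i), via convolution."""
    dist = [1.0]  # P(S_0 = 0) = 1
    for p in ps:
        new = [0.0] * (len(dist) + 1)
        for k, q in enumerate(dist):
            new[k] += q * (1.0 - p)   # X_i = 0
            new[k + 1] += q * p       # X_i = 1
        dist = new
    return dist

def relative_entropy(p_dist, lam):
    """D(P_{S_n} || Po(lam)) in nats; zero-probability terms contribute nothing."""
    return sum(p * math.log(p / poisson_pmf(k, lam))
               for k, p in enumerate(p_dist) if p > 0)

ps = [0.05, 0.10, 0.02, 0.08]   # hypothetical p_i, chosen for illustration
lam = sum(ps)                   # E(S_n) = lambda

dist = sum_of_bernoullis_pmf(ps)
d = relative_entropy(dist, lam)
bound1 = sum(p**2 for p in ps)                      # entropy term is 0 under independence
bound2 = sum(p**3 / (1.0 - p) for p in ps) / lam    # sharper bound for independent X_i

print(f"D(P_Sn || Po(lam)) = {d:.3e}")
print(f"first bound        = {bound1:.3e}")
print(f"sharper bound      = {bound2:.3e}")
```

Since both inequalities are theorems in the independent case, the printed relative entropy falls below both bounds; for these $p_i$ the sharper bound is indeed the smaller of the two.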