Two new information-theoretic methods are introduced for establishing Poisson approximation inequalities. First, using only elementary information-theoretic techniques, it is shown that, when $S_n = \sum_{i=1}^n X_i$ is the sum of the (possibly dependent) binary random variables $X_1, X_2, \ldots, X_n$, with $E(X_i) = p_i$ and $E(S_n) = \lambda$, then

$$D\big(P_{S_n} \,\big\|\, \mathrm{Po}(\lambda)\big) \;\le\; \sum_{i=1}^n p_i^2 \;+\; \Big[\sum_{i=1}^n H(X_i) - H(X_1, X_2, \ldots, X_n)\Big],$$

where $D(P_{S_n} \,\|\, \mathrm{Po}(\lambda))$ is the relative entropy between the distribution of $S_n$ and the Poisson($\lambda$) distribution. The first term in this bound measures the individual smallness of the $X_i$, and the second term measures their dependence. A general method is outlined for obtaining corresponding bounds when approximating the distribution of a sum of general discrete random variables by an infinitely divisible distribution. Second, in the particular case when the $X_i$ are independent, the following sharper bound is established:

$$D\big(P_{S_n} \,\big\|\, \mathrm{Po}(\lambda)\big) \;\le\; \frac{1}{\lambda} \sum_{i=1}^n \frac{p_i^3}{1 - p_i},$$

and it is also generalized to the case when the $X_i$ are general integer-valued random variables. Its proof is based on the derivation of a subadditivity property for a new discrete version of the Fisher information, and it uses a recent logarithmic Sobolev inequality for the Poisson distribution.
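As a numerical illustration (not from the paper), the two bounds can be checked directly for a sum of independent Bernoulli variables: the exact distribution of $S_n$ is obtained by convolution, the relative entropy to $\mathrm{Po}(\lambda)$ is computed term by term, and both bounds are evaluated. For independent $X_i$ the bracketed entropy gap in the first bound vanishes, so it reduces to $\sum_i p_i^2$. The probabilities `ps` below are arbitrary example values.

```python
import math

def bernoulli_sum_pmf(ps):
    """PMF of S_n = X_1 + ... + X_n for independent X_i ~ Bernoulli(p_i),
    built by repeated convolution."""
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += q * (1 - p)      # X_i = 0
            new[k + 1] += q * p        # X_i = 1
        pmf = new
    return pmf

def poisson_pmf(lam, k):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def relative_entropy_to_poisson(ps):
    """D(P_{S_n} || Po(lambda)) with lambda = sum(ps); the sum runs over
    the support of S_n, where P_{S_n} is nonzero."""
    lam = sum(ps)
    pmf = bernoulli_sum_pmf(ps)
    return sum(q * math.log(q / poisson_pmf(lam, k))
               for k, q in enumerate(pmf) if q > 0)

ps = [0.1, 0.05, 0.2, 0.15]            # arbitrary example probabilities
lam = sum(ps)
d = relative_entropy_to_poisson(ps)
bound1 = sum(p ** 2 for p in ps)       # first bound (entropy gap = 0 here)
bound2 = sum(p ** 3 / (1 - p) for p in ps) / lam   # sharper bound

print(f"D = {d:.6f}, bound1 = {bound1:.6f}, bound2 = {bound2:.6f}")
```

Both inequalities should hold, with the second bound noticeably tighter when all $p_i$ are small, consistent with its extra factor of roughly $p_i/\lambda$ per term.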