On the entropy of compound distributions on nonnegative integers
IEEE Transactions on Information Theory
Concavity of entropy under thinning
Proceedings of the 2009 IEEE International Symposium on Information Theory (ISIT 2009), Volume 1
Monotonic convergence in an information-theoretic law of small numbers
IEEE Transactions on Information Theory
Sharp bounds on the entropy of the Poisson law and related quantities
IEEE Transactions on Information Theory
It is shown that the Binomial(n,p) distribution maximizes the entropy in the class of ultra-log-concave distributions of order n with fixed mean np. This result, which extends a theorem of Shepp and Olkin (1981), is analogous to that of Johnson (2007), who considers the Poisson case. The proof constructs a Markov chain whose limiting distribution is Binomial(n,p) and shows that the entropy never decreases along the iterations of this Markov chain.
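The maximum-entropy claim can be checked numerically for a small case. The sketch below (not from the paper; the choice of n = 4, p = 0.5 and the hypergeometric comparison are illustrative assumptions) compares the entropy of Binomial(4, 0.5) with that of a hypergeometric distribution having the same mean np = 2 — the hypergeometric law is a standard example of an ultra-log-concave distribution of order n — and confirms the binomial entropy is larger.

```python
# Illustrative check: Binomial(n, p) should have higher entropy than another
# ultra-log-concave distribution of order n with the same mean np.
from math import comb, log

def entropy(pmf):
    """Shannon entropy in nats of a probability vector."""
    return -sum(p * log(p) for p in pmf if p > 0)

n, p = 4, 0.5  # mean np = 2 (illustrative choice)

# Binomial(4, 0.5) pmf on {0, ..., 4}
binom_pmf = [comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]

# Hypergeometric pmf (population N=8, K=4 successes, n=4 draws): mean 2,
# and ultra-log-concave of order n
N, K = 8, 4
hyper_pmf = [comb(K, k) * comb(N - K, n - k) / comb(N, n) for k in range(n + 1)]

print(entropy(binom_pmf))  # ≈ 1.4075 nats
print(entropy(hyper_pmf))  # ≈ 1.1381 nats, smaller, as the theorem predicts
```

The same comparison holds for any ultra-log-concave distribution of order n with mean np; the hypergeometric law is just a convenient concrete instance.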