Sufficient conditions are developed, under which the compound Poisson distribution has maximal entropy within a natural class of probability measures on the nonnegative integers. Recently, one of the authors [O.T. Johnson, Log-concavity and the maximum entropy property of the Poisson distribution, Stochastic Process. Appl., 117(6) (2007) 791-802] used a semigroup approach to show that the Poisson has maximal entropy among all ultra-log-concave distributions with fixed mean. We show via a non-trivial extension of this semigroup approach that the natural analog of the Poisson maximum entropy property remains valid if the compound Poisson distributions under consideration are log-concave, but that it fails in general. A parallel maximum entropy result is established for the family of compound binomial measures. Sufficient conditions for compound distributions to be log-concave are discussed and applications to combinatorics are examined; new bounds are derived on the entropy of the cardinality of a random independent set in a claw-free graph, and a connection is drawn to Mason's conjecture for matroids. The present results are primarily motivated by the desire to provide an information-theoretic foundation for compound Poisson approximation and associated limit theorems, analogous to the corresponding developments for the central limit theorem and for Poisson approximation. Our results also demonstrate new links between some probabilistic methods and the combinatorial notions of log-concavity and ultra-log-concavity, and they add to the growing body of work exploring the applications of maximum entropy characterizations to problems in discrete mathematics.
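The maximum entropy property described above can be illustrated numerically. The sketch below (an illustration of the cited 2007 result, not of this paper's compound-Poisson extension) checks that a binomial distribution is ultra-log-concave, using the standard criterion that a sequence (p_k) is ultra-log-concave iff k·p_k² ≥ (k+1)·p_{k+1}·p_{k−1} for all k ≥ 1, and that the Poisson distribution with the same mean has larger Shannon entropy. All function names here are local helpers, not part of any referenced work.

```python
import math

def binom_pmf(n, p, k):
    # Binomial(n, p) probability mass at k
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(lam, k):
    # Poisson(lam) probability mass at k
    return math.exp(-lam) * lam**k / math.factorial(k)

def entropy(pmf):
    # Shannon entropy in nats of a probability mass function given as a list
    return -sum(q * math.log(q) for q in pmf if q > 0)

def is_ultra_log_concave(pmf):
    # (p_k) is ULC iff k * p_k^2 >= (k+1) * p_{k+1} * p_{k-1} for all k >= 1;
    # a tiny tolerance absorbs floating-point error (Poisson attains equality)
    return all(k * pmf[k]**2 >= (k + 1) * pmf[k + 1] * pmf[k - 1] - 1e-15
               for k in range(1, len(pmf) - 1))

lam, n = 2.0, 10
binom = [binom_pmf(n, lam / n, k) for k in range(n + 1)]   # mean n*p = lam
pois = [poisson_pmf(lam, k) for k in range(60)]            # truncated; tail is negligible

assert is_ultra_log_concave(binom)
h_binom, h_pois = entropy(binom), entropy(pois)
print(h_binom, h_pois)
```

Consistent with the maximum entropy characterization, `h_pois` exceeds `h_binom`: the binomial is ultra-log-concave with mean 2, and among such distributions the Poisson(2) maximizes entropy.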