Discrete Applied Mathematics
Investigates the maximization of the differential entropy h(X+Y) of arbitrarily dependent random variables X and Y under the constraint that both have the same fixed marginal density f. We show that max[h(X+Y)] = h(2X) if and only if f is log-concave, with the maximum achieved at X = Y. If f is not log-concave, the maximum is strictly greater than h(2X). For example, identically distributed Gaussian random variables have log-concave densities and therefore satisfy max[h(X+Y)] = h(2X), attained at X = Y. More general inequalities in this direction should lead to capacity bounds for additive noise channels with feedback.
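The Gaussian example can be checked numerically. The sketch below (a restriction to jointly Gaussian couplings; the function names are ours, not from the paper) uses the closed form h = ½ log(2πe·Var): since Var(X+Y) = 2σ²(1+ρ) for correlation ρ, h(X+Y) is maximized at ρ = 1, i.e. X = Y, where it equals h(2X) = ½ log(2πe·4σ²).

```python
import math

def gaussian_entropy(var):
    # Differential entropy (in nats) of a Gaussian with variance `var`.
    return 0.5 * math.log(2 * math.pi * math.e * var)

def sum_entropy_jointly_gaussian(sigma2, rho):
    # h(X+Y) for jointly Gaussian X, Y with common variance sigma2 and
    # correlation rho: Var(X+Y) = 2*sigma2*(1 + rho).
    return gaussian_entropy(2 * sigma2 * (1 + rho))

sigma2 = 1.0
h_2x = gaussian_entropy(4 * sigma2)  # h(2X), since Var(2X) = 4*sigma2

# Within jointly Gaussian couplings, h(X+Y) peaks at rho = 1 (X = Y) ...
assert abs(sum_entropy_jointly_gaussian(sigma2, 1.0) - h_2x) < 1e-12
# ... and is strictly smaller for any rho < 1.
for rho in (-0.5, 0.0, 0.9):
    assert sum_entropy_jointly_gaussian(sigma2, rho) < h_2x
```

This only searches the jointly Gaussian family; the paper's result is stronger, covering all dependent couplings with the given marginals.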