On the maximum entropy of the sum of two dependent random variables

  • Authors:
  • T. M. Cover; Zhen Zhang

  • Affiliations:
  • Dept. of Electr. Eng., Stanford Univ., CA; -

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2006

Abstract

We investigate the maximization of the differential entropy h(X+Y) of arbitrarily dependent random variables X and Y under the constraint that X and Y have the same fixed marginal density f. We show that max[h(X+Y)]=h(2X) if and only if f is log-concave, with the maximum achieved when X=Y. If f is not log-concave, the maximum is strictly greater than h(2X). As an example, identically distributed Gaussian random variables have log-concave densities and therefore satisfy max[h(X+Y)]=h(2X), with the maximum attained at X=Y. More general inequalities in this direction should lead to capacity bounds for additive noise channels with feedback.