Generalized Entropy Power Inequalities and Monotonicity Properties of Information

  • Authors:
  • M. Madiman; A. Barron

  • Affiliations:
  • Yale University, New Haven; -

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2007

Abstract

New families of Fisher information and entropy power inequalities for sums of independent random variables are presented. These inequalities relate the information in the sum of n independent random variables to the information contained in sums over subsets of the random variables, for an arbitrary collection of subsets. As a consequence, a simple proof of the monotonicity of information in central limit theorems is obtained, both in the setting of independent and identically distributed (i.i.d.) summands and in the more general setting of independent summands with variance-standardized sums.
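For orientation, the statements below sketch in LaTeX the classical entropy power inequality, a representative "leave-one-out" subset inequality of the kind the abstract describes, and the resulting monotonicity of entropy along the central limit theorem; the notation ($h$ for differential entropy, $N(X) = e^{2h(X)/d}$ for entropy power) is assumed, not quoted from the paper.

```latex
% Entropy power of a random vector X in R^d:
%   N(X) := \exp\!\bigl(\tfrac{2}{d} h(X)\bigr).

% Classical entropy power inequality (Shannon--Stam),
% for independent X and Y in R^d:
N(X + Y) \;\ge\; N(X) + N(Y).

% A representative subset ("leave-one-out") inequality, sketched here
% under the assumption that X_1, \dots, X_n are independent:
N\!\Bigl(\sum_{i=1}^{n} X_i\Bigr)
  \;\ge\; \frac{1}{n-1} \sum_{i=1}^{n}
          N\!\Bigl(\sum_{j \neq i} X_j\Bigr).

% Monotonicity of entropy in the CLT: with X_1, X_2, \dots i.i.d.
% of finite variance and S_n := (X_1 + \cdots + X_n)/\sqrt{n},
h(S_n) \;\le\; h(S_{n+1}) \qquad \text{for all } n \ge 1,
% so h(S_n) increases toward the entropy of the Gaussian limit.
```

The leave-one-out inequality is the special case in which the collection of subsets consists of all subsets of size $n-1$; the general result stated in the abstract allows an arbitrary collection of subsets.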