A conditional entropy power inequality for dependent variables

  • Authors:
  • O. Johnson

  • Affiliations:
  • Centre for Mathematical Sciences, University of Cambridge, UK

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2006


Abstract

We provide a condition under which a version of Shannon's entropy power inequality will hold for dependent variables. We first provide a Fisher information inequality extending that found in the independent case. The key ingredients are a conditional expectation representation for the score function of a sum, and the de Bruijn identity which relates entropy and Fisher information.