On mutual information, likelihood ratios, and estimation error for the additive Gaussian channel

  • Authors: M. Zakai

  • Affiliation: Dept. of Electr. Eng., Technion-Israel Inst. of Technol., Haifa

  • Venue: IEEE Transactions on Information Theory

  • Year: 2005

Abstract

This paper considers the model of an arbitrarily distributed signal x observed in additive independent white Gaussian noise w, y = x + w. New relations between the minimal mean-square error of the noncausal estimator and the likelihood ratio between y and w are derived. This is followed by an extended version of a recently derived relation between the mutual information I(x;y) and the minimal mean-square error. These results are applied to derive infinite-dimensional versions of the Fisher information and the de Bruijn identity. A comparison between the causal and noncausal estimation errors yields a restricted form of the logarithmic Sobolev inequality. The derivation of the results is based on the Malliavin calculus.
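
For context, a minimal sketch of the scalar prototypes of the relations the abstract refers to: the mutual information-MMSE identity of Guo, Shamai, and Verdú and the classical de Bruijn identity. The notation snr, mmse(·), h(·), and J(·) is standard but not taken from the abstract; the paper's contribution is an infinite-dimensional (continuous-time white-noise) extension of relations of this type, not these scalar forms themselves.

  % Scalar I-MMSE relation: x arbitrary with E[x^2] < infinity, w standard Gaussian
  \[
    \frac{\mathrm{d}}{\mathrm{d}\,\mathrm{snr}}\,
      I\bigl(x;\sqrt{\mathrm{snr}}\,x + w\bigr)
    = \tfrac{1}{2}\,\mathrm{mmse}(\mathrm{snr}),
    \qquad
    \mathrm{mmse}(\mathrm{snr})
    = \mathbb{E}\!\left[\bigl(x - \mathbb{E}[\,x \mid \sqrt{\mathrm{snr}}\,x + w\,]\bigr)^{2}\right].
  \]
  % De Bruijn's identity, relating differential entropy h and Fisher information J
  % along the Gaussian perturbation x + sqrt(t) w:
  \[
    \frac{\partial}{\partial t}\, h\bigl(x + \sqrt{t}\,w\bigr)
    = \tfrac{1}{2}\, J\bigl(x + \sqrt{t}\,w\bigr).
  \]

As a sanity check of the first identity, for Gaussian x with variance P one has I(snr) = (1/2) log(1 + snr·P) and mmse(snr) = P/(1 + snr·P), so differentiating I in snr indeed gives half the MMSE.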