Mutual information and minimum mean-square error in Gaussian channels

  • Authors:
  • D. Guo; S. Shamai; S. Verdú

  • Affiliations:
  • Dept. of Electr. Eng., Princeton Univ., NJ, USA

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2005

Abstract

This paper deals with arbitrarily distributed finite-power input signals observed through an additive Gaussian noise channel. It presents a new formula connecting the input-output mutual information and the minimum mean-square error (MMSE) achievable by optimal estimation of the input given the output: the derivative of the mutual information (in nats) with respect to the signal-to-noise ratio (SNR) is equal to half the MMSE, regardless of the input statistics. This relationship holds for both scalar and vector signals, as well as for discrete-time and continuous-time noncausal MMSE estimation. This fundamental information-theoretic result has an unexpected consequence in continuous-time nonlinear estimation: for any input signal with finite power, the causal filtering MMSE at a given SNR equals the noncausal smoothing MMSE averaged over channel SNRs drawn uniformly between 0 and that SNR.
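
In the standard scalar setting Y = √snr · X + N with N ~ N(0, 1), the two relations described in the abstract can be written as follows (our transcription in conventional I-MMSE notation, not a verbatim display from the paper):

```latex
% Derivative relation: mutual information in nats vs. MMSE.
\frac{\mathrm{d}}{\mathrm{d}\,\mathsf{snr}}\, I(\mathsf{snr})
  = \frac{1}{2}\,\mathrm{mmse}(\mathsf{snr})

% Continuous-time consequence: causal (filtering) MMSE as the average
% of the noncausal (smoothing) MMSE over SNRs uniform on [0, snr].
\mathrm{cmmse}(\mathsf{snr})
  = \frac{1}{\mathsf{snr}} \int_{0}^{\mathsf{snr}} \mathrm{mmse}(\gamma)\,\mathrm{d}\gamma
```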
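
As a quick sanity check of the derivative relation, the following sketch (our own example, not code from the paper) evaluates I(snr) and mmse(snr) for an equiprobable binary input X = ±1, using the standard closed forms I(snr) = snr − E[ln cosh(snr + √snr·Z)] and mmse(snr) = 1 − E[tanh²(snr + √snr·Z)] with Z ~ N(0, 1), and compares a finite-difference derivative of I against mmse/2; all function names are our own.

```python
# Numerical check of dI/dsnr = mmse/2 for a binary (BPSK) input through
# Y = sqrt(snr) * X + N, N ~ N(0, 1). This is an illustrative sketch,
# not an implementation from the paper.
import numpy as np

nodes, weights = np.polynomial.hermite.hermgauss(100)

def gauss_expect(f):
    # E[f(Z)] for Z ~ N(0,1) via Gauss-Hermite quadrature:
    # E[f(Z)] ~= (1/sqrt(pi)) * sum_i w_i * f(sqrt(2) * x_i)
    return weights @ f(np.sqrt(2.0) * nodes) / np.sqrt(np.pi)

def log_cosh(u):
    # Numerically stable log(cosh(u)) = logaddexp(u, -u) - log(2).
    return np.logaddexp(u, -u) - np.log(2.0)

def mutual_info(snr):
    # I(snr) in nats for X = +/-1: snr - E[log cosh(snr + sqrt(snr) Z)].
    return snr - gauss_expect(lambda z: log_cosh(snr + np.sqrt(snr) * z))

def mmse(snr):
    # mmse(snr) for X = +/-1: 1 - E[tanh^2(snr + sqrt(snr) Z)],
    # since the conditional-mean estimator is tanh(sqrt(snr) * y).
    return 1.0 - gauss_expect(lambda z: np.tanh(snr + np.sqrt(snr) * z) ** 2)

snr, h = 1.0, 1e-4
lhs = (mutual_info(snr + h) - mutual_info(snr - h)) / (2.0 * h)  # dI/dsnr
rhs = 0.5 * mmse(snr)
print(f"dI/dsnr ~ {lhs:.6f}   mmse/2 = {rhs:.6f}")  # should agree closely
```

For a Gaussian input the same check reduces to the closed forms I(snr) = ½ ln(1 + snr) and mmse(snr) = 1/(1 + snr), where the identity dI/dsnr = mmse/2 holds by inspection.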