Mismatched estimation and relative entropy

  • Authors: Sergio Verdú

  • Affiliations: Department of Electrical Engineering, Princeton University, Princeton, NJ

  • Venue: ISIT'09: Proceedings of the 2009 IEEE International Symposium on Information Theory - Volume 2
  • Year: 2009


Abstract

A random variable with distribution P is observed in Gaussian noise and is estimated by a minimum mean-square error estimator that assumes that the distribution is Q. This paper shows that the integral over all signal-to-noise ratios of the excess mean-square estimation error incurred by the mismatched estimator is twice the relative entropy D(P∥Q). This representation of relative entropy can be generalized to non-real-valued random variables, and can be particularized to give a new general representation of mutual information in terms of conditional means. Inspired by the new representation, we also propose a definition of free relative entropy which fills a gap in, and is consistent with, the literature on free probability.
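To make the stated identity concrete: with observation Y = sqrt(snr)·X + N, N ~ N(0,1), and cmse_{P,Q}(snr) denoting the mean-square error of the conditional-mean estimator that assumes prior Q while X is actually drawn from P, the result reads ∫₀^∞ [cmse_{P,Q}(snr) − cmse_{P,P}(snr)] d(snr) = 2 D(P∥Q) (in nats). The following Python sketch is not from the paper; all parameter values and function names are illustrative. It checks the identity numerically for Gaussian P and Q, where both the matched and mismatched errors have closed forms.

    import numpy as np
    from scipy.integrate import quad

    # Illustrative parameters (hypothetical, not from the paper):
    # true distribution P = N(mu_p, sp^2), mismatched prior Q = N(mu_q, sq^2).
    mu_p, sp = 0.3, 1.0
    mu_q, sq = -0.5, 2.0

    def mse_matched(snr):
        # MMSE for X ~ N(mu_p, sp^2) observed as Y = sqrt(snr)*X + N(0,1).
        return sp**2 / (1.0 + snr * sp**2)

    def mse_mismatched(snr):
        # Mean-square error of the conditional-mean estimator that assumes Q
        # when X is actually drawn from P (closed form for Gaussian P and Q).
        denom = (1.0 + snr * sq**2) ** 2
        return (sp**2 + (mu_p - mu_q)**2 + snr * sq**4) / denom

    # Integrate the excess mean-square error over all signal-to-noise ratios.
    excess, _ = quad(lambda g: mse_mismatched(g) - mse_matched(g), 0.0, np.inf)

    # Relative entropy D(P || Q) between the two Gaussians, in nats.
    d_pq = np.log(sq / sp) + (sp**2 + (mu_p - mu_q)**2) / (2.0 * sq**2) - 0.5

    print("integral of excess MSE:", excess)
    print("2 * D(P||Q):          ", 2.0 * d_pq)

The two printed values agree up to numerical integration error, which is the content of the identity for this Gaussian special case.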