Relative entropy and score function: new information-estimation relationships through arbitrary additive perturbation

  • Authors:
  • Dongning Guo

  • Affiliations:
  • Department of Electrical Engineering & Computer Science, Northwestern University, Evanston, IL

  • Venue:
  • ISIT '09: Proceedings of the 2009 IEEE International Symposium on Information Theory - Volume 2
  • Year:
  • 2009

Abstract

This paper establishes new information-estimation relationships pertaining to models with additive noise of arbitrary distribution. In particular, we study the change in the relative entropy between two probability measures when both are perturbed by a small amount of the same additive noise. It is shown that the rate of this change with respect to the energy of the perturbation can be expressed in terms of the mean squared difference of the score functions of the two distributions and, rather surprisingly, is otherwise unrelated to the distribution of the perturbation. The result holds for the classical relative entropy (or Kullback-Leibler distance), as well as two of its generalizations: Rényi's relative entropy and the f-divergence. The result generalizes a recent relationship between the relative entropy and mean squared errors in Gaussian noise models, which in turn supersedes many earlier information-estimation relationships. A generalization of the de Bruijn identity to non-Gaussian models can also be regarded as a consequence of this new result.
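
As a reading aid, the display below sketches the form such a relationship typically takes. The notation (X distributed as P with density p, Y as Q with density q, perturbation sqrt(delta)*W with zero mean and unit variance) and the factor 1/2 are assumptions made here for illustration, not quoted from the paper, which states the precise regularity conditions and normalization.

% Hedged sketch of the relationship described in the abstract; the notation
% and the 1/2 normalization are illustrative assumptions, not quotations.
\[
  \left.\frac{\mathrm{d}}{\mathrm{d}\delta}\,
  D\!\left(P_{X+\sqrt{\delta}\,W}\,\middle\|\,Q_{Y+\sqrt{\delta}\,W}\right)
  \right|_{\delta=0^{+}}
  \;=\; -\frac{1}{2}\,
  \mathbb{E}\!\left[\bigl(\rho_P(X)-\rho_Q(X)\bigr)^{2}\right],
  \qquad
  \rho_P=\frac{p'}{p},\quad \rho_Q=\frac{q'}{q},
\]
% where X ~ P (density p), Q has density q, and W is an independent
% perturbation of arbitrary distribution with E[W] = 0 and E[W^2] = 1.
% The right-hand side depends only on the score functions of P and Q,
% which is the sense in which the rate is otherwise unrelated to the
% distribution of the perturbation.
% For comparison, the classical de Bruijn identity for a Gaussian
% perturbation Z ~ N(0,1) reads
%   d/dt h(X + sqrt(t) Z) = (1/2) J(X + sqrt(t) Z),
% linking differential entropy h to Fisher information J; the abstract's
% final claim is that a non-Gaussian generalization of this type of
% identity follows from the new result.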