Gradient of mutual information in linear vector Gaussian channels

  • Authors:
  • D. P. Palomar; S. Verdú

  • Affiliations:
  • Dept. of Electrical Engineering, Princeton University, NJ, USA

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2006

Abstract

This paper considers a general linear vector Gaussian channel with arbitrary signaling and pursues two closely related goals: i) closed-form expressions for the gradient of the mutual information with respect to arbitrary parameters of the system, and ii) fundamental connections between information theory and estimation theory. Generalizing the fundamental relationship recently unveiled by Guo, Shamai, and Verdú, we show that the gradient of the mutual information with respect to the channel matrix is equal to the product of the channel matrix and the error covariance matrix of the best estimate of the input given the output. Gradients and derivatives with respect to other parameters are then found via the differentiation chain rule.
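The central identity stated above can be checked numerically in the Gaussian-input special case, where the MMSE estimate is linear and both the mutual information and the error covariance have closed forms. The sketch below (an illustration, not the paper's derivation) assumes a channel y = Hx + n with identity noise covariance, a Gaussian input with covariance Sigma, and mutual information measured in nats; it compares H·E, with E the MMSE error covariance, against a finite-difference gradient of I(x; y) with respect to the entries of H.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 3, 4  # illustrative input/output dimensions

H = rng.standard_normal((n_out, n_in))          # channel matrix
B = rng.standard_normal((n_in, n_in))
Sigma = B @ B.T + n_in * np.eye(n_in)           # input covariance (positive definite)

def mutual_info(H):
    # I(x; Hx + n) = (1/2) log det(I + H Sigma H^T) in nats,
    # for Gaussian x ~ N(0, Sigma) and noise n ~ N(0, I).
    A = np.eye(n_out) + H @ Sigma @ H.T
    sign, logdet = np.linalg.slogdet(A)
    return 0.5 * logdet

# MMSE error covariance E of the best estimate of x given y
# (closed form for the Gaussian-input case).
A = np.eye(n_out) + H @ Sigma @ H.T
E = Sigma - Sigma @ H.T @ np.linalg.solve(A, H @ Sigma)

# Claimed gradient: the channel matrix times the error covariance.
grad_theory = H @ E

# Finite-difference gradient of the mutual information w.r.t. each entry of H.
eps = 1e-6
grad_fd = np.zeros_like(H)
for i in range(n_out):
    for j in range(n_in):
        Hp = H.copy(); Hp[i, j] += eps
        Hm = H.copy(); Hm[i, j] -= eps
        grad_fd[i, j] = (mutual_info(Hp) - mutual_info(Hm)) / (2 * eps)

print(np.max(np.abs(grad_theory - grad_fd)))  # agreement up to finite-difference error
```

For non-Gaussian inputs the same identity holds with E replaced by the (generally nonlinear) conditional-mean error covariance, but there is no closed form to verify against this directly.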