A random variable with distribution P is observed in Gaussian noise and is estimated by a minimum mean-square estimator that assumes the distribution is Q. This paper shows that the integral over all signal-to-noise ratios of the excess mean-square estimation error incurred by the mismatched estimator is twice the relative entropy D(P∥Q). This representation of relative entropy can be generalized to non-real-valued random variables, and can be particularized to give a new general representation of mutual information in terms of conditional means. Inspired by the new representation, we also propose a definition of free relative entropy which fills a gap in, and is consistent with, the literature on free probability.
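The identity can be checked numerically in the one case where everything is closed form: Gaussian P and Q. For Y = √snr·X + N with N ~ N(0,1), both the matched and mismatched conditional-mean estimators are linear, so the excess mean-square error at each snr has an explicit expression, and its integral over snr should equal 2·D(P∥Q). The sketch below (the variance values and function names are illustrative, not from the paper) verifies this:

```python
import numpy as np
from scipy.integrate import quad

sP2, sQ2 = 2.0, 1.0  # variances of the true P and mismatched Q (illustrative)

def excess_mse(s):
    # Y = sqrt(s)*X + N, N ~ N(0,1). The estimator assuming X ~ N(0, sQ2)
    # is linear: a(s)*Y with a(s) = sqrt(s)*sQ2/(1 + s*sQ2). Its MSE under
    # the true X ~ N(0, sP2), minus the matched MMSE sP2/(1 + s*sP2):
    mse_mismatched = (sP2 + s * sQ2**2) / (1 + s * sQ2)**2
    mse_matched = sP2 / (1 + s * sP2)
    return mse_mismatched - mse_matched

integral, _ = quad(excess_mse, 0, np.inf)
# D(N(0,sP2) || N(0,sQ2)) in nats:
kl = 0.5 * (sP2 / sQ2 - 1 - np.log(sP2 / sQ2))
print(integral, 2 * kl)  # the two values should agree
```

Here the integral evaluates analytically to sP2/sQ2 − 1 − ln(sP2/sQ2), which is exactly twice the Gaussian relative entropy, matching the paper's claim in this special case.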