On the entropy rate of hidden Markov processes observed through arbitrary memoryless channels
IEEE Transactions on Information Theory
Mismatched estimation and relative entropy
ISIT'09 Proceedings of the 2009 IEEE international conference on Symposium on Information Theory - Volume 2
Mismatched estimation and relative entropy
IEEE Transactions on Information Theory
The relationship between causal and noncausal mismatched estimation in continuous-time AWGN channels
IEEE Transactions on Information Theory
Statistical physics of signal estimation in Gaussian noise: theory and examples of phase transitions
IEEE Transactions on Information Theory
Distributed robotic sensor networks: An information-theoretic approach
International Journal of Robotics Research
A relationship between information theory and estimation theory was recently shown for the Gaussian channel, relating the derivative of mutual information to the minimum mean-square error. This paper generalizes that link to arbitrary channels, giving representations of the derivative of mutual information as a function of the conditional marginal input distributions given the outputs. We illustrate the use of this representation in the efficient numerical computation of the mutual information achieved by inputs such as specific codes or natural language.
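The Gaussian-channel relationship referenced in the abstract (the I-MMSE identity of Guo, Shamai, and Verdú) can be checked numerically in a simple special case. The sketch below assumes a standard Gaussian input over an AWGN channel, where both the mutual information and the minimum mean-square error have closed forms, and verifies that the derivative of the mutual information with respect to SNR equals half the MMSE; the function names are illustrative, not from the paper.

```python
import numpy as np

def mutual_info(snr):
    # Closed form for a standard Gaussian input over an AWGN channel:
    # I(snr) = (1/2) ln(1 + snr), in nats.
    return 0.5 * np.log1p(snr)

def mmse(snr):
    # MMSE of estimating X ~ N(0, 1) from Y = sqrt(snr) * X + N, N ~ N(0, 1).
    return 1.0 / (1.0 + snr)

snr = 2.0
h = 1e-6
# Central finite-difference approximation of dI/dsnr.
dI = (mutual_info(snr + h) - mutual_info(snr - h)) / (2.0 * h)
# The I-MMSE identity states dI/dsnr = mmse(snr) / 2.
print(dI, mmse(snr) / 2.0)
```

For non-Gaussian inputs the mutual information rarely has a closed form, which is where representations of its derivative in terms of conditional input distributions, as developed in the paper, enable efficient numerical computation.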