IEEE Transactions on Information Theory
This paper considers the model of an arbitrarily distributed signal x observed through additive white Gaussian noise w, y = x + w. New relations between the minimal mean-square error of the noncausal estimator and the likelihood ratio between y and w are derived. An extended version of a recently derived relation between the mutual information I(x; y) and the minimal mean-square error then follows. These results are applied to derive infinite-dimensional versions of the Fisher information and the de Bruijn identity. A comparison between the causal and noncausal estimation errors yields a restricted form of the logarithmic Sobolev inequality. The derivation of the results is based on the Malliavin calculus.
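The relation between mutual information and minimal mean-square error referred to above can be illustrated in the simplest scalar case. The following sketch (an assumption for illustration, not part of the paper) takes a Gaussian input x ~ N(0, 1) observed as y = sqrt(snr)·x + w with w ~ N(0, 1), where both I(x; y) and the noncausal MMSE are known in closed form, and checks numerically that dI/dsnr = mmse/2:

```python
import math

def mutual_info(snr):
    # I(x; y) = 0.5 * log(1 + snr) nats for a standard Gaussian input
    return 0.5 * math.log(1.0 + snr)

def mmse(snr):
    # mean-square error of the noncausal estimator E[x | y]
    # for a standard Gaussian input
    return 1.0 / (1.0 + snr)

# central-difference approximation of dI/dsnr at one SNR point
snr, h = 2.0, 1e-6
dI = (mutual_info(snr + h) - mutual_info(snr - h)) / (2.0 * h)

# the I-MMSE relation: the derivative equals half the MMSE
assert abs(dI - 0.5 * mmse(snr)) < 1e-8
print("dI/dsnr =", dI, " mmse/2 =", 0.5 * mmse(snr))
```

For non-Gaussian inputs neither side has a closed form, which is where relations of the kind derived in the paper, connecting the estimation error to the likelihood ratio between y and w, become useful.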