Information Theory: Coding Theorems for Discrete Memoryless Systems
Mismatched estimation and relative entropy
ISIT'09: Proceedings of the 2009 IEEE International Symposium on Information Theory, Volume 2
Directed information and causal estimation in continuous time
ISIT'09: Proceedings of the 2009 IEEE International Symposium on Information Theory, Volume 2
Mutual information for stochastic signals and Lévy processes
IEEE Transactions on Information Theory
Mutual information and minimum mean-square error in Gaussian channels
IEEE Transactions on Information Theory
On mutual information, likelihood ratios, and estimation error for the additive Gaussian channel
IEEE Transactions on Information Theory
Gradient of mutual information in linear vector Gaussian channels
IEEE Transactions on Information Theory
Divergence and minimum mean-square error in continuous-time additive white Gaussian noise channels
IEEE Transactions on Information Theory
A simple proof of the entropy-power inequality
IEEE Transactions on Information Theory
Optimum power allocation for parallel Gaussian channels with arbitrary input distributions
IEEE Transactions on Information Theory
Representation of Mutual Information Via Input Estimates
IEEE Transactions on Information Theory
Generalized Entropy Power Inequalities and Monotonicity Properties of Information
IEEE Transactions on Information Theory
Mutual Information and Conditional Mean Estimation in Poisson Channels
IEEE Transactions on Information Theory
Mutual Information for Stochastic Signals and Fractional Brownian Motion
IEEE Transactions on Information Theory
Scanning and Sequential Decision Making for Multidimensional Data—Part II: The Noisy Case
IEEE Transactions on Information Theory
A strong version of the redundancy-capacity theorem of universal coding
IEEE Transactions on Information Theory
Mismatched estimation and relative entropy
IEEE Transactions on Information Theory
A continuous-time finite-power process with distribution P is observed through an AWGN channel, at a given signal-to-noise ratio (SNR), and is estimated by an estimator that would have minimized the mean-square error if the process had distribution Q. We show that the causal filtering mean-square error (MSE) achieved at SNR level snr is equal to the average value of the non-causal (smoothing) MSE achieved with a channel whose SNR is chosen uniformly distributed between 0 and snr. Emerging as the bridge that equates these two quantities are mutual information and relative entropy. Our result generalizes that of Guo, Shamai, and Verdú (2005) from the non-mismatched case, where P = Q, to general P and Q. Among our intermediate results is an extension of Duncan's theorem, which relates mutual information and causal MMSE, to the case of mismatched estimation. Some further extensions and implications are discussed. Key to our findings is the recent result of Verdú on mismatched estimation and relative entropy.
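To make the headline identity concrete, here is a minimal sketch in notation of our own choosing (the symbols cmse_{P,Q}, mse_{P,Q}, and the integration variable \gamma are assumptions for illustration, not taken from the abstract). With a process of law P observed in AWGN and estimated by the filter that is MMSE-optimal under law Q, the claimed relation reads

\[
\mathrm{cmse}_{P,Q}(\mathrm{snr}) \;=\; \frac{1}{\mathrm{snr}} \int_{0}^{\mathrm{snr}} \mathrm{mse}_{P,Q}(\gamma)\, d\gamma,
\]

where \mathrm{cmse}_{P,Q}(\mathrm{snr}) is the causal (filtering) MSE at SNR level \mathrm{snr} and \mathrm{mse}_{P,Q}(\gamma) is the non-causal (smoothing) MSE at SNR level \gamma; dividing the integral by \mathrm{snr} is precisely the "SNR chosen uniformly between 0 and snr" averaging described in the abstract. Setting P = Q recovers the Guo-Shamai-Verdú (2005) relation between causal and non-causal MMSE. The bridge to relative entropy is Verdú's identity, which in the same assumed notation can be sketched as

\[
D\!\left(P_{Y}\,\|\,Q_{Y}\right) \;=\; \frac{1}{2} \int_{0}^{\mathrm{snr}} \left[ \mathrm{mse}_{P,Q}(\gamma) - \mathrm{mse}_{P,P}(\gamma) \right] d\gamma,
\]

with P_{Y} and Q_{Y} denoting the channel output laws induced by input laws P and Q at SNR level \mathrm{snr}.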