In this paper, some relations between estimation and mutual information are given by expressing two mutual information quantities in terms of two distinct estimation errors. Specifically, the mutual information between a stochastic signal and a pure-jump Lévy process whose rate function depends on the signal is expressed in terms of a filtering error, and the rate of change of this mutual information with respect to a parameter scaling the rate function of the Lévy process is expressed in terms of a smoothing error. These results generalize the analogous mutual information results for some Gaussian noise processes with additive stochastic signals.
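For context, the Gaussian analogue alluded to in the last sentence is the I-MMSE relation of Guo, Shamai, and Verdú; a minimal sketch under standard assumptions (scalar channel, \(N \sim \mathcal{N}(0,1)\) independent of the signal \(X\) with finite variance):

```latex
% Observation model: Y_{\mathrm{snr}} = \sqrt{\mathrm{snr}}\, X + N.
% The derivative of mutual information in the SNR parameter equals
% half the (noncausal) minimum mean-square estimation error:
\frac{d}{d\,\mathrm{snr}}\, I\bigl(X;\, \sqrt{\mathrm{snr}}\, X + N\bigr)
  = \tfrac{1}{2}\, \mathrm{mmse}(\mathrm{snr}),
\qquad
\mathrm{mmse}(\mathrm{snr})
  = \mathbb{E}\bigl[\bigl(X - \mathbb{E}[X \mid Y_{\mathrm{snr}}]\bigr)^{2}\bigr].
```

In the Lévy-process setting treated here, the role of the SNR scaling is played by the parameter multiplying the rate function, and the filtering and smoothing errors take the place of the causal and noncausal mean-square errors, respectively.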