Following the discovery of a fundamental connection between information measures and estimation measures in Gaussian channels, this paper explores the counterpart of those results in Poisson channels. In the continuous-time setting, the received signal is a doubly stochastic Poisson point process whose rate is equal to the input signal plus a dark current. It is found that, regardless of the statistics of the input, the derivative of the input-output mutual information with respect to the intensity of the additive dark current can be expressed as the expected difference between the logarithm of the input and the logarithm of its noncausal conditional mean estimate. The same holds for the derivative with respect to input scaling, but with the function log x replaced by x log x. Similar relationships hold for discrete-time versions of the channel, where the outputs are Poisson random variables conditioned on the input symbols.
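The discrete-time identity stated above can be checked numerically. The sketch below, a minimal illustration rather than the paper's own construction, uses a two-point input X uniform on {1, 3} and a Poisson channel Y | X=x ~ Poisson(x + lam) with dark current lam. It compares a central finite-difference derivative of the mutual information I(X;Y) with the claimed right-hand side E[log(X+lam)] - E[log E[X+lam | Y]]; all names (xs, px, YMAX) are illustrative choices, not from the paper.

```python
import math

# Input alphabet and prior: X uniform on {1.0, 3.0} (illustrative choice).
xs = [1.0, 3.0]
px = [0.5, 0.5]
YMAX = 200  # truncate the Poisson output support; the tail mass is negligible here

def pois(y, mu):
    """Poisson pmf P(Y = y) for mean mu > 0."""
    return math.exp(-mu + y * math.log(mu) - math.lgamma(y + 1))

def mutual_info(lam):
    """I(X;Y) in nats for the channel Y | X=x ~ Poisson(x + lam)."""
    I = 0.0
    for y in range(YMAX):
        py = sum(p * pois(y, x + lam) for p, x in zip(px, xs))
        if py == 0.0:
            continue
        for p, x in zip(px, xs):
            pyx = pois(y, x + lam)
            if pyx > 0.0:
                I += p * pyx * math.log(pyx / py)
    return I

def rhs(lam):
    """E[log(X+lam)] - E[log E[X+lam | Y]], the claimed value of dI/dlam."""
    t1 = sum(p * math.log(x + lam) for p, x in zip(px, xs))
    t2 = 0.0
    for y in range(YMAX):
        py = sum(p * pois(y, x + lam) for p, x in zip(px, xs))
        if py == 0.0:
            continue
        # Noncausal conditional mean estimate E[X + lam | Y = y].
        est = sum(p * pois(y, x + lam) * (x + lam) for p, x in zip(px, xs)) / py
        t2 += py * math.log(est)
    return t1 - t2

lam, h = 2.0, 1e-4
deriv = (mutual_info(lam + h) - mutual_info(lam - h)) / (2 * h)
print(deriv, rhs(lam))  # the two values should agree to within O(h^2)
```

Note that the derivative is negative, as expected: adding dark current can only degrade the channel, and by Jensen's inequality E[log(X+lam)] <= E[log E[X+lam | Y]] since log is concave.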