We present a new derivation of the asymptotic correction for bias in estimates of information from a finite sample. The derivation reveals a relationship between information estimates and a sequence of polynomials with combinatorial significance, the exponential (Bell) polynomials, and clarifies the form and behavior of the asymptotic bias correction.
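The abstract does not reproduce the derivation itself, but the setting can be illustrated with the standard first-order bias correction for the plug-in entropy estimator: the plug-in estimate is biased downward by approximately (K − 1)/(2N ln 2) bits, where K is the number of occupied bins and N the sample size, and adding that term back gives the classical Miller–Madow estimator. The sketch below assumes this standard correction for illustration; the function names are our own, not the paper's.

```python
import math
from collections import Counter


def plugin_entropy(samples):
    """Naive (plug-in) entropy estimate in bits from a finite sample."""
    n = len(samples)
    counts = Counter(samples)
    # H = -sum p log2 p over the empirical probabilities p = c / n
    return -sum((c / n) * math.log2(c / n) for c in counts.values())


def miller_madow_entropy(samples):
    """Plug-in estimate plus the classical first-order bias correction.

    The plug-in estimator underestimates entropy by roughly
    (K - 1) / (2 N ln 2) bits, where K is the number of occupied bins
    and N the sample size; adding that term back gives the
    Miller-Madow estimator.
    """
    n = len(samples)
    k = len(set(samples))  # occupied bins only
    return plugin_entropy(samples) + (k - 1) / (2 * n * math.log(2))


# Example: two symbols, each seen twice -> plug-in entropy is exactly 1 bit,
# and the correction adds 1 / (8 ln 2) bits.
samples = [0, 0, 1, 1]
print(plugin_entropy(samples))
print(miller_madow_entropy(samples))
```

For small samples the correction is substantial (here about 0.18 bits on a 1-bit plug-in estimate), which is why higher-order asymptotic analyses of the bias, such as the one described in the abstract, matter in practice.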