The entropy rate quantifies the amount of uncertainty, or disorder, produced by a dynamical system. In a spiking neuron, this uncertainty translates into the amount of information potentially encoded, and it is therefore the subject of intense theoretical and experimental investigation. Estimating this quantity from observed experimental data is difficult and requires a judicious choice of probabilistic models that balances two opposing biases. We use a model-weighting principle originally developed for lossless data compression, following the minimum description length principle. This weighting yields a direct estimator of the entropy rate which, in simulation, exhibits significantly less bias and converges faster than existing methods. Using Monte Carlo techniques, we estimate a Bayesian confidence interval for the entropy rate. In related work, we apply these ideas to estimate information rates between sensory stimuli and neural responses in experimental data (Shlens, Kennel, Abarbanel, & Chichilnisky, in preparation).
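To make the compression-based idea concrete, the sketch below estimates the entropy rate of a binary spike train from the sequential code length assigned by context-tree weighting (CTW) with Krichevsky-Trofimov estimators at each node. This is a minimal, hypothetical illustration, not the authors' implementation: the function name `ctw_entropy_rate` and the `depth` parameter are assumptions, and the Monte Carlo confidence-interval machinery described in the abstract is not shown.

```python
import math
import random
from collections import defaultdict

def _log2add(x, y):
    """log2(2**x + 2**y), computed without underflow."""
    m = max(x, y)
    return m + math.log2(2.0 ** (x - m) + 2.0 ** (y - m))

def ctw_entropy_rate(bits, depth=4):
    """Estimate the entropy rate (bits/symbol) of a 0/1 sequence from the
    CTW sequential code length.  Hypothetical sketch for illustration."""
    counts = defaultdict(lambda: [0, 0])  # context -> [#zeros, #ones] seen
    log_pe = defaultdict(float)           # log2 KT-estimator probability
    log_pw = defaultdict(float)           # log2 weighted probability
    code_len = 0.0                        # accumulated code length in bits
    for t in range(depth, len(bits)):
        x = bits[t]
        # Contexts from the root () down to depth D, most recent bit first.
        ctxs = [tuple(bits[t - d:t][::-1]) for d in range(depth + 1)]
        old_root = log_pw[()]
        # Update nodes from the deepest context up to the root.
        for d in range(depth, -1, -1):
            s = ctxs[d]
            a, b = counts[s]
            n_x = b if x else a
            # KT sequential update: P(x | counts) = (n_x + 1/2) / (a + b + 1)
            log_pe[s] += math.log2((n_x + 0.5) / (a + b + 1.0))
            counts[s][x] += 1
            if d == depth:
                log_pw[s] = log_pe[s]  # leaves: weighted = estimated
            else:
                # Pw = 1/2 * Pe + 1/2 * Pw(child 0) * Pw(child 1)
                lw = log_pw.get(s + (0,), 0.0) + log_pw.get(s + (1,), 0.0)
                log_pw[s] = _log2add(log_pe[s] - 1.0, lw - 1.0)
        # Code length for this symbol: -log2 of the conditional probability.
        code_len += old_root - log_pw[()]
    return code_len / (len(bits) - depth)

if __name__ == "__main__":
    random.seed(0)
    coin = [random.getrandbits(1) for _ in range(5000)]
    print(ctw_entropy_rate(coin))  # close to 1 bit/symbol for a fair coin
```

On a fair-coin sequence the per-symbol code length approaches 1 bit, while on a strongly patterned sequence (e.g. alternating 0101...) it falls toward 0, reflecting the compressibility that the entropy rate measures.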