Neuroscientists want to quantify how well neurons, individually and collectively, process information and encode the result in their outputs. We demonstrate that while classic information theory demarcates optimal performance boundaries, it does not provide results useful for analyzing an existing system about which little is known, such as the brain. In the classical vein, non-Poisson channels, which describe the communication medium for neural signals, are shown to each have a capacity strictly smaller than the Poisson ideal. We describe recent capacity results for Poisson neural populations, showing that connections among neurons can increase capacity. We then present an alternative theory more amenable to data analysis and to situations wherein systems actively extract and represent information. Using this theory, we show that the ability of a neural population to jointly represent information depends on the nature of its input signal, not on the encoded information.