As neural activity is transmitted through the nervous system, neuronal noise degrades the encoded information and limits performance. It is therefore important to know how information loss can be prevented. We study this question in the context of neural population codes. Using Fisher information, we show how the information loss in a layered network depends on the connectivity between the layers. We introduce an algorithm, reminiscent of the water-filling algorithm for Shannon information, that minimizes this loss. The optimal connection profile has a center-surround structure, with a spatial extent closely matched to the neurons' tuning curves. In addition, we show how the optimal connectivity depends on the correlation structure of the trial-to-trial variability in the neuronal responses. Our results explain why optimal communication of population codes requires the center-surround architectures found in the nervous system, and they provide explicit predictions for the connectivity parameters.
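The water-filling algorithm alluded to above is a standard construction in information theory: given parallel noisy channels and a fixed power budget, Shannon capacity is maximized by raising a common "water level" and filling each channel with power up to that level, so quieter channels receive more power and very noisy channels may receive none. A minimal sketch of the classical version, using bisection on the water level (the function name and tolerance are illustrative and not taken from the paper):

```python
def water_filling(noise, total_power, tol=1e-9):
    """Classical water-filling power allocation.

    Given per-channel noise levels n_i and a total power budget P,
    find the water level mu such that p_i = max(mu - n_i, 0) and
    sum(p_i) = P. This allocation maximizes sum_i log(1 + p_i/n_i).
    """
    lo, hi = min(noise), max(noise) + total_power
    # Bisect on the water level: allocated power grows monotonically with mu.
    while hi - lo > tol:
        mu = (lo + hi) / 2
        allocated = sum(max(mu - n, 0.0) for n in noise)
        if allocated > total_power:
            hi = mu
        else:
            lo = mu
    mu = (lo + hi) / 2
    return [max(mu - n, 0.0) for n in noise]
```

For example, with noise levels `[1, 2, 3]` and a budget of 3, the water level settles at 3, so the channels receive powers 2, 1, and 0: the noisiest channel is switched off entirely. The paper's algorithm is analogous in spirit but operates on Fisher information loss across connection weights rather than Shannon capacity across channels.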