Neural responses in sensory systems are typically triggered by a multitude of stimulus features. Using information theory, we study the encoding accuracy of a population of stochastically spiking neurons whose tuning widths differ across features. The optimal strategy for encoding one feature most accurately combines narrow tuning in the dimension to be encoded, which increases the single-neuron Fisher information, with broad tuning in all other dimensions, which increases the number of active neurons. However, extremely narrow tuning without sufficient receptive-field overlap severely degrades the code; this implies the existence of an optimal tuning width for the encoded feature. In experiments, only a subset of all stimulus features is usually accessible. For this case, relative encoding errors can be calculated that yield a criterion for the function of a neural population based on its measured tuning curves.
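As a rough illustration of the optimal-width argument, the sketch below computes the population Fisher information for a one-dimensional array of independent Poisson neurons with Gaussian tuning curves. This is a minimal toy model, not the paper's multidimensional setup: the restriction to one stimulus dimension and all parameter values (peak rate `r_max`, tuning centers, widths) are illustrative assumptions. With the spacing of the tuning centers fixed, narrowing the curves first increases the Fisher information, but once the width falls well below the spacing the receptive fields no longer overlap and the worst-case Fisher information collapses between (and at) the centers, so an intermediate width performs best.

```python
import numpy as np

def fisher_info(theta, centers, sigma, r_max=10.0, T=1.0):
    """Population Fisher information J(theta) for independent Poisson
    neurons with Gaussian tuning curves observed over a window T."""
    # Mean spike counts f_i(theta); rows index neurons, columns stimuli.
    f = r_max * np.exp(-(theta - centers[:, None]) ** 2 / (2 * sigma ** 2))
    # Tuning-curve slopes f_i'(theta).
    df = -(theta - centers[:, None]) / sigma ** 2 * f
    # For independent Poisson spiking: J = T * sum_i f_i'^2 / f_i.
    return T * np.sum(df ** 2 / np.maximum(f, 1e-12), axis=0)

centers = np.linspace(0.0, 10.0, 21)   # preferred stimuli, spacing 0.5
thetas = np.linspace(4.0, 6.0, 201)    # probe stimuli away from the edges

results = {}
for sigma in (0.1, 0.5, 2.0):
    J = fisher_info(thetas, centers, sigma)
    results[sigma] = J.min()           # worst-case accuracy over stimuli
    print(f"sigma={sigma}: minimum population FI = {results[sigma]:.2f}")
```

The minimum over stimuli is the relevant quantity here: a very narrow width (0.1, well below the spacing of 0.5) leaves stimuli near the tuning centers almost uninformative, while a very broad width (2.0) flattens all slopes; the intermediate width yields the largest worst-case Fisher information in this toy setting.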