This study uses a neural field model to investigate computational aspects of population coding and decoding when the stimulus is a single variable. A general prototype model for the encoding process is proposed, in which neural responses are correlated, with a strength specified by a Gaussian function of the difference between the neurons' preferred stimuli. Based on this model, we study the effect of correlation on the Fisher information, compare the performance of three decoding methods that differ in the amount of encoding information they use, and investigate how the three methods can be implemented by a recurrent network. This study not only rederives the main results in the existing literature in a unified way but also reveals important new features, especially when the neural correlation is strong. As the correlation of neural firing becomes stronger, the Fisher information decreases drastically. We confirm that as the correlation width increases, the Fisher information saturates and no longer grows in proportion to the number of neurons. However, we prove that when the width increases further, beyond √2 times the effective width of the tuning function, the Fisher information increases again, and it increases without limit in proportion to the number of neurons. Furthermore, we clarify the asymptotic efficiency of maximum likelihood inference (MLI) decoding methods for correlated neural signals. We show that when the correlation covers a nonlocal range of the population (excluding the case of uniform correlation and the limit of extremely small noise), the MLI type of method, whose decoding error follows a Cauchy-type distribution, is not asymptotically efficient. This implies that the variance is no longer an adequate measure of decoding accuracy.
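As an illustrative sketch of the encoding model described above (not the paper's exact formulation), the Fisher information can be computed numerically for a population with Gaussian tuning curves and noise correlations that fall off as a Gaussian function of the difference in preferred stimuli. For additive Gaussian noise with stimulus-independent covariance C, the Fisher information is J(x) = f'(x)ᵀ C⁻¹ f'(x). The correlation strength `rho`, the diagonal independent-noise floor (added here to keep C well conditioned), and all parameter values are illustrative assumptions:

```python
import numpy as np

def fisher_info(n=100, a=1.0, b=0.5, rho=0.5, sigma2=0.01, x=0.0, L=10.0):
    """Fisher information J(x) = f'(x)^T C^{-1} f'(x) for additive
    Gaussian noise with stimulus-independent covariance C.

    n, L   : number of neurons; preferred stimuli span [-L/2, L/2]
    a, b   : tuning width and correlation width
    rho    : correlation strength (rho = 0 gives independent noise);
             the (1 - rho) diagonal term is an illustrative assumption
             that keeps C numerically well conditioned
    sigma2 : single-neuron noise variance
    """
    c = np.linspace(-L / 2, L / 2, n)            # preferred stimuli
    f = np.exp(-(x - c) ** 2 / (2 * a ** 2))     # Gaussian tuning curves at x
    df = f * (c - x) / a ** 2                    # derivatives df_i/dx
    # covariance: Gaussian function of the difference in preferred stimuli
    D = c[:, None] - c[None, :]
    C = sigma2 * ((1 - rho) * np.eye(n) + rho * np.exp(-D ** 2 / (2 * b ** 2)))
    return df @ np.linalg.solve(C, df)
```

Sweeping the correlation width `b` and the population size `n` in such a sketch is one way to explore the qualitative behavior discussed above: how the Fisher information depends on the correlation width relative to the tuning width, and whether it saturates or keeps growing with the number of neurons.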