Continuous Attractors of Lotka-Volterra Recurrent Neural Networks
ICANN '09 Proceedings of the 19th International Conference on Artificial Neural Networks: Part I
This paper proposes a class of simplified background neural network models with a large number of neurons and studies the continuous attractors of the simplified model. The main results are: (1) when the background inputs are zero and the excitatory connections are Gaussian-shaped, continuous attractors of the network are obtained under a certain condition; (2) when the background inputs are nonzero and the excitatory connections remain Gaussian-shaped, continuous attractors are obtained under an appropriately chosen condition; (3) discussions and examples illustrate the developed theory.
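As a rough numerical illustration of what a continuous attractor looks like in a Lotka-Volterra recurrent network (a sketch under assumed parameters, not the construction or conditions proved in the paper): for the dynamics dx_i/dt = x_i(-x_i + (Wx)_i) with ring-shaped Gaussian excitatory connections scaled so that the largest eigenvalue of W is exactly 1, the entire ray {c*v : c >= 0} spanned by the nonnegative Perron eigenvector v consists of equilibria — a continuum of fixed points rather than an isolated one — and nearby states relax back onto it.

```python
import numpy as np

# Sketch only: the network size, Gaussian width, and the eigenvalue
# normalization below are illustrative assumptions, not the paper's setup.

def gaussian_ring_weights(n, width=3.0):
    # Gaussian-shaped excitatory connections on a ring (periodic distance),
    # scaled so the largest eigenvalue of W is exactly 1.
    idx = np.arange(n)
    d = np.abs(idx[:, None] - idx[None, :])
    d = np.minimum(d, n - d)
    G = np.exp(-d**2 / (2 * width**2))
    return G / np.linalg.eigvalsh(G)[-1]

def lv_rhs(x, W):
    # Lotka-Volterra recurrent dynamics: dx_i/dt = x_i * (-x_i + (Wx)_i)
    return x * (-x + W @ x)

n = 32
W = gaussian_ring_weights(n)
eigvals, eigvecs = np.linalg.eigh(W)
v = np.abs(eigvecs[:, -1])  # Perron eigenvector (eigenvalue 1), nonnegative

# Every scaling c*v is an equilibrium: since Wv = v, the vector field
# c*v * (-c*v + c*Wv) vanishes identically along the ray.
residuals = [np.linalg.norm(lv_rhs(c * v, W)) for c in (0.5, 1.0, 2.0)]
print(residuals)  # all ~ 0, up to eigensolver precision

# A slightly perturbed state relaxes back toward the ray (Euler integration);
# displacement along the ray is neutral, transverse displacement decays.
rng = np.random.default_rng(0)
x = v + 0.01 * rng.random(n)

def dist_to_ray(x, v):
    # Distance from x to its orthogonal projection onto span{v}.
    return np.linalg.norm(x - (x @ v) / (v @ v) * v)

d0 = dist_to_ray(x, v)
for _ in range(50000):
    x = x + 0.1 * lv_rhs(x, W)
d1 = dist_to_ray(x, v)
print(d0, d1)  # distance to the continuum of equilibria shrinks
```

This is the simplest geometry such a continuum can take (a ray of equilibria from a shared eigenvalue-1 eigenvector); the paper's conditions on the background inputs characterize when structures of this kind exist for its specific model.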