We establish two conditions that ensure the nondivergence of additive recurrent networks with unsaturating piecewise linear transfer functions, also called linear threshold or semilinear transfer functions. As Hahnloser, Sarpeshkar, Mahowald, Douglas, and Seung (2000) showed, networks of this type can be built efficiently in silicon and exhibit the coexistence of digital selection and analog amplification in a single circuit. To obtain this behavior, the network must be multistable and nondivergent, and our conditions make it possible to determine the regimes in which this can be achieved with maximal recurrent amplification. The first condition applies to nonsymmetric networks and has a simple interpretation: the strength of local inhibition must match the sum of the excitatory weights converging onto a neuron. The second condition is restricted to symmetric networks but can also take into account the stabilizing effect of nonlocal inhibitory interactions. We demonstrate the application of the conditions to a simple example and to the orientation-selectivity model of Ben-Yishai, Lev Bar-Or, and Sompolinsky (1995). We show that the conditions can be used to identify regions of maximal orientation-selective amplification and symmetry breaking in their model.
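As a concrete illustration, the minimal Python sketch below simulates the standard additive linear threshold dynamics dx_i/dt = -x_i + sum_j w_ij max(x_j, 0) + h_i used for this class of networks, and checks a heuristic version of the first condition: the summed excitatory weight converging onto each neuron should not exceed its local inhibition, taken here as the unit leak plus any self-inhibitory weight. The function names, the decomposition of local inhibition, and the exact form of the inequality are assumptions of this sketch, not the paper's precise statements.

import numpy as np

def local_inhibition_margin(W, leak=1.0):
    # Heuristic nondivergence check: local inhibition (unit leak plus any
    # negative self-weight) minus the summed excitatory weights converging
    # onto each neuron. Nonnegative margins suggest the first condition
    # holds. The exact inequality is an assumption of this sketch.
    W = np.asarray(W, dtype=float)
    excitation = np.clip(W, 0.0, None).sum(axis=1)
    self_inhibition = np.clip(-np.diag(W), 0.0, None)
    return (leak + self_inhibition) - excitation

def simulate(W, h, x0, dt=0.01, steps=2000):
    # Euler integration of dx/dt = -x + W max(x, 0) + h.
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-x + W @ np.clip(x, 0.0, None) + h)
    return x

# Two neurons with self-excitation 0.5 and mutual inhibition -0.8.
W = np.array([[0.5, -0.8],
              [-0.8, 0.5]])
print(local_inhibition_margin(W))           # [0.5 0.5]: margins nonnegative
print(simulate(W, np.ones(2), [1.0, 0.9]))  # ~[2.0, -0.6]: neuron 1 wins
print(simulate(W, np.ones(2), [0.9, 1.0]))  # ~[-0.6, 2.0]: neuron 2 wins

With equal inputs, the initial condition selects the winner (digital selection) while the winner's amplitude exceeds its input (analog amplification), and the trajectories remain bounded, consistent with the multistable, nondivergent regime described in the abstract.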