The richness and complexity of recurrent cortical circuits is an inexhaustible source of inspiration for thinking about high-level biological computation. In past theoretical studies, constraints on the synaptic connection patterns of threshold-linear networks were found that guaranteed bounded network dynamics, convergence to attractive fixed points, and multistability, all fundamental aspects of cortical information processing. However, these conditions were only sufficient, and it remained unclear which were the minimal (necessary) conditions for convergence and multistability. We show that symmetric threshold-linear networks converge to a set of attractive fixed points if and only if the network matrix is copositive. Furthermore, the set of attractive fixed points is nonconnected (the network is multiattractive) if and only if the network matrix is not positive semidefinite. There are permitted sets of neurons that can be coactive at a stable steady state and forbidden sets that cannot. Permitted sets are clustered in the sense that subsets of permitted sets are permitted and supersets of forbidden sets are forbidden. By viewing permitted sets as memories stored in the synaptic connections, we provide a formulation of long-term memory that is more general than the traditional perspective of fixed-point attractor networks. There is a close correspondence between threshold-linear networks and networks defined by the generalized Lotka-Volterra equations.
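The dynamics described above can be sketched numerically. The following is a minimal illustration, not the paper's code: it Euler-integrates the standard threshold-linear dynamics dx/dt = -x + [Wx + b]_+ (with [u]_+ = max(u, 0)) for a two-neuron network with self-excitation and mutual inhibition. The weight values are illustrative assumptions chosen so that the relevant matrix (here taken as I - W) is copositive but not positive semidefinite, so the network is multiattractive: the pair of neurons is a forbidden set, while each neuron alone is a permitted set, and the dynamics settle into a winner-take-all state.

```python
import numpy as np

def simulate_tln(W, b, x0, dt=0.01, steps=5000):
    """Euler-integrate the threshold-linear dynamics
    dx/dt = -x + [W x + b]_+, where [u]_+ = max(u, 0)."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x += dt * (-x + np.maximum(W @ x + b, 0.0))
    return x

# Illustrative symmetric weights: self-excitation 0.5, mutual inhibition -1.
W = np.array([[0.5, -1.0],
              [-1.0, 0.5]])
b = np.ones(2)

# I - W has eigenvalues 1.5 and -0.5: copositive (x^T (I-W) x >= 0 for
# x >= 0) but not positive semidefinite, so multiple attractors coexist.
eigs = np.linalg.eigvalsh(np.eye(2) - W)

# From an asymmetric start, neuron 1 suppresses neuron 2 entirely:
# the stable steady state has only the permitted set {1} active.
x = simulate_tln(W, b, x0=[1.0, 0.1])
```

Starting from [0.1, 1.0] instead lands in the mirror-image attractor with only neuron 2 active, which is the nonconnected set of attractive fixed points the abstract describes.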