In this paper, a novel k-winners-take-all (k-WTA) network based on a one-neuron recurrent neural network is proposed, and its finite-time convergence is proved using the Lyapunov method. The k-WTA operation is first converted equivalently into a linear programming problem. A one-neuron recurrent neural network is then proposed to determine the kth or (k + 1)th largest input of the k-WTA problem, and a k-WTA network is designed on top of this neural network to perform the k-WTA operation. Compared with existing k-WTA networks, the proposed network has a simpler structure and converges in finite time. Simulation results on numerical examples demonstrate the effectiveness and performance of the proposed k-WTA network.
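To make the target operation concrete, the following is a minimal static sketch of the ideal k-WTA mapping that the dynamic network converges to: the outputs corresponding to the k largest inputs are set to 1 and all others to 0. This sketch computes the result directly with a sort rather than via the paper's linear-programming formulation or recurrent dynamics; the function name `kwta` and its interface are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def kwta(u, k):
    """Ideal k-winners-take-all: output 1 for the k largest inputs, 0 otherwise.

    This is the static operation the dynamic network is designed to compute;
    it assumes the kth and (k + 1)th largest inputs are distinct, so the set
    of winners is well defined.
    """
    u = np.asarray(u, dtype=float)
    out = np.zeros_like(u)
    # Indices of the k largest entries (argsort is ascending, so take the tail).
    winners = np.argsort(u)[-k:]
    out[winners] = 1.0
    return out

print(kwta([0.3, 1.2, -0.5, 0.9, 0.1], 2))  # -> [0. 1. 0. 1. 0.]
```

A sort-based evaluation like this costs O(N log N) per call; the appeal of recurrent k-WTA circuits such as the one proposed here is that the selection emerges from simple continuous-time dynamics with low per-neuron complexity, which suits analog and parallel hardware.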