This paper considers a recurrent neural network (RNN) with a special class of discontinuous activation functions that are piecewise constant in the state space. A sufficient condition is established under which such an RNN has (4k-1)^n locally exponentially stable equilibrium points, which makes it suitable for synthesizing high-capacity associative memories. A design procedure based on singular value decomposition is presented. Finally, two numerical examples illustrate the validity and performance of the results.
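The multistability mechanism can be illustrated with a minimal sketch. The dynamics, activation function, and weights below are illustrative assumptions, not the paper's construction (which the abstract says is designed via singular value decomposition): we take dx/dt = -x + W f(x) + b with a staircase activation f taking values in {-1, 0, 1} (the k = 1 case, giving 3 stable states per neuron and hence 3^n in total for a diagonal W), and integrate with the Euler method.

```python
import numpy as np

def f(x):
    # Piecewise-constant (staircase) activation with values in {-1, 0, 1};
    # an illustrative stand-in for the paper's activation class (k = 1).
    return np.where(x >= 1, 1.0, np.where(x < -1, -1.0, 0.0))

def simulate(x0, W, b, dt=0.01, steps=2000):
    # Euler integration of the assumed dynamics dx/dt = -x + W f(x) + b.
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x += dt * (-x + W @ f(x) + b)
    return x

n = 2
W = 2.0 * np.eye(n)   # hypothetical weights, diagonal for illustration
b = np.zeros(n)

# Initial states in different basins settle at distinct stable equilibria
# x* = W f(x*) + b, here {-2, 0, 2} per coordinate (3^n = 9 in total).
print(simulate([1.5, -1.5], W, b))   # -> approx [ 2. -2.]
print(simulate([0.3, 0.2], W, b))    # -> approx [0. 0.]
```

With this diagonal W each neuron evolves independently, so the stable equilibria form a product of per-neuron stable states; the paper's sufficient condition plays the analogous role for non-diagonal interconnections.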