In this paper, the stability of a general class of discrete-time delayed recurrent neural networks is re-investigated in light of some recent results. These networks are obtained by modeling synapses as Finite Impulse Response (FIR) filters rather than multiplicative scalars. We first derive sufficient conditions, via the Lyapunov functional method, for the network operating in closed loop to converge to a fixed point; the symmetry of the connection matrix is not assumed. We then show how these conditions relate to other conditions that ensure both the existence of the error gradient over arbitrarily long trajectories and the asymptotic stability of the fixed points at each time step.
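To illustrate the kind of network discussed above, the following is a minimal, hypothetical sketch (not the paper's own formulation) of a discrete-time recurrent network whose synapses are FIR filters of order K rather than scalar weights: each tap matrix `W[k]` acts on the state delayed by k steps, and the closed loop is iterated until it reaches a fixed point. All names, shapes, and the tanh activation are illustrative assumptions.

```python
import numpy as np

def run_fir_network(W, b, x0, steps=200, tol=1e-10):
    """Iterate x[t] = tanh(sum_k W[k] @ x[t-1-k] + b) in closed loop.

    W : (K, n, n) array, one weight matrix per FIR tap (assumed shapes).
    Returns the final state and a flag indicating convergence to a fixed point.
    """
    K, n, _ = W.shape
    hist = [x0.copy() for _ in range(K)]          # buffer of delayed states
    for _ in range(steps):
        pre = b + sum(W[k] @ hist[k] for k in range(K))
        x_new = np.tanh(pre)
        if np.max(np.abs(x_new - hist[0])) < tol:
            return x_new, True                    # reached a fixed point
        hist = [x_new] + hist[:-1]                # shift the delay line
    return hist[0], False

# Small, deliberately non-symmetric connection matrices, scaled so the map
# is a contraction; convergence then holds without assuming symmetry.
rng = np.random.default_rng(0)
W = 0.05 * rng.standard_normal((3, 4, 4))        # 3 taps, 4 neurons
b = 0.2 * rng.standard_normal(4)
x_star, converged = run_fir_network(W, b, np.zeros(4))
```

The scaling of `W` here plays the role of the sufficient conditions in the abstract: it guarantees convergence of the closed loop by making the update a contraction, irrespective of symmetry.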