Closed loop stability of FIR-recurrent neural networks

  • Authors:
  • Alex Aussem

  • Affiliations:
  • LIMOS, UMR CNRS, University Blaise Pascal, Aubière Cedex, France

  • Venue:
  • ICANN/ICONIP'03 Proceedings of the 2003 Joint International Conference on Artificial Neural Networks and Neural Information Processing
  • Year:
  • 2003

Abstract

In this paper, the stability of a general class of discrete-time delayed recurrent neural networks is re-investigated in light of some recent results. These networks are obtained by modeling synapses as Finite Impulse Response (FIR) filters instead of multiplicative scalars. We first derive a sufficient condition for the network operating in closed loop to converge to a fixed point, using the Lyapunov functional method; the symmetry of the connection matrix is not assumed. We then show how this condition relates to other conditions ensuring both the existence of the error gradient over arbitrarily long trajectories and the asymptotic stability of the fixed points at each time step.
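
To make the model concrete, the sketch below implements one closed-loop update of a network whose synapses are FIR filters: each connection from neuron j to neuron i keeps the last K activations of j and combines them through K tap weights, rather than a single multiplicative scalar. The array shapes, tap ordering, and tanh nonlinearity are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def fir_rnn_step(history, W, b, phi=np.tanh):
    """One closed-loop update of an FIR-recurrent network (illustrative sketch).

    history : (K, n) array; history[k] holds the activations from k steps ago
              (history[0] is the most recent).
    W       : (n, n, K) array of FIR tap weights for each connection j -> i.
    b       : (n,) bias vector.
    Returns the updated (K, n) history with the new activations in row 0.
    """
    # Net input to neuron i: sum over sources j and delays k of W[i, j, k] * x_j(t - k).
    net = np.einsum('ijk,kj->i', W, history) + b
    x_new = phi(net)
    # Shift the delay line: drop the oldest activations, prepend the new ones.
    return np.vstack([x_new, history[:-1]])

# Example: iterate the closed-loop map from a zero state.
rng = np.random.default_rng(0)
n, K = 4, 3
W = 0.2 * rng.standard_normal((n, n, K))  # small taps, chosen heuristically
b = rng.standard_normal(n)
history = np.zeros((K, n))
for t in range(200):
    history = fir_rnn_step(history, W, b)
print("activations after 200 steps:", history[0])
```

With sufficiently small tap weights the iteration contracts and the state settles toward a fixed point; this is only a numerical heuristic for the example above, not the sufficient condition derived in the paper.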