Time Series Generation by Recurrent Neural Networks

  • Authors:
  • A. Priel; I. Kanter

  • Affiliations:
  • Department of Physics and Minerva Center, Bar-Ilan University, 52900 Ramat-Gan, Israel; Center for Brain Sciences, Bar-Ilan University, Ramat-Gan, Israel. E-mail: priel@mail.biu.ac.il; ido@kanter1.ph.biu.ac.il

  • Venue:
  • Annals of Mathematics and Artificial Intelligence
  • Year:
  • 2003

Abstract

The properties of time series, generated by continuous valued feed-forward networks in which the next input vector is determined from past output values, are studied. Asymptotic solutions developed suggest that the typical stable behavior is (quasi) periodic with attractor dimension that is limited by the number of hidden units, independent of the details of the weights. The results are robust under additive noise, except for expected noise-induced effects – attractor broadening and loss of phase coherence at large times. These effects, however, are moderated by the size of the network N.