Scalability of learning impact on complex parameters in recurrent neural networks

  • Authors:
  • Branko Šter; Andrej Dobnikar

  • Affiliations:
  • Faculty of Computer and Information Science, University of Ljubljana, Ljubljana, Slovenia; Faculty of Computer and Information Science, University of Ljubljana, Ljubljana, Slovenia

  • Venue:
  • ICANNGA'09: Proceedings of the 9th International Conference on Adaptive and Natural Computing Algorithms
  • Year:
  • 2009

Abstract

The impact of problem size and network size on learning in recurrent neural networks is analysed in terms of the structural parameters of the associated graphs. Previous work investigated how learning changes typical parameters such as the characteristic path length, clustering coefficient, degree distribution, and entropy. The present work broadens the focus to the scaling behaviour of the learning paradigm. The results demonstrate the scalability of the learning procedures, since the dynamics of these parameters is retained during learning across different problem and network sizes.
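The abstract names four structural parameters of the graph underlying a recurrent network. As a rough illustration of how such parameters can be computed, the sketch below derives an undirected graph from a recurrent weight matrix by thresholding connection magnitudes and evaluates the characteristic path length, clustering coefficient, degree distribution, and degree entropy with networkx and NumPy. The thresholding step, the library choices, and the use of Shannon entropy of the degree distribution are assumptions made for illustration, not the authors' procedure.

```python
# Illustrative sketch (not the authors' code): build a graph from an RNN weight
# matrix and compute the structural parameters mentioned in the abstract.
# The threshold value and metric definitions are assumptions.
import numpy as np
import networkx as nx

def structural_parameters(W, threshold=0.1):
    """Graph metrics of the network induced by weights with |W| > threshold."""
    A = (np.abs(W) > threshold).astype(int)   # adjacency from significant weights
    np.fill_diagonal(A, 0)                    # ignore self-connections
    G = nx.from_numpy_array(A)                # undirected graph

    # Characteristic path length, computed on the largest connected component
    # (the average shortest path length is undefined for disconnected graphs).
    giant = G.subgraph(max(nx.connected_components(G), key=len))
    L = nx.average_shortest_path_length(giant)

    # Clustering coefficient averaged over all nodes.
    C = nx.average_clustering(G)

    # Degree distribution and its Shannon entropy.
    degrees = np.array([d for _, d in G.degree()])
    counts = np.bincount(degrees)
    p = counts[counts > 0] / counts.sum()
    H = -np.sum(p * np.log2(p))

    return {"path_length": L, "clustering": C,
            "degree_distribution": p, "entropy": H}

# Example: structural parameters of a random recurrent weight matrix
# (a stand-in for a network before or after training).
rng = np.random.default_rng(0)
W = rng.normal(scale=0.5, size=(50, 50))
print(structural_parameters(W))
```

Tracking how these quantities evolve during training, for several problem and network sizes, is the kind of comparison the abstract refers to when it speaks of the retained dynamics of the parameters.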