Computational properties of cyclic and almost-cyclic learning with momentum for feedforward neural networks

  • Authors:
  • Jian Wang, Wei Wu, Jacek M. Zurada

  • Affiliations:
  • Jian Wang: Dalian University of Technology, Dalian, Liaoning, China, and University of Louisville, Louisville, Kentucky; Wei Wu: Dalian University of Technology, Dalian, Liaoning, China; Jacek M. Zurada: University of Louisville, Louisville, Kentucky

  • Venue:
  • ISNN'12: Proceedings of the 9th International Conference on Advances in Neural Networks - Volume Part I
  • Year:
  • 2012

Abstract

Two backpropagation algorithms with momentum for feedforward neural networks with a single hidden layer are considered. The training samples are assumed to be supplied to the network in either a cyclic or an almost-cyclic fashion during learning. A re-start strategy for the momentum is adopted, in which the momentum coefficient is set to zero at the beginning of each training cycle. Corresponding weak and strong convergence results are presented. The convergence conditions on the learning rate, the momentum coefficient, and the activation functions are considerably relaxed compared with those of existing results. Numerical examples support the theoretical results and demonstrate that almost-cyclic learning with momentum (ACMFNN) outperforms cyclic learning with momentum (CMFNN) in both convergence speed and generalization ability.
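The two sample-presentation schemes and the momentum re-start strategy described in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the authors' exact algorithm or experimental setup: the network size, learning rate, momentum coefficient, and toy task are all assumptions chosen for the sketch. The only difference between the two schemes is whether the sample order within a cycle is fixed (cyclic) or reshuffled once per cycle (almost-cyclic); in both, the momentum terms are zeroed at the start of every cycle.

```python
import numpy as np

def train(X, Y, hidden=4, cycles=200, lr=0.1, beta=0.5,
          almost_cyclic=False, seed=0):
    """Incremental backprop with momentum for a single-hidden-layer
    tanh network (illustrative sketch; hyperparameters are assumed)."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], hidden))
    W2 = rng.normal(0.0, 0.5, (hidden, 1))
    for _ in range(cycles):
        # Re-start strategy: momentum buffers are reset to zero
        # at the beginning of each training cycle.
        V1 = np.zeros_like(W1)
        V2 = np.zeros_like(W2)
        # Cyclic: fixed sample order every cycle.
        # Almost-cyclic: a fresh random permutation each cycle.
        order = rng.permutation(len(X)) if almost_cyclic else np.arange(len(X))
        for i in order:
            x = X[i:i + 1]                          # 1 x d input
            h = np.tanh(x @ W1)                     # hidden activations
            y = h @ W2                              # linear output
            e = y - Y[i:i + 1]                      # output error
            g2 = h.T @ e                            # gradient w.r.t. W2
            g1 = x.T @ ((e @ W2.T) * (1.0 - h**2))  # gradient w.r.t. W1
            V2 = beta * V2 - lr * g2                # momentum updates
            V1 = beta * V1 - lr * g1
            W2 += V2
            W1 += V1
    preds = np.tanh(X @ W1) @ W2
    return float(np.mean((preds - Y) ** 2))

# Toy XOR-style regression task (assumed, for demonstration only).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = np.array([[0.], [1.], [1.], [0.]])
mse_cyclic = train(X, Y, almost_cyclic=False)
mse_almost = train(X, Y, almost_cyclic=True)
```

On this toy task, both variants should drive the training error well below the trivial baseline of predicting the target mean; the paper's experiments report that the almost-cyclic variant converges faster and generalizes better.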