Deterministic convergence of an online gradient method for BP neural networks

  • Authors:
  • Wei Wu; Guorui Feng; Zhengxue Li; Yuesheng Xu

  • Affiliations:
  • Appl. Math. Dept., Dalian Univ. of Technol., China

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2005

Abstract

Online gradient methods are widely used for training feedforward neural networks. In this paper, we prove a convergence theorem for an online gradient method with variable step size for backward propagation (BP) neural networks with a hidden layer. Unlike most existing convergence results, which are probabilistic and nonmonotone in nature, the convergence result established here is deterministic and monotone.
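To make the setting concrete, the following is a minimal sketch of an online gradient method for a one-hidden-layer sigmoid network with a decreasing (variable) step size. It is an illustrative assumption, not the paper's exact algorithm: the architecture, squared-error loss, and the particular schedule eta_m = eta0 / (1 + c*m) are all choices made here for demonstration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_online(X, y, n_hidden=4, epochs=200, eta0=0.5, c=1e-4, seed=0):
    """Online (per-sample) gradient descent for a one-hidden-layer network.

    Hypothetical sketch: weights are updated after every training sample,
    and the step size shrinks as eta_m = eta0 / (1 + c*m), where m counts
    the total number of updates so far.
    """
    rng = np.random.default_rng(seed)
    V = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))  # input -> hidden
    w = rng.normal(scale=0.5, size=n_hidden)                # hidden -> output
    losses, m = [], 0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):        # online: one sample at a time
            h = sigmoid(X[i] @ V)                # hidden-layer output
            out = sigmoid(h @ w)                 # network output
            err = out - y[i]
            eta = eta0 / (1.0 + c * m)           # variable (decreasing) step size
            g = err * out * (1.0 - out)          # gradient at the output unit
            grad_w = g * h                       # gradient w.r.t. output weights
            grad_V = np.outer(X[i], g * w * h * (1.0 - h))  # w.r.t. hidden weights
            w -= eta * grad_w
            V -= eta * grad_V
            m += 1
        pred = sigmoid(sigmoid(X @ V) @ w)
        losses.append(0.5 * np.mean((pred - y) ** 2))
    return losses

# Toy usage: a noisy linearly separable problem (illustrative data only).
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
losses = train_online(X, y)
print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

On this toy problem the recorded training loss decreases over the epochs, which is the kind of monotone behavior the paper's theorem makes precise under its stated conditions on the step size.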