A new two-step gradient-based backpropagation training method for neural networks

  • Authors:
  • Xuewen Mu; Yaling Zhang

  • Affiliations:
  • Department of Applied Mathematics, Xidian University, Xi'an, China; Department of Computer Science, Xi'an Science and Technology University, Xi'an, China

  • Venue:
  • ISNN'10 Proceedings of the 7th international conference on Advances in Neural Networks - Volume Part I
  • Year:
  • 2010

Abstract

A new two-step gradient-based backpropagation training method for neural networks is proposed in this paper. Based on the Barzilai-Borwein steplength update rule and the technique of the Resilient Gradient Descent (Rprop) method, we give a new descent direction and steplength update rule. The new two-step learning rate improves both convergence speed and success rate. Experimental results show that the proposed method converges considerably faster and, on the chosen test problems, outperforms other well-known training methods.
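The abstract does not give the full algorithm, but the Barzilai-Borwein steplength rule it builds on is standard: with s = x_k - x_{k-1} and y = g_k - g_{k-1}, the steplength is set to (s^T s)/(s^T y). The sketch below is an illustrative implementation of plain BB-step gradient descent on a quadratic, not the paper's combined method; the function name and the zero-division guard are our own choices.

```python
import numpy as np

def bb_gradient_descent(grad, x0, alpha0=0.01, iters=50):
    """Gradient descent with the Barzilai-Borwein (BB1) steplength.

    Illustrative sketch only: the paper additionally uses Rprop-style
    sign information; here we show just the BB update
    alpha_k = (s^T s) / (s^T y), with s = x_k - x_{k-1},
    y = g_k - g_{k-1}.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0
    for _ in range(iters):
        x_new = x - alpha * g      # gradient step with current steplength
        g_new = grad(x_new)
        s = x_new - x              # change in iterates
        y = g_new - g              # change in gradients
        sy = s @ y
        if abs(sy) > 1e-12:        # guard against division by (near) zero
            alpha = (s @ s) / sy   # BB1 steplength for the next step
        x, g = x_new, g_new
    return x

# Minimise the simple quadratic f(x) = 0.5 * x^T A x, whose minimiser is 0.
A = np.array([[3.0, 0.0], [0.0, 1.0]])
x_star = bb_gradient_descent(lambda x: A @ x, x0=[2.0, 1.0])
```

On well-conditioned quadratics the BB steplength adapts to the local curvature, which is the source of the speed-up over a fixed learning rate that the paper exploits.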