An improved backpropagation algorithm using absolute error function

  • Authors:
  • Jiancheng Lv; Zhang Yi

  • Affiliations:
  • Computational Intelligence Laboratory, School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu, China (both authors)

  • Venue:
  • ISNN'05: Proceedings of the Second International Conference on Advances in Neural Networks - Volume Part I
  • Year:
  • 2005

Abstract

An improved backpropagation algorithm is proposed that uses the Lyapunov method to minimize the absolute error function. The improved algorithm drives both the error and the gradient toward zero, so the local minima problem can be avoided. In addition, because the absolute error function is used, the algorithm is more robust and learns faster than backpropagation with the traditional squared error function when the target signals contain some incorrect data. The paper also presents a method of using Lyapunov stability theory to derive a learning algorithm that directly minimizes the absolute error function.
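The robustness claim above can be illustrated with a minimal sketch. This is not the paper's Lyapunov-derived algorithm; it is plain (sub)gradient descent on a tiny one-hidden-layer network, comparing the absolute (L1) error against the traditional squared (L2) error when some target values are corrupted. All function and variable names are our own choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def train(loss, x, t, epochs=500, lr=0.05, hidden=8):
    """Train a one-hidden-layer tanh network; loss is 'l1' or 'l2'.

    For the absolute error, the (sub)gradient of |e| w.r.t. the output
    is sign(e); for the squared error (times 1/2) it is e itself.
    """
    w1 = rng.normal(scale=0.5, size=(x.shape[1], hidden))
    w2 = rng.normal(scale=0.5, size=(hidden, 1))
    for _ in range(epochs):
        h = np.tanh(x @ w1)                  # hidden activations
        y = h @ w2                           # network output
        e = y - t                            # output error
        d = np.sign(e) if loss == "l1" else e
        # Backpropagate d through the two layers (averaged over samples)
        w2 -= lr * h.T @ d / len(x)
        w1 -= lr * x.T @ ((d @ w2.T) * (1 - h**2)) / len(x)
    return w1, w2

def predict(w1, w2, x):
    return np.tanh(x @ w1) @ w2

# Data: targets follow y = x, but every 10th target is grossly wrong,
# mimicking the "incorrect data" setting described in the abstract.
x = np.linspace(-1, 1, 40).reshape(-1, 1)
t = x.copy()
t[::10] += 5.0                               # inject target outliers

clean = np.linspace(-1, 1, 40).reshape(-1, 1)
for loss in ("l1", "l2"):
    w1, w2 = train(loss, x, t)
    err = float(np.mean(np.abs(predict(w1, w2, clean) - clean)))
    print(loss, round(err, 3))
```

The intuition matches the abstract: the L1 subgradient contributes a bounded sign(e) per sample, so a few wildly wrong targets cannot dominate the update, whereas the L2 gradient grows linearly with the residual and lets outliers pull the fit away from the clean data.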