An equalized error backpropagation algorithm for the on-line training of multilayer perceptrons

  • Authors:
  • J.-P. Martens; N. Weymaere

  • Affiliations:
  • Electron. & Inf. Syst., Ghent Univ., Gent

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2002

Abstract

The error backpropagation (EBP) training of a multilayer perceptron (MLP) may require a very large number of training epochs. Although the training time can usually be reduced considerably by adopting an on-line training paradigm, it can still be excessive when large networks must be trained on large amounts of data. In this paper, a new on-line training algorithm is presented. It is called equalized EBP (EEBP), and it offers improved accuracy, speed, and robustness against badly scaled inputs. A major characteristic of EEBP is its utilization of weight-specific learning rates whose relative magnitudes are derived from a priori computable properties of the network and the training data.
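
The central idea of the abstract, on-line (pattern-by-pattern) backpropagation with weight-specific learning rates, can be illustrated with a minimal sketch. The paper derives the relative magnitudes of these rates from a priori computable properties of the network and the training data; the fan-in and input-variance scaling used below is only an assumed placeholder for that derivation, and names such as `lr_W1`, the toy data, and the base learning rate are hypothetical.

```python
# Minimal sketch: on-line backpropagation for a one-hidden-layer MLP with
# per-weight learning rates. The scaling heuristic here (inverse input
# variance, inverse fan-in) is an illustrative assumption, not the EEBP
# formula from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 2-D inputs with deliberately unequal scales, binary targets.
X = rng.normal(size=(200, 2)) * np.array([1.0, 100.0])
y = (X[:, 0] + 0.01 * X[:, 1] > 0).astype(float).reshape(-1, 1)

n_in, n_hid, n_out = 2, 5, 1
W1 = rng.normal(scale=0.1, size=(n_in, n_hid))
b1 = np.zeros(n_hid)
W2 = rng.normal(scale=0.1, size=(n_hid, n_out))
b2 = np.zeros(n_out)

# Weight-specific learning rates: weights fed by a badly scaled input get
# proportionally smaller steps (assumed heuristic).
base_lr = 0.05
lr_W1 = base_lr / (np.var(X, axis=0, keepdims=True).T + 1.0)  # shape (n_in, 1), broadcasts
lr_W2 = np.full_like(W2, base_lr / n_hid)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(20):
    for x, t in zip(X, y):                       # on-line: update after every pattern
        h = sigmoid(x @ W1 + b1)                 # hidden activations
        o = sigmoid(h @ W2 + b2)                 # output activation
        delta_o = (o - t) * o * (1.0 - o)        # output error signal
        delta_h = (delta_o @ W2.T) * h * (1.0 - h)
        # Per-weight learning rates are applied element-wise to the gradients.
        W2 -= lr_W2 * np.outer(h, delta_o)
        b2 -= base_lr * delta_o
        W1 -= lr_W1 * np.outer(x, delta_h)
        b1 -= base_lr * delta_h
```

The sketch shows why such equalization helps with badly scaled inputs: a uniform learning rate would make the updates for weights attached to the large-variance input much larger than the rest, whereas rates scaled per weight keep the effective step sizes comparable.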