Enhanced training algorithms, and integrated training/architecture selection for multilayer perceptron networks

  • Author: M. G. Bello
  • Affiliation: Charles Stark Draper Lab. Inc., Cambridge, MA
  • Venue: IEEE Transactions on Neural Networks
  • Year: 1992

Abstract

The standard backpropagation-based multilayer perceptron training algorithm suffers from a slow asymptotic convergence rate. Sophisticated nonlinear least-squares and quasi-Newton optimization techniques are used to construct enhanced multilayer perceptron training algorithms, which are then compared with the backpropagation algorithm on several example problems. In addition, an integrated approach to training and architecture selection that uses the described enhanced algorithms is presented, and its effectiveness is illustrated on both synthetic and real pattern recognition problems.
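
To make the abstract's core idea concrete, the sketch below (not the paper's implementation; the XOR toy task, network sizes, and all names are illustrative assumptions) trains a one-hidden-layer perceptron by handing the backpropagation-computed gradient to a quasi-Newton optimizer (BFGS via SciPy) instead of taking plain first-order gradient steps. A nonlinear least-squares variant in the spirit the abstract describes could analogously pass per-sample residuals to scipy.optimize.least_squares.

```python
# Minimal sketch: quasi-Newton (BFGS) training of a small MLP, assuming
# a one-hidden-layer tanh network and a sum-of-squares loss on XOR data.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# XOR toy problem (illustrative assumption, not from the paper)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

n_in, n_hid, n_out = 2, 4, 1

def unpack(w):
    """Split the flat parameter vector into layer weights and biases."""
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = w[i:i + n_out]
    return W1, b1, W2, b2

def loss_and_grad(w):
    """Sum-of-squares loss and its gradient via backpropagation."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)             # hidden-layer activations
    out = h @ W2 + b2                    # linear output layer
    err = out - y
    loss = 0.5 * np.sum(err ** 2)
    # Backward pass
    dW2 = h.T @ err
    db2 = err.sum(axis=0)
    dz1 = (err @ W2.T) * (1.0 - h ** 2)  # tanh derivative
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0)
    return loss, np.concatenate([dW1.ravel(), db1, dW2.ravel(), db2])

w0 = 0.5 * rng.standard_normal(n_in * n_hid + n_hid + n_hid * n_out + n_out)

# BFGS accumulates an inverse-Hessian approximation from successive
# gradients, typically converging in far fewer iterations than
# first-order backpropagation on small problems like this one.
res = minimize(loss_and_grad, w0, jac=True, method='BFGS',
               options={'gtol': 1e-8, 'maxiter': 500})

print(f"final loss: {res.fun:.2e} after {res.nit} iterations")
```

The design point the example illustrates is the one the abstract makes: the forward/backward passes are unchanged from standard backpropagation; only the update rule is replaced by a second-order method with a faster asymptotic convergence rate.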