Fast training algorithms for multilayer neural nets

  • Authors: R. P. Brent
  • Affiliation: Comput. Sci. Lab., Australian Nat. Univ., Canberra, ACT
  • Venue: IEEE Transactions on Neural Networks
  • Year: 1991

Abstract

An algorithm that is faster than back-propagation and for which it is not necessary to specify the number of hidden units in advance is described. The relationship with other fast pattern-recognition algorithms, such as algorithms based on k-d trees, is discussed. The algorithm has been implemented and tested on artificial problems, such as the parity problem, and on real problems arising in speech recognition. Experimental results, including training times and recognition accuracy, are given. Generally, the algorithm achieves accuracy as good as or better than that of nets trained using back-propagation. Accuracy is comparable to that of the nearest-neighbor algorithm, which is slower and requires more storage space.
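
To make the comparison baselines concrete, the sketch below shows the kind of fast pattern-recognition method the abstract refers to: a 1-nearest-neighbor classifier backed by a k-d tree. The 4-bit parity problem is used here purely as stand-in data (the paper also tests on parity, but this is not the paper's algorithm or experimental setup), and the helper name `classify` is an assumption for illustration.

```python
# Illustrative sketch only: a k-d tree 1-nearest-neighbor classifier,
# the class of method the abstract compares against (not Brent's algorithm).
from itertools import product

import numpy as np
from scipy.spatial import KDTree

# Training data: all 4-bit patterns, labelled by their parity (XOR of bits).
X = np.array(list(product([0, 1], repeat=4)), dtype=float)
y = X.sum(axis=1).astype(int) % 2

tree = KDTree(X)  # build the k-d tree over the stored patterns


def classify(pattern):
    """Return the parity label of the nearest stored pattern."""
    _, idx = tree.query(np.asarray(pattern, dtype=float), k=1)
    return int(y[idx])


print(classify([1, 0, 1, 1]))  # -> 1 (odd number of ones)
```

As the abstract notes, such nearest-neighbor methods must store the training patterns themselves, which is the source of the extra storage cost compared with a trained network.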