A fast learning algorithm based on layered Hessian approximations and the pseudoinverse

  • Authors:
  • E. J. Teoh, C. Xiang, K. C. Tan

  • Affiliations:
  • Department of Electrical and Computer Engineering, National University of Singapore, Singapore (all authors)

  • Venue:
  • ISNN'06: Proceedings of the Third International Conference on Advances in Neural Networks, Part I
  • Year:
  • 2006

Abstract

In this article, we present a simple, effective method for training an MLP that is based on approximating the Hessian using only local information, specifically, the correlations of output activations from previous layers of hidden neurons. Training the hidden-layer weights with this Hessian approximation, combined with training the final output layer of weights using the pseudoinverse [1], yields improved performance at a fraction of the computational and structural complexity of conventional learning algorithms.
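The pseudoinverse step mentioned in the abstract can be illustrated as follows. This is a minimal sketch, not the authors' implementation: it assumes a single fixed hidden layer with tanh activations and random toy data, and fits the output-layer weights as the least-squares solution of H W ≈ T via the Moore-Penrose pseudoinverse.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 100 samples, 4 inputs, 3 target outputs (shapes are illustrative).
X = rng.standard_normal((100, 4))
T = rng.standard_normal((100, 3))

# Hidden layer held fixed for this sketch: 10 neurons, tanh activation.
# (In the paper these weights would be trained with the Hessian approximation.)
W_hidden = rng.standard_normal((4, 10))
H = np.tanh(X @ W_hidden)  # hidden-layer output activations, shape (100, 10)

# Output-layer weights in one step via the pseudoinverse:
# W_out minimizes ||H @ W - T|| in the least-squares sense.
W_out = np.linalg.pinv(H) @ T  # shape (10, 3)

# Network prediction with the fitted output layer.
Y = H @ W_out
```

Because the pseudoinverse gives the minimum-norm least-squares solution in closed form, the output layer requires no iterative gradient descent, which is the source of the computational savings the abstract refers to.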