Overcoming the local-minimum problem in training multilayer perceptrons with the NRAE-MSE training method

  • Authors:
  • James Ting-Ho Lo (Department of Mathematics and Statistics, University of Maryland, Baltimore County, Baltimore, Maryland)
  • Yichuan Gui (Department of Computer Science and Electrical Engineering, University of Maryland, Baltimore County, Baltimore, Maryland)
  • Yun Peng (Department of Computer Science and Electrical Engineering, University of Maryland, Baltimore County, Baltimore, Maryland)

  • Venue:
  • ISNN'13: Proceedings of the 10th International Conference on Advances in Neural Networks - Volume Part I
  • Year:
  • 2013

Abstract

The normalized risk-averting error (NRAE) training method presented at ISNN 2012 is capable of overcoming the local-minimum problem in training neural networks, but its overall success rate is unsatisfactory. Motivated by this shortcoming, a modification, called the NRAE-MSE training method, is herein proposed. The new method trains neural networks with respect to the NRAE criterion with a fixed λ in the range of 10^6 to 10^11, and from time to time takes excursions during which training proceeds with respect to the standard mean squared error (MSE). Once an excursion produces a satisfactory cross-validated MSE, the entire NRAE-MSE training stops. Numerical experiments show that the NRAE-MSE training method achieves a success rate of 100% in all the testing examples, each starting from a large number of randomly selected initial weights.
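To make the alternation concrete, here is a minimal sketch of an NRAE-MSE-style training loop. It assumes the NRAE criterion of the authors' earlier work, C_λ(w) = (1/λ) ln[(1/K) Σ_k exp(λ e_k(w))], where e_k(w) is the squared error on training sample k; the gradient of C_λ is then a softmax(λe)-weighted combination of the per-sample squared-error gradients, which is what the code exploits. The tiny MLP, all function and parameter names, and the schedule constants (learning rate, excursion frequency and length, tolerance) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_params(hidden=10):
    """Tiny 1-hidden-layer MLP; the architecture is illustrative only."""
    return {"W1": rng.normal(0.0, 1.0, (hidden, 1)), "b1": np.zeros(hidden),
            "W2": rng.normal(0.0, 1.0, (1, hidden)), "b2": np.zeros(1)}

def forward(p, x):                        # x: (K, 1)
    h = np.tanh(x @ p["W1"].T + p["b1"])  # hidden activations, (K, hidden)
    return h @ p["W2"].T + p["b2"], h     # outputs, (K, 1)

def per_sample(p, x, y):
    """Squared errors e_k and the gradient of each e_k w.r.t. every weight."""
    out, h = forward(p, x)
    r = out - y                                   # residuals, (K, 1)
    e = (r ** 2).sum(axis=1)                      # e_k = ||y_k - out_k||^2
    dout = 2.0 * r                                # de_k / dout_k
    dpre = (dout @ p["W2"]) * (1.0 - h ** 2)      # backprop through tanh
    return e, {"W2": dout[:, :, None] * h[:, None, :],
               "b2": dout,
               "W1": dpre[:, :, None] * x[:, None, :],
               "b1": dpre}

def nrae_weights(e, lam):
    """softmax(lam * e): the NRAE gradient emphasizes the worst-fit samples."""
    z = lam * e
    z -= z.max()               # log-sum-exp shift keeps exp() finite for huge lam
    w = np.exp(z)
    return w / w.sum()

def gd_step(p, x, y, lam, lr):
    """One descent step; lam=None means plain MSE (uniform sample weights)."""
    e, g = per_sample(p, x, y)
    w = nrae_weights(e, lam) if lam is not None else np.full(len(e), 1.0 / len(e))
    for k in p:
        p[k] -= lr * np.tensordot(w, g[k], axes=1)  # weighted sum over samples

def train_nrae_mse(x, y, xv, yv, lam=1e6, lr=1e-3, max_steps=20000,
                   excursion_every=200, excursion_steps=50, tol=1e-3):
    """Train w.r.t. NRAE; periodically take an MSE excursion and test it."""
    p = init_params()
    for step in range(1, max_steps + 1):
        gd_step(p, x, y, lam, lr)              # NRAE training step
        if step % excursion_every == 0:        # excursion: switch to MSE
            q = {k: v.copy() for k, v in p.items()}
            for _ in range(excursion_steps):
                gd_step(q, x, y, None, lr)
            ev, _ = per_sample(q, xv, yv)
            if ev.mean() < tol:                # satisfactory cross-validated MSE:
                return q                       # stop the entire training
            # otherwise discard the excursion; NRAE training resumes from p
    return p

# Example run on a toy regression problem (data made up for the demo).
x = rng.uniform(-2.0, 2.0, (64, 1)); y = np.sin(2.0 * x)
xv = rng.uniform(-2.0, 2.0, (32, 1)); yv = np.sin(2.0 * xv)
params = train_nrae_mse(x, y, xv, yv)
```

In this sketch an unsuccessful excursion is simply discarded, so NRAE training resumes from the pre-excursion weights, matching the abstract's description of excursions taken from time to time. For λ near 10^6 the softmax weights concentrate almost entirely on the worst-fit samples, so the log-sum-exp shift in nrae_weights is essential to avoid overflow.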