Addressing the local minima problem by output monitoring and modification algorithms

  • Authors:
  • Sin-Chun Ng;Chi-Chung Cheung;Andrew Kwok-Fai Lui;Hau-Ting Tse

  • Affiliations:
  • School of Science and Technology, The Open University of Hong Kong, Homantin, Hong Kong;Department of Electronic and Information Engineering, The Hong Kong Polytechnic University, Hunghom, Hong Kong;School of Science and Technology, The Open University of Hong Kong, Homantin, Hong Kong;School of Science and Technology, The Open University of Hong Kong, Homantin, Hong Kong

  • Venue:
  • ISNN'12 Proceedings of the 9th international conference on Advances in Neural Networks - Volume Part I
  • Year:
  • 2012

Abstract

This paper proposes a new approach, output monitoring and modification (OMM), to address the local minimum problem in training feed-forward neural networks with existing gradient-descent algorithms (such as BP, Rprop and Quickprop). OMM monitors the learning process; when the process becomes trapped in a local minimum, OMM changes some incorrect output values so that learning can escape from it. This modification can be repeated with different parameter settings until the learning process converges to the global optimum. Simulation experiments show that a gradient-descent learning algorithm with OMM has a much better global convergence capability than one without OMM, while their convergence rates are similar. In one benchmark application, the global convergence capability increased from 1% to 100%.
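
The abstract describes OMM only at a high level, so the following Python sketch fills in the mechanics with illustrative assumptions: a stall is detected when the training error stops changing for a fixed number of epochs while remaining high, and the "incorrect output values" are modified by pushing the targets of badly-missed outputs past their nominal 0/1 values by a margin `push` (grown on each retry, echoing the paper's repeated re-application with different parameter settings). The network, thresholds, and modification rule are hypothetical, not the published algorithm; plain batch BP stands in for the gradient-descent learner.

```python
# Hedged sketch of the OMM idea from the abstract: monitor the error during
# plain gradient-descent (BP) training; when learning stalls at a high error
# (a suspected local minimum), temporarily exaggerate the targets of the
# incorrect outputs to push the weights off the flat region, then restore
# the true targets once the error drops. All thresholds and the exact
# modification rule below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, b1, W2, b2):
    H = sigmoid(X @ W1 + b1)          # hidden activations
    Y = sigmoid(H @ W2 + b2)          # network outputs
    return H, Y

def train_omm(X, T, n_hidden=3, lr=0.5, epochs=20000,
              stall_window=500, stall_tol=1e-5, err_high=0.05, push=0.3):
    n_in, n_out = X.shape[1], T.shape[1]
    W1 = rng.normal(0, 0.5, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.5, (n_hidden, n_out)); b2 = np.zeros(n_out)
    targets = T.copy()                # working targets (may be modified)
    prev_err, stall = np.inf, 0
    for epoch in range(epochs):
        H, Y = forward(X, W1, b1, W2, b2)
        err = np.mean((Y - targets) ** 2)
        # --- monitoring: count epochs with (nearly) unchanged error ------
        stall = stall + 1 if abs(prev_err - err) < stall_tol else 0
        prev_err = err
        true_err = np.mean((Y - T) ** 2)
        if stall >= stall_window and true_err > err_high:
            # --- modification: exaggerate targets of incorrect outputs ---
            wrong = np.abs(Y - T) > 0.5            # badly-missed outputs
            targets = T.copy()
            targets[wrong] = T[wrong] + push * np.sign(T[wrong] - 0.5)
            push *= 1.2   # retry with a larger margin on the next stall
            stall = 0
        elif true_err <= err_high:
            targets = T.copy()        # escaped: restore the true targets
        # --- plain backprop step on the (possibly modified) targets ------
        dY = (Y - targets) * Y * (1 - Y)
        dH = (dY @ W2.T) * H * (1 - H)
        W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(0)
        W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(0)
    return W1, b1, W2, b2

# XOR: a classic problem where plain BP can stall in a local minimum.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
T = np.array([[0], [1], [1], [0]], float)
W1, b1, W2, b2 = train_omm(X, T)
print(forward(X, W1, b1, W2, b2)[1].round(2))
```

Targets pushed outside [0,1] simply enlarge the error gradient on the stuck outputs, which is one plausible reading of "changing incorrect output values" to reshape the error surface near the minimum.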