Original Contribution: An analysis of premature saturation in back propagation learning

  • Authors:
  • Youngjik Lee; Sang-Hoon Oh; Myung Won Kim

  • Venue:
  • Neural Networks
  • Year:
  • 1993

Abstract

The back propagation (BP) algorithm is widely used for finding optimum weights of multilayer neural networks in many pattern recognition applications. However, its critical drawbacks are slow learning speed and convergence to local minima. One of the major causes of these drawbacks is "premature saturation," a phenomenon in which the error of the neural network remains at a significantly high, nearly constant value for some period of time during learning. It is known to be caused by an inappropriate set of initial weights. In this paper, the probability of premature saturation at the beginning epoch of the learning procedure in the BP algorithm is derived in terms of the maximum magnitude of the initial weights, the number of nodes in each layer, and the maximum slope of the sigmoidal activation function; the result is verified by Monte Carlo simulation. Using this result, premature saturation can be avoided with proper initial weight settings.
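
The sketch below is a hypothetical illustration, not the paper's actual derivation or experiment: it uses a Monte Carlo estimate to show how the chance that a randomly initialized sigmoid network starts out with saturated outputs grows with the maximum initial weight magnitude. All names, layer sizes, and the saturation margin are assumptions made for the example.

```python
# Hypothetical Monte Carlo sketch (not the paper's code): estimate how often
# the output-layer nodes of a one-hidden-layer sigmoid network start out
# saturated, as a function of the maximum initial weight magnitude w_max.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def saturation_probability(n_in, n_hidden, n_out, w_max,
                           n_trials=2000, sat_margin=0.1, seed=None):
    """Fraction of random initializations whose output activations are
    already within `sat_margin` of 0 or 1 for a random input pattern."""
    rng = np.random.default_rng(seed)
    saturated = 0
    for _ in range(n_trials):
        # Uniform initial weights in [-w_max, w_max], a common BP setup.
        W1 = rng.uniform(-w_max, w_max, size=(n_hidden, n_in))
        W2 = rng.uniform(-w_max, w_max, size=(n_out, n_hidden))
        x = rng.uniform(-1.0, 1.0, size=n_in)   # random input pattern
        h = sigmoid(W1 @ x)                     # hidden activations
        y = sigmoid(W2 @ h)                     # output activations
        if np.all((y < sat_margin) | (y > 1.0 - sat_margin)):
            saturated += 1
    return saturated / n_trials

if __name__ == "__main__":
    # Larger initial weights push activations into the flat regions of the
    # sigmoid, where gradients vanish and the error can stay nearly constant.
    for w_max in (0.1, 0.5, 1.0, 2.0, 5.0):
        p = saturation_probability(n_in=10, n_hidden=20, n_out=5, w_max=w_max)
        print(f"w_max={w_max:4.1f}  P(initially saturated) ~ {p:.3f}")
```

In this toy setting, small weight ranges keep activations near the high-slope center of the sigmoid, while large ranges drive most trials into saturation, which is consistent with the abstract's point that proper initial weight settings help avoid premature saturation.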