Global Optimization for Neural Network Training

  • Authors:
  • Yi Shang; Benjamin W. Wah


  • Venue:
  • Computer, special issue on neural computing (companion issue to the Spring 1996 IEEE Computational Science & Engineering)
  • Year:
  • 1996


Abstract

Many learning algorithms find their roots in function-minimization algorithms that can be classified as local- or global-minimization algorithms. Algorithms that focus on either extreme, purely local search or purely global search, do not work well. The authors propose a hybrid method, called NOVEL (Nonlinear Optimization via External Lead), that combines global and local searches to explore the solution space, locate promising regions, and find local minima. To guide exploration of the solution space, it uses a continuous, terrain-independent trace that does not get trapped in local minima. NOVEL then uses the local gradient to attract the search toward a local minimum, but the trace pulls it back out once little improvement is found. NOVEL next selects one initial point from each promising region and runs a descent algorithm from these points to find local minima. It thus avoids applying computationally expensive descent algorithms to unpromising local minima reached from random starting points. In an implementation using differential- and difference-equation solvers, NOVEL demonstrated superior performance in five benchmark comparisons against the best global optimization algorithms.
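
The sketch below illustrates, in simplified form, the two-stage idea the abstract describes: an exploration stage in which a point is simultaneously attracted toward local minima by the gradient and pulled along by a continuous trace so it cannot get stuck, followed by ordinary gradient descent from the most promising points visited. It is not the authors' implementation; the trace function, coefficients (mu_grad, mu_trace), the Rastrigin test objective standing in for a training loss, and the selection of the lowest-valued trace points (rather than one point per region) are all illustrative assumptions.

```python
# A minimal sketch of a trace-guided exploration stage followed by local descent,
# loosely modeled on the NOVEL idea described in the abstract. All function and
# parameter choices here are hypothetical, not taken from the paper.
import numpy as np

def rastrigin(x):
    # Multimodal test function standing in for a neural-network training loss.
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def grad_rastrigin(x):
    return 2 * x + 20 * np.pi * np.sin(2 * np.pi * x)

def trace(t, dim):
    # A continuous, problem-independent trace that sweeps the search space.
    # (Hypothetical choice; the paper's trace function may differ.)
    return 4.0 * np.sin(0.1 * t + np.arange(dim))

def explore(f, grad_f, dim, steps=5000, dt=0.05, mu_grad=0.05, mu_trace=0.2):
    # Exploration stage: the gradient term pulls the point toward local minima,
    # while the trace term pulls it back out, so it keeps moving; low-loss
    # points visited along the way are recorded as candidate starting points.
    x = trace(0.0, dim).copy()
    visited = []
    for k in range(steps):
        t = k * dt
        x = x - dt * (mu_grad * grad_f(x) + mu_trace * (x - trace(t, dim)))
        visited.append((f(x), x.copy()))
    visited.sort(key=lambda p: p[0])
    return [p[1] for p in visited[:5]]   # a few promising starting points

def descend(f, grad_f, x0, steps=2000, lr=0.01):
    # Refinement stage: plain gradient descent from a promising starting point.
    x = x0.copy()
    for _ in range(steps):
        x = x - lr * grad_f(x)
    return x, f(x)

if __name__ == "__main__":
    starts = explore(rastrigin, grad_rastrigin, dim=2)
    results = [descend(rastrigin, grad_rastrigin, s) for s in starts]
    x_best, f_best = min(results, key=lambda r: r[1])
    print("best local minimum found:", x_best, "value:", f_best)
```

Because expensive descent runs are launched only from the recorded low-loss points rather than from random initializations, the refinement effort is concentrated on regions the exploration stage already found promising, which is the cost-saving the abstract emphasizes.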