A Global Minimization Algorithm Based on a Geodesic of a Lagrangian Formulation of Newtonian Dynamics

  • Authors:
  • Joon Shik Kim; Jong Chan Kim; Jangmin O; Byoung-Tak Zhang

  • Affiliations:
  • Department of Physics and Astronomy, Seoul National University, Seoul, Republic of Korea 151-747; School of Computer Science and Engineering, Seoul National University, Seoul, Republic of Korea 151-744

  • Venue:
  • Neural Processing Letters
  • Year:
  • 2007

Abstract

The global minimum search problem is important in neural networks because the error cost forms a multi-minima potential in the weight parameter space. Parameters that yield the global minimum of the cost function are therefore the best values for enhancing the performance of a neural network. Previously, a global minimum search based on a damped oscillator equation, known as the heavy ball with friction (HBF), was studied. The ball escapes a local minimum if its kinetic energy is sufficiently large; otherwise, friction causes it to converge to that local minimum. However, no appropriate damping coefficient has been found for the HBF, so the ball must be shot again after it arrives at each local minimum until it finds the global minimum. To solve this problem, we determined an adaptive damping coefficient using the geodesic of the Lagrangian of Newtonian dynamics. This geometric method produces a second-order, adaptively damped oscillator equation whose damping coefficient is the negative time derivative of the logarithm of the cost potential. Furthermore, we obtained a novel adaptive steepest descent by discretizing this second-order equation. To investigate the performance of this adaptive steepest descent, we applied the first-order update rule to Rosenbrock- and Griewank-type potentials. The results show that our method found the global minimum in most cases from various initial points. Our adaptive steepest descent may be applied in many fields related to global minimum search, such as neural networks, game theory, and economics.
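The abstract does not give the update rule itself. The listing below is a minimal sketch, assuming the second-order dynamics x'' + gamma(t) x' + grad V(x) = 0 with the adaptive damping gamma(t) = -d/dt ln V(x(t)) described above, discretized with a simple semi-implicit Euler step and applied to the Rosenbrock potential. The step size, damping clipping, iteration count, and function names are illustrative assumptions, not the authors' published settings.

    # A minimal sketch of an adaptively damped heavy-ball iteration in the spirit
    # of the abstract; this is NOT the authors' exact update rule.  The damping
    # coefficient is the negative discrete time derivative of log V(x), and the
    # dynamics x'' + gamma*x' + grad V(x) = 0 are integrated with a semi-implicit
    # Euler step.  Step size, clipping, and iteration count are illustrative.
    import numpy as np

    def rosenbrock(x):
        # Rosenbrock potential: sum of 100*(x[i+1] - x[i]^2)^2 + (1 - x[i])^2.
        return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

    def rosenbrock_grad(x):
        # Analytic gradient of the Rosenbrock potential.
        g = np.zeros_like(x)
        g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1] ** 2) - 2.0 * (1.0 - x[:-1])
        g[1:] += 200.0 * (x[1:] - x[:-1] ** 2)
        return g

    def adaptive_hbf(x0, dt=1e-3, n_steps=200000, eps=1e-12):
        # Heavy ball with adaptive damping gamma = -d(log V)/dt, clipped at zero
        # for numerical stability (the clipping is an added assumption).
        x = np.array(x0, dtype=float)
        v = np.zeros_like(x)                       # velocity x'
        log_v_prev = np.log(rosenbrock(x) + eps)
        for _ in range(n_steps):
            log_v = np.log(rosenbrock(x) + eps)
            gamma = max(-(log_v - log_v_prev) / dt, 0.0)
            log_v_prev = log_v
            # Semi-implicit Euler: the damping term is treated implicitly.
            v = (v - dt * rosenbrock_grad(x)) / (1.0 + dt * gamma)
            x = x + dt * v
        return x

    if __name__ == "__main__":
        x_star = adaptive_hbf(np.array([-1.2, 1.0]))
        print("approximate minimizer:", x_star, "V =", rosenbrock(x_star))

Heuristically, while the potential is falling quickly the coefficient gamma is large and the trajectory behaves like steepest descent, whereas near a flat local minimum gamma shrinks and the retained momentum can carry the ball over the surrounding barrier; this reading is an interpretation of the abstract, not a statement from the paper.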