Comparative Study of the CG and HBF ODEs Used in the Global Minimization of Nonconvex Functions

  • Authors:
  • Amit Bhaya, Fernando A. Pazos, Eugenius Kaszkurewicz

  • Affiliation (all authors):
  • Department of Electrical Engineering, COPPE/UFRJ, Rio de Janeiro, Brazil 21945/970

  • Venue:
  • ICANN '09 Proceedings of the 19th International Conference on Artificial Neural Networks: Part I
  • Year:
  • 2009

Abstract

This paper presents a unified control Liapunov function (CLF) approach to the design of heavy ball with friction (HBF) and conjugate gradient (CG) neural networks for minimizing scalar nonconvex functions that have continuous first- and second-order derivatives and a unique global minimum. This approach leads naturally to second-order differential equations that serve as the mathematical models of the corresponding neural network implementations. Preliminary numerical simulations on a small suite of benchmark test problems indicate that a continuous version of the well-known conjugate gradient algorithm, designed by the proposed CLF method, outperforms its HBF competitor.
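To make the HBF dynamics concrete, the sketch below numerically integrates the standard heavy-ball-with-friction ODE, x'' + γ x' + ∇f(x) = 0, on a simple one-dimensional nonconvex function. The test function, friction coefficient, step size, and horizon are illustrative assumptions, not taken from the paper, and the integrator (semi-implicit Euler) is just one convenient discretization of the continuous trajectory.

```python
# Hypothetical 1D nonconvex test function with a unique global minimum
# (illustrative choice; the paper uses its own benchmark suite).
def f(x):
    return 0.25 * x**4 - 0.5 * x**2 + 0.1 * x

def grad_f(x):
    return x**3 - x + 0.1

def hbf_trajectory(x0, gamma=0.8, dt=0.01, steps=5000):
    """Integrate the HBF ODE  x'' + gamma*x' + grad_f(x) = 0
    with semi-implicit Euler; gamma, dt, steps are illustrative."""
    x, v = x0, 0.0
    for _ in range(steps):
        v += dt * (-gamma * v - grad_f(x))  # friction + restoring force
        x += dt * v                          # position update
    return x

x_star = hbf_trajectory(1.5)
print(x_star, f(x_star), grad_f(x_star))
```

Because the friction term dissipates energy, the trajectory settles at a stationary point of f; whether that point is the global minimum depends on the initial condition and the friction coefficient, which is precisely the tuning issue the CLF design addresses.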