Unified control Liapunov function based design of neural networks that aim at global minimization of nonconvex functions

  • Authors:
  • Fernando A. Pazos; Amit Bhaya; Eugenius Kaszkurewicz


  • Venue:
  • IJCNN'09 Proceedings of the 2009 international joint conference on Neural Networks
  • Year:
  • 2009


Abstract

This paper presents a unified approach to the design of neural networks that aim to minimize scalar nonconvex functions that have continuous first- and second-order derivatives and a unique global minimum. The approach interprets the function as a controlled object: its output (the function value) must be driven to its smallest value by suitable manipulation of its inputs. This is achieved with the control Liapunov function (CLF) technique, well known in systems and control theory. The approach leads naturally to the design of second-order differential equations, which are the mathematical models of the corresponding implementations as neural networks. Preliminary numerical simulations indicate that, on a small suite of benchmark test problems, a continuous version of the well-known conjugate gradient algorithm, designed by the proposed CLF method, outperforms its competitors, such as the heavy ball with friction method and the more recent dynamic inertial Newton-like method.
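As a rough illustration of the second-order dynamics the abstract refers to, the sketch below numerically integrates the heavy ball with friction ODE, ẍ + γẋ + ∇f(x) = 0, for a simple nonconvex scalar function. This is not the paper's CLF-designed method; the test function, damping coefficient, and step size are illustrative assumptions.

```python
import numpy as np

def grad_f(x):
    # Gradient of the illustrative nonconvex function
    # f(x) = x^4/4 - x^2/2 + 0.2*x (two wells; global minimum near x = -1.088).
    return x**3 - x + 0.2

def heavy_ball_flow(grad, x0, gamma=5.0, dt=1e-3, steps=50_000):
    """Semi-implicit Euler discretization of x'' + gamma*x' + grad f(x) = 0.

    gamma is the friction coefficient; with heavy damping the trajectory
    settles into the minimum of the basin containing x0.
    """
    x = float(x0)
    v = 0.0
    for _ in range(steps):
        v += dt * (-gamma * v - grad(x))  # velocity update (friction + force)
        x += dt * v                        # position update with new velocity
    return x

# Starting in the basin of the global minimum, the flow converges there.
x_star = heavy_ball_flow(grad_f, x0=-2.5)
```

The discretization above is the naive explicit scheme; the paper's point is that viewing f as a controlled plant lets the CLF machinery derive such second-order flows (and better ones, e.g. the continuous conjugate gradient variant) systematically rather than ad hoc.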