Original Contribution: An analytical framework for optimizing neural networks

  • Authors: Andrew H. Gee; Sreeram V. B. Aiyer; Richard W. Prager

  • Venue: Neural Networks
  • Year: 1993

Abstract

There has recently been much research interest in the use of feedback neural networks to solve combinatorial optimization problems. Although initial results were disappointing, it has since been demonstrated how modified network dynamics and better problem mapping can greatly improve the solution quality. The aim of this paper is to build on this progress by presenting a new analytical framework in which problem mappings can be evaluated without recourse to purely experimental means. A linearized analysis of the Hopfield network's dynamics forms the main theory of the paper, followed by a series of experiments in which some problem mappings are investigated in the context of these dynamics. The experimental results are seen to be compatible with the linearized theory, and observed weaknesses in the mappings are fully explained within the framework. What emerges is a largely analytical technique for evaluating candidate problem mappings, without recourse to the more usual trial and error.
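The abstract itself gives no equations, but the dynamics it refers to are those of the standard continuous (Hopfield–Tank style) feedback network, in which a symmetric connection matrix T and bias vector i_b encode the "problem mapping" whose quality the paper's linearized framework is meant to assess. The sketch below is a minimal, assumed illustration of those baseline dynamics, not the paper's own analysis; the function names, the toy one-out-of-n mapping, and all parameter values are illustrative choices.

```python
# Minimal sketch of continuous Hopfield dynamics for optimization (assumed
# textbook formulation, not the paper's framework). The matrix T and bias
# i_b constitute the problem mapping being evaluated.
import numpy as np

def hopfield_descent(T, i_b, n_steps=2000, dt=0.01, tau=1.0, u0=0.1, seed=0):
    """Euler-integrate du/dt = -u/tau + T v + i_b with v = g(u); return v."""
    rng = np.random.default_rng(seed)
    n = len(i_b)
    u = 0.01 * rng.standard_normal(n)          # small random initial state
    for _ in range(n_steps):
        v = 0.5 * (1.0 + np.tanh(u / u0))      # sigmoidal activation g(u)
        u += dt * (-u / tau + T @ v + i_b)     # one Euler step of the dynamics
    return 0.5 * (1.0 + np.tanh(u / u0))

def energy(v, T, i_b):
    """E = -0.5 v^T T v - i_b^T v (the Liapunov energy, up to the usual leak-term integral)."""
    return -0.5 * v @ T @ v - i_b @ v

# Toy mapping: 4 mutually inhibiting units whose energy minima favour exactly
# one unit "on" -- a crude one-out-of-n constraint standing in for a real
# combinatorial problem mapping. A is an assumed penalty weight.
n = 4
A = 2.0
T = -A * (np.ones((n, n)) - np.eye(n))         # mutual inhibition, zero diagonal
i_b = 0.5 * np.ones(n)                         # bias encouraging units to switch on
v = hopfield_descent(T, i_b)
print("final state:", np.round(v, 3), " energy:", round(float(energy(v, T, i_b)), 3))
```

Running the sketch, the small random initial state breaks the symmetry and the network settles with a single unit near 1 and the rest near 0, the lowest-energy configuration under this toy mapping. Evaluating candidate choices of T and i_b analytically, rather than by such trial simulations, is precisely the gap the paper's linearized framework addresses.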