A general updating rule for discrete Hopfield-type neural network with delay

  • Authors:
  • Shenshan Qiu; Eric C. C. Tsang; Daniel S. Yeung; Xizhao Wang

  • Affiliations:
  • Department of Automatic Control Engineering, South China University of Technology, Guangzhou, China; Dept. of Computing, The Hong Kong Polytechnic University; Dept. of Computing, The Hong Kong Polytechnic University; Dept. of Computing, The Hong Kong Polytechnic University

  • Venue:
  • IJCAI'01: Proceedings of the 17th International Joint Conference on Artificial Intelligence - Volume 2
  • Year:
  • 2001

Abstract

In this paper, the Hopfield neural network with delay (HNND) is studied as a computational model for optimization. Two general updating rules for networks with delay (GURD), based on Hopfield-type neural networks with delay for optimization problems and characterized by dynamic thresholds, are given. It is proved that under any sequence of updating modes, the GURD converges monotonically to a stable state of the network. The diagonal elements of the connection matrix are shown to have an important influence on the convergence process; they capture the relationship between the local maxima of the energy function and the stable states of the network. All ordinary discrete Hopfield neural network (DHNN) algorithms are instances of the GURD. It is further shown that the convergence conditions of the GURD may be relaxed in applications; for instance, the requirement of nonnegative diagonal elements of the connection matrix can be removed from the original convergence theorem. A new updating mode and restrictive conditions guarantee that the network reaches a local maximum of the energy function.
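The abstract does not spell out the GURD rule itself, so the sketch below only illustrates the kind of model it generalizes: a minimal, hypothetical discrete Hopfield-type network with a one-step delay term, updated asynchronously, paired with an energy function written so that updates are intended to increase it (matching the abstract's "local maximum" phrasing). The matrices W and V, the threshold vector theta, and the sweep-based handling of the delayed state are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def async_update(W, V, theta, x_now, x_delayed, i):
    """Asynchronously update neuron i of a discrete Hopfield-type network with delay.

    Hypothetical formulation (the abstract does not give the exact GURD rule):
        h_i = sum_j W[i, j] * x_now[j] + sum_j V[i, j] * x_delayed[j] - theta[i]
        x_i <- +1 if h_i > 0, -1 if h_i < 0, unchanged if h_i == 0.
    """
    h_i = W[i] @ x_now + V[i] @ x_delayed - theta[i]
    x_next = x_now.copy()
    if h_i > 0:
        x_next[i] = 1
    elif h_i < 0:
        x_next[i] = -1
    return x_next

def energy(W, V, theta, x_now, x_delayed):
    """One common energy convention, signed so that updates are meant to increase it."""
    return 0.5 * x_now @ W @ x_now + x_now @ V @ x_delayed - theta @ x_now

def run(W, V, theta, x0, x_minus1, sweeps=50, seed=0):
    """Sweep neurons in a random order; the delayed state is frozen at the previous sweep."""
    rng = np.random.default_rng(seed)
    x_delayed, x = x_minus1.copy(), x0.copy()
    for _ in range(sweeps):
        x_start = x.copy()
        for i in rng.permutation(len(x)):
            x = async_update(W, V, theta, x, x_delayed, i)
        if np.array_equal(x, x_start) and np.array_equal(x, x_delayed):
            return x  # stable state of the delayed dynamics: no neuron changes
        x_delayed = x_start
    return x

# Toy usage: 4 neurons, symmetric instantaneous weights, weak delayed self-coupling.
rng = np.random.default_rng(1)
n = 4
W = rng.standard_normal((n, n)); W = (W + W.T) / 2
V = 0.1 * np.eye(n)
theta = np.zeros(n)
x0 = rng.choice([-1, 1], size=n).astype(float)
print(run(W, V, theta, x0, x0.copy()))
```

In this sketch the choice of a random sweep order stands in for the abstract's "any sequence of updating modes"; the paper's actual convergence results concern the GURD rules and conditions on the diagonal of the connection matrix, which are not reproduced here.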