Stochastic high-order Hopfield neural networks

  • Authors:
  • Yi Shen; Guoying Zhao; Minghui Jiang; Shigeng Hu

  • Affiliations:
  • Yi Shen, Guoying Zhao, Minghui Jiang: Department of Control Science and Engineering, Huazhong University of Science and Technology, Wuhan, Hubei, China; Shigeng Hu: Department of Mathematics, Huazhong University of Science and Technology, Wuhan, Hubei, China

  • Venue:
  • ICNC'05 Proceedings of the First International Conference on Advances in Natural Computation - Volume Part I
  • Year:
  • 2005

Abstract

In 1984, Hopfield showed that the time evolution of a symmetric Hopfield neural network is a motion in state space that seeks out minima of the energy function (i.e., the equilibrium point set of the network). High-order Hopfield neural networks have more extensive applications than standard Hopfield neural networks, and the convergence of such networks has been studied. In practice, however, a neural network is often subject to environmental noise. It is therefore useful and interesting to determine whether a high-order neural network still approaches some limit set under stochastic perturbation. In this paper, we give a number of useful bounds on the noise intensity under which the stochastic high-order neural network approaches its limit set. Our result removes the requirement that the connection weight matrix be symmetric and includes the classical result on Hopfield neural networks, which are a special case of stochastic high-order Hopfield neural networks. Finally, an example is given to verify the effectiveness of our results.
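
For concreteness, a typical stochastic high-order (second-order) Hopfield model of the kind discussed above can be written as an Itô stochastic differential equation. The sketch below is an illustrative assumption, not the exact formulation of the paper: the state variables x_i, activation functions g_j, first- and second-order connection weights a_ij and b_ijk, external inputs I_i, and noise intensities σ_i are all hypothetical notation introduced here.

```latex
% Assumed (illustrative) form of a stochastic high-order Hopfield network:
% the drift contains the usual first-order Hopfield terms plus second-order
% (product) interaction terms, and \sigma_i scales the environmental noise
% driven by the Brownian motions w_i(t).
\[
  dx_i(t) = \Big[ -c_i\, x_i(t)
            + \sum_{j=1}^{n} a_{ij}\, g_j\big(x_j(t)\big)
            + \sum_{j=1}^{n}\sum_{k=1}^{n} b_{ijk}\, g_j\big(x_j(t)\big)\, g_k\big(x_k(t)\big)
            + I_i \Big]\, dt
            + \sigma_i\big(x(t)\big)\, dw_i(t),
  \qquad i = 1,\dots,n.
\]
% With b_{ijk} = 0 and \sigma_i \equiv 0 this reduces to the classical
% deterministic first-order Hopfield network; in the symmetric case
% (a_{ij} = a_{ji}) that network admits the Hopfield (1984) energy function
\[
  E(x) = -\tfrac{1}{2}\sum_{i,j} a_{ij}\, g_i(x_i)\, g_j(x_j)
         + \sum_{i} c_i \int_{0}^{g_i(x_i)} g_i^{-1}(v)\, dv
         - \sum_{i} I_i\, g_i(x_i),
\]
% whose minima constitute the limit (equilibrium) set referred to in the abstract.
```

The paper's bounds on the noise intensity can be read, in this illustrative notation, as conditions on σ_i under which trajectories of the perturbed system still approach that limit set.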