A Winner-Take-All Neural Network of N Linear Threshold Neurons without Self-Excitatory Connections

  • Authors:
  • Hong Qu; Zhang Yi; Xiaobin Wang

  • Affiliations:
  • School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu, People's Republic of China 610054 (all three authors)

  • Venue:
  • Neural Processing Letters
  • Year:
  • 2009

Abstract

Multistable neural networks have attracted much interest in recent years, since monostable networks are computationally restricted. This paper studies a recurrent network of N linear threshold neurons without self-excitatory connections. Our analysis shows that this network performs Winner-Take-All (WTA) behavior, which has been recognized as a basic computational model in the brain. The contributions of this paper are: (1) a mathematical proof that the proposed model is non-divergent; (2) a study of an important implication (Winner-Take-All) of the proposed network; (3) digital computer simulations that validate the theoretical findings.
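
The abstract does not reproduce the network equations, so the following is only a minimal sketch of the kind of dynamics it describes: N linear threshold (rectified) neurons coupled purely by mutual inhibition, with zero self-connections. The update rule, the inhibition weight w_inh, the step size dt, the number of steps, and the input vector u are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

def simulate_wta(u, w_inh=1.0, dt=0.1, steps=2000):
    """Discretized dynamics x' = -x + max(0, u - w_inh * (sum(x) - x)):
    each neuron receives its external input minus uniform inhibition from
    every *other* neuron; the self-connection is zero (no self-excitation).
    All parameter values here are assumed for illustration only."""
    x = np.zeros_like(u, dtype=float)
    for _ in range(steps):
        lateral = w_inh * (x.sum() - x)           # inhibition from the other neurons only
        x = x + dt * (-x + np.maximum(0.0, u - lateral))
    return x

if __name__ == "__main__":
    u = np.array([0.9, 1.2, 0.7, 1.0])            # external inputs (illustrative values)
    x_final = simulate_wta(u)
    print("final activities:", np.round(x_final, 3))
    print("winner index:", int(np.argmax(x_final)))
```

With sufficiently strong mutual inhibition, only the neuron receiving the largest external input remains active at convergence, which is the WTA behavior the abstract describes; the activities also stay bounded by the largest input, in line with the non-divergence property the paper highlights.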