A new synthesis approach for feedback neural networks based on the perceptron training algorithm

  • Authors:
  • D. Liu; Z. Lu

  • Affiliations:
  • Dept. of Electr. Eng. & Comput. Sci., Stevens Inst. of Technol., Hoboken, NJ

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 1997

Abstract

In this paper, a new synthesis approach is developed for associative memories based on the perceptron training algorithm. The design (synthesis) problem of feedback neural networks for associative memories is formulated as a set of linear inequalities, so that perceptron training can be applied directly. When no constraints are imposed on the connection matrix, the perceptron training in the synthesis algorithms is guaranteed to converge. For neural networks with constraints on the diagonal elements of the connection matrix, results are established concerning the properties of such networks and the existence of such a network design. For neural networks with sparsity and/or symmetry constraints on the connection matrix, design algorithms are presented. Applications of the present synthesis approach to the design of associative memories realized by other feedback neural network models are also studied. To demonstrate the applicability of the present results and to compare the present synthesis approach with existing design methods, specific examples are considered.
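To illustrate the general idea described in the abstract, the following is a minimal sketch of how a fixed-point (stability) condition for stored bipolar patterns can be written as linear inequalities in the rows of the connection matrix and solved with perceptron updates. The specific formulation, the margin value, and all function and variable names are illustrative assumptions, not taken from the paper itself.

```python
import numpy as np

def synthesize_weights(patterns, margin=1.0, lr=0.1, max_epochs=1000):
    """Perceptron-style synthesis of a connection matrix W and bias b.

    Assumed formulation (not from the paper): each stored bipolar pattern x
    must satisfy, for every neuron i, x[i] * (W[i] @ x + b[i]) >= margin,
    i.e. a set of linear inequalities in the rows of W. Each row is trained
    independently with the perceptron update rule until all inequalities hold.
    """
    patterns = np.asarray(patterns, dtype=float)   # shape (m, n), entries +/-1
    m, n = patterns.shape
    W = np.zeros((n, n))
    b = np.zeros(n)

    for i in range(n):                             # one perceptron per neuron
        for _ in range(max_epochs):
            converged = True
            for x in patterns:
                if x[i] * (W[i] @ x + b[i]) < margin:   # inequality violated
                    W[i] += lr * x[i] * x                # perceptron update
                    b[i] += lr * x[i]
                    converged = False
            if converged:
                break
    return W, b

# Example: store two 4-bit bipolar patterns and check they are fixed points
if __name__ == "__main__":
    pats = [[1, -1, 1, -1], [-1, 1, 1, 1]]
    W, b = synthesize_weights(pats)
    for x in pats:
        recalled = np.sign(W @ np.asarray(x, dtype=float) + b)
        print(recalled)        # should reproduce each stored pattern
```

Because standard perceptron training converges whenever the inequalities admit a solution, a sketch of this kind finds a suitable connection matrix when one exists for the unconstrained case; handling diagonal, sparsity, or symmetry constraints requires the additional machinery developed in the paper.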