Linear Algebra Approach to Neural Associative Memories and Noise Performance of Neural Classifiers

  • Authors:
  • Vladimir Cherkassky; Karen Fassett; Nikolaos Vassilas

  • Venue:
  • IEEE Transactions on Computers - Special issue on artificial neural networks
  • Year:
  • 1991

Abstract

The authors present an analytic evaluation of the saturation and noise performance of a large class of associative memories based on matrix operations. The importance of using standard linear algebra techniques for evaluating the noise performance of associative memories is emphasized. The authors give a detailed comparative analysis of the correlation matrix memory and the generalized inverse memory construction rules for auto-associative memories and neural classifiers. Analytic results are presented for the noise performance of neural classifiers that can store several prototypes per class. The analysis indicates that, for neural classifiers, the simple correlation matrix memory provides better noise performance than the more complex generalized inverse memory.