In this paper, we develop improved learning rules for the bidirectional associative memory (BAM) based on three well-recognized optimality criteria: all desired attractors should be made not only stable but also asymptotically stable, and spurious memories should be as few as possible. We first establish the equivalence between the stability of all desired attractors and certain bidirectional linear separabilities, and then relate the three criteria to expanding the kernel basin of attraction of each desired attractor in both the X-space and the Y-space. To characterize this, we define an important equivalent concept called Hamming-stability. Surprisingly, the Hamming-stability of all desired attractors turns out to be further equivalent to certain moderately expansive bidirectional linear separabilities. As a result, the well-known Rosenblatt perceptron learning rule can be used to achieve the stability of all desired attractors and even all three optimality criteria. The learning rules developed thereby are called the bidirectional perceptron stability learning rule (BPSL) and the bidirectional perceptron Hamming-stability learning rule (BPHSL). A number of computer experiments confirm the improved performance of BAMs trained by the BPSL or BPHSL with respect to stability, asymptotic stability, and spurious memories.
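To make the core idea concrete, the sketch below applies Rosenblatt's perceptron rule to the stability (not the stronger Hamming-stability) condition of a BAM. It is a minimal illustration, not the paper's exact BPSL/BPHSL procedure: the function and pattern names are invented here, bipolar patterns and a zero initialization are assumed, and the margin used by the BPHSL to enforce the moderately expansive separabilities is omitted. A pair (x, y) is treated as stable when one-step recall reproduces it in both directions, i.e. sign(Wy) = x and sign(Wᵀx) = y, so each row of W is a perceptron over the Y-space and each column a perceptron over the X-space.

```python
import numpy as np

def bsign(v):
    """Bipolar threshold; ties at 0 are resolved to +1."""
    return np.where(v >= 0, 1, -1)

def train_bam_perceptron(X, Y, epochs=100, eta=1.0):
    """Perceptron-style stability training for a BAM (illustrative sketch).

    X, Y: bipolar pattern arrays of shape (n_pairs, n) and (n_pairs, m).
    Rows of W are updated when Y-to-X recall of a component fails, and
    columns of W when X-to-Y recall fails, using Rosenblatt's rule.
    """
    n, m = X.shape[1], Y.shape[1]
    W = np.zeros((n, m))
    for _ in range(epochs):
        converged = True
        for x, y in zip(X, Y):
            # Y-space -> X-space direction: each row of W is a perceptron.
            errs = bsign(W @ y) != x
            if errs.any():
                W[errs] += eta * np.outer(x[errs], y)
                converged = False
            # X-space -> Y-space direction: each column of W is a perceptron.
            errs = bsign(W.T @ x) != y
            if errs.any():
                W[:, errs] += eta * np.outer(x, y[errs])
                converged = False
        if converged:  # every desired pair is stable in both directions
            break
    return W

# Two bipolar pattern pairs to store (chosen here for illustration only).
X = np.array([[ 1, -1,  1, -1],
              [-1,  1,  1,  1]])
Y = np.array([[ 1,  1, -1],
              [-1,  1,  1]])
W = train_bam_perceptron(X, Y)
# One-step stability of both desired attractors, in both directions.
assert (bsign(W @ Y.T).T == X).all()
assert (bsign(W.T @ X.T).T == Y).all()
```

As with the ordinary perceptron, the updates converge whenever the required bidirectional linear separabilities hold; adding a positive margin to the threshold test in both directions would be the natural way to push toward the Hamming-stability notion discussed above.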