Better learning for bidirectional associative memory

  • Authors:
  • Xinhua Zhuang; Yan Huang; Su-Shing Chen

  • Affiliations:
  • Department of Electrical and Computer Engineering, USA (all authors)

  • Venue:
  • Neural Networks
  • Year:
  • 1993

Abstract

In this paper, we develop better learning rules for the bidirectional associative memory (BAM) based on three well-recognized optimality criteria: all desired attractors should be made not only stable but also asymptotically stable, and spurious memories should be as few as possible. We first establish the equivalence between the stability of all desired attractors and certain bidirectional linear separabilities, and then relate the three criteria to expanding the kernel basin of attraction of each desired attractor in both the X-space and the Y-space. To characterize this, we define an important and equivalent concept called Hamming-stability. Surprisingly, the Hamming-stability of all desired attractors turns out to be further equivalent to certain moderately expansive bidirectional linear separabilities. As a result, the well-known Rosenblatt perceptron learning rule can be used to achieve the stability of all desired attractors and even all three optimality criteria. The learning rules developed therefrom are called the bidirectional perceptron stability learning rule (BPSL) and the bidirectional perceptron Hamming-stability learning rule (BPHSL). A number of computer experiments demonstrate the improved performance of a BAM trained by the BPSL or BPHSL with respect to stability, asymptotic stability, and spurious memories.
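The idea of applying perceptron-style learning to make each desired pair a fixed point of the BAM can be sketched as follows. This is a minimal illustration in the spirit of the BPSL, not the paper's exact algorithm: it assumes bipolar (+1/-1) pattern matrices `X` and `Y`, and the function name `train_bam_perceptron` and its parameters are hypothetical.

```python
import numpy as np

def train_bam_perceptron(X, Y, lr=1.0, max_epochs=100):
    """Train a BAM weight matrix W by perceptron-style updates so that each
    desired pair (x, y) becomes a fixed point of the bidirectional recall:
    sign(x @ W) == y and sign(W @ y) == x.

    X : (m, n) bipolar patterns for the X-layer
    Y : (m, p) bipolar patterns for the Y-layer
    """
    n, p = X.shape[1], Y.shape[1]
    W = np.zeros((n, p))
    for _ in range(max_epochs):
        stable = True
        for x, y in zip(X, Y):
            # Forward direction: correct each Y-neuron whose sign is wrong.
            for j in range(p):
                if np.sign(x @ W[:, j]) != y[j]:
                    W[:, j] += lr * y[j] * x
                    stable = False
            # Backward direction: correct each X-neuron whose sign is wrong.
            for i in range(n):
                if np.sign(W[i, :] @ y) != x[i]:
                    W[i, :] += lr * x[i] * y
                    stable = False
        if stable:          # every desired pair is a fixed point
            break
    return W
```

Each update is the classic Rosenblatt correction applied per neuron, once in each recall direction; it converges whenever the required bidirectional linear separabilities hold, which is the connection the abstract draws.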