Multiple training concept for back-propagation neural networks for use in associative memories

  • Authors:
  • Yeou-Fang Wang; Jose B. Cruz, Jr.; J. H. Mulligan, Jr.

  • Affiliations:
  • University of California at Irvine, USA; University of California at Irvine, USA; University of California at Irvine, USA

  • Venue:
  • Neural Networks
  • Year:
  • 1993

Abstract

The multiple training concept, first applied to Bidirectional Associative Memory (BAM) training, is applied here to the back-propagation (BP) algorithm for use in associative memories. The new algorithm, called multiple training back-propagation (MTBP), assigns a different weight to each training pair in the energy function. These pair weights are updated during the training phase using the basic differential multiplier method (BDMM). A sufficient condition for convergence of the training phase is that the second derivative of the energy function with respect to the synapse weights remain positive along the trajectories of both the synapse weights and the pair weights. A simple example of the algorithm's use is provided, followed by two simulations showing that the algorithm can dramatically increase the training speed of the network.
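
The abstract outlines the mechanism but not the update rules, so the following is a minimal numpy sketch of the idea under stated assumptions: each stored pair k carries a pair weight c_k in a weighted squared-error energy, the synapse weights follow gradient descent on that energy, and, in the spirit of BDMM (descend on the primal variables, ascend on the multipliers), each pair weight is raised in proportion to that pair's remaining error. The network size, learning rates, and the specific ascent rule are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

# Hedged sketch of multiple-training back-propagation (MTBP):
# each stored pair k gets its own pair weight c_k in the energy
#   E(W) = sum_k c_k * ||t_k - f(x_k; W)||^2,
# and the c_k are raised while a pair's error persists, in the spirit
# of the basic differential multiplier method (BDMM). The constants
# lr and mu and the tiny 4-bit task are illustrative assumptions.

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny associative memory: three stored 4-bit input/output pairs.
X = np.array([[0, 0, 1, 1],
              [1, 1, 0, 0],
              [1, 0, 1, 0]], dtype=float)
T = np.array([[1, 0, 0, 1],
              [0, 1, 1, 0],
              [0, 0, 1, 1]], dtype=float)

n_in, n_hid, n_out = 4, 6, 4
W1 = rng.normal(scale=0.5, size=(n_in, n_hid))
W2 = rng.normal(scale=0.5, size=(n_hid, n_out))
c = np.ones(len(X))      # pair weights: one multiplier per stored pair

lr, mu = 0.5, 0.05       # descent rate on synapses, ascent rate on pair weights
for epoch in range(2000):
    # Forward pass for all pairs at once.
    H = sigmoid(X @ W1)
    Y = sigmoid(H @ W2)
    err = T - Y                          # per-pair residuals
    pair_err = np.sum(err**2, axis=1)    # squared error of each pair

    # Backward pass: gradients of the pair-weighted energy
    # (up to a constant factor of 2, absorbed into lr).
    dY = -c[:, None] * err * Y * (1 - Y)
    dW2 = H.T @ dY
    dH = (dY @ W2.T) * H * (1 - H)
    dW1 = X.T @ dH

    # BDMM-style coupled updates: descend on the synapse weights,
    # ascend on the pair weights of pairs that still have error.
    W1 -= lr * dW1
    W2 -= lr * dW2
    c += mu * pair_err

print("final per-pair errors:", np.round(pair_err, 4))
print("final pair weights:   ", np.round(c, 2))
```

The intended effect, consistent with the speedup the abstract reports, is that hard-to-store pairs accumulate larger pair weights, so the gradient concentrates effort on them until every pair is learned.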