Does the neuron “learn” like the synapse? Advances in Neural Information Processing Systems 1.
Comparing biases for minimal network construction with back-propagation. Advances in Neural Information Processing Systems 1.
Two coding strategies for bidirectional associative memory. IEEE Transactions on Neural Networks.
On multiple training for bidirectional associative memory. IEEE Transactions on Neural Networks.
A Neural Associative Pattern Classifier. IBERAMIA 2002: Proceedings of the 8th Ibero-American Conference on AI: Advances in Artificial Intelligence.
The multiple training concept, originally applied to bidirectional associative memory (BAM) training, is applied here to the back-propagation (BP) algorithm for use in associative memories. The resulting algorithm, which assigns a different weight to each training pair in the energy function, is called multiple training back-propagation (MTBP). The pair weights are updated during the training phase using the basic differential multiplier method (BDMM). A sufficient condition for convergence of the training phase is that the second derivative of the energy function with respect to the synapse weights remains positive along the trajectories of both the synapse weights and the pair weights. A simple example of the algorithm's use is provided, followed by two simulations showing that the algorithm can dramatically increase the training speed of the network.
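For illustration, the following is a minimal sketch of the pair-weighting idea, not the paper's implementation: a one-hidden-layer network trained on a weighted energy E = Σ_p c_p E_p, with gradient descent on the synapse weights and a BDMM-style gradient ascent on the pair weights c_p, so that poorly learned pairs gain emphasis over time. The toy data, network size, learning rates, and the exact form of the pair-weight update are all assumptions made for the sketch.

```python
import numpy as np

# Sketch of multiple-training back-propagation (MTBP), assuming a
# one-hidden-layer network and squared-error energy E = sum_p c_p * E_p.
# Following the BDMM pattern: gradient descent on synapse weights,
# gradient ascent on the pair weights c_p. Data and hyperparameters
# below are illustrative, not taken from the paper.

rng = np.random.default_rng(0)

# Toy associative pairs: inputs X and targets T (hypothetical data).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])

n_in, n_hid, n_out = 2, 4, 1
W1 = rng.normal(0, 0.5, (n_in, n_hid))
W2 = rng.normal(0, 0.5, (n_hid, n_out))
c = np.ones(len(X))          # pair weights, initially equal

eta_w = 0.5                  # learning rate for synapse weights
eta_c = 0.1                  # learning rate for pair weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(5000):
    # Forward pass.
    H = sigmoid(X @ W1)
    Y = sigmoid(H @ W2)
    E_p = 0.5 * np.sum((Y - T) ** 2, axis=1)   # per-pair energies

    # Back-propagation on the weighted energy sum_p c_p * E_p.
    dY = c[:, None] * (Y - T) * Y * (1 - Y)
    dW2 = H.T @ dY
    dH = (dY @ W2.T) * H * (1 - H)
    dW1 = X.T @ dH

    # Gradient descent on the synapse weights ...
    W1 -= eta_w * dW1
    W2 -= eta_w * dW2
    # ... and BDMM-style gradient ascent on the pair weights, so pairs
    # with larger residual error receive more emphasis in later epochs.
    c += eta_c * E_p

print("final outputs:", Y.ravel())
print("final pair weights:", c)
```

Raising c_p in proportion to the residual energy E_p is what drives the speed-up: pairs the network has already stored contribute little gradient, while stubborn pairs are progressively amplified in the energy function.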