Enhanced MLP performance and fault tolerance resulting from synaptic weight noise during training

  • Authors:
  • A. F. Murray; P. J. Edwards

  • Affiliations:
  • Dept. of Electr. Eng., Edinburgh Univ.

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 1994

Abstract

We analyze the effects of analog noise on the synaptic arithmetic during multilayer perceptron training by expanding the cost function to include noise-mediated terms. Predictions made in the light of these calculations suggest that fault tolerance, training quality, and training trajectory should all be improved by such noise injection. Extensive simulation experiments on two distinct classification problems substantiate these claims. The results appear to be general for all training schemes in which weights are adjusted incrementally, and have wide-ranging implications for all applications, particularly those involving “inaccurate” analog neural VLSI.
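
The "noise-mediated terms" referred to above arise from a Taylor expansion of the cost function in the weight perturbations. A minimal sketch, assuming additive zero-mean noise $\Delta_i$ of variance $\sigma^2$, independent across weights (an illustrative simplification; the paper's own expansion is tied to its specific analog noise model):

$$
\langle E(\mathbf{w} + \boldsymbol{\Delta}) \rangle \;\approx\; E(\mathbf{w}) + \frac{\sigma^2}{2} \sum_i \frac{\partial^2 E}{\partial w_i^2},
$$

where the first-order terms vanish in expectation because $\langle \Delta_i \rangle = 0$, and the cross terms vanish by independence. The surviving second-derivative penalty favors solutions in flat regions of the cost surface, which is the mechanism behind the predicted gains in fault tolerance and training quality.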
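
A minimal runnable sketch of the kind of training procedure the abstract describes: zero-mean Gaussian noise is injected into the synaptic weights on every forward/backward pass, while the incremental updates are applied to the underlying clean weights. The network size, toy data, learning rate, and noise level (`noise_std`) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class problem: two Gaussian blobs (illustrative, not from the paper).
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
y = np.hstack([np.zeros(100), np.ones(100)])

# Single-hidden-layer MLP parameters (the "clean" weights).
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr, noise_std = 0.1, 0.05  # assumed hyperparameters

for epoch in range(500):
    # Inject zero-mean Gaussian noise into the weights for this pass only.
    nW1 = W1 + rng.normal(0, noise_std, W1.shape)
    nW2 = W2 + rng.normal(0, noise_std, W2.shape)

    # Forward pass through the noisy weights.
    h = np.tanh(X @ nW1 + b1)
    p = sigmoid(h @ nW2 + b2).ravel()

    # Backward pass (cross-entropy loss); gradients flow through the
    # noisy weights, since those are what produced the outputs.
    d_out = (p - y)[:, None] / len(y)
    gW2 = h.T @ d_out
    gb2 = d_out.sum(0)
    d_h = (d_out @ nW2.T) * (1 - h**2)
    gW1 = X.T @ d_h
    gb1 = d_h.sum(0)

    # Incremental update applied to the underlying clean weights.
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Evaluate with the clean (noise-free) weights.
h = np.tanh(X @ W1 + b1)
p = sigmoid(h @ W2 + b2).ravel()
print(f"training accuracy with weight-noise injection: {((p > 0.5) == y).mean():.2f}")
```

Note the design choice: gradients are computed through the noisy weights but applied to the clean ones, so the injected noise acts as a stochastic regularizer on each incremental update rather than a permanent corruption of the stored weights.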