The effects of adding noise during backpropagation training on a generalization performance

  • Authors: Guozhong An
  • Affiliation: Shell Research, P.O. Box 60, 2280 AB Rijswijk, The Netherlands
  • Venue: Neural Computation
  • Year: 1996


Abstract

We study the effects of adding noise to the inputs, outputs, weight connections, and weight changes of multilayer feedforward neural networks during backpropagation training. We rigorously derive and analyze the objective functions that are minimized by the noise-affected training processes. We show that input noise and weight noise encourage the network output to be a smooth function of the input and of the weights, respectively. In the weak-noise limit, noise added to the output of the network changes the objective function only by a constant; hence, it cannot improve generalization. Input noise introduces penalty terms in the objective function that are related to, but distinct from, those found in regularization approaches. Simulations on a regression problem and a classification problem further substantiate this analysis. Input noise improves generalization on both problems, whereas weight noise improves generalization only on the classification problem. The other forms of noise have practically no effect on generalization.
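For intuition behind the input-noise result, the standard weak-noise expansion (a second-order Taylor argument, sketched here rather than the paper's full derivation) shows where the smoothness penalty comes from. For a scalar network output f(x; w) trained with squared error against a target y, and input noise ε with zero mean and covariance σ²I,

$$
\mathbb{E}_{\varepsilon}\!\left[\big(f(x+\varepsilon)-y\big)^2\right]
\approx \big(f(x)-y\big)^2
+ \sigma^2 \left\lVert \nabla_x f(x) \right\rVert^2
+ \sigma^2 \big(f(x)-y\big)\,\operatorname{tr} H_x f(x)
+ O(\sigma^4),
$$

where H_x f is the Hessian of the output with respect to the input. The gradient-norm term is a Tikhonov-style smoothness penalty; the Hessian term is what makes the effective objective related to, but distinct from, standard regularization.

The training mechanics themselves can be made concrete with a minimal NumPy sketch: Gaussian noise is freshly drawn on each pass and added to the inputs and/or the weights before backpropagation. This is not the author's code; the architecture, toy data, and hyperparameter names (sigma_in, sigma_w, lr) are illustrative assumptions, and "weight noise" is interpreted here as perturbing the weights used in the forward/backward pass while updating the clean stored weights.

```python
# Minimal sketch of noise injection during backprop training, assuming a
# one-hidden-layer tanh network with squared-error loss on toy data.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x) plus observation noise.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X) + 0.1 * rng.standard_normal(X.shape)

n_hidden = 20
W1 = rng.standard_normal((1, n_hidden)) * 0.5
b1 = np.zeros(n_hidden)
W2 = rng.standard_normal((n_hidden, 1)) * 0.5
b2 = np.zeros(1)

lr = 0.01        # learning rate (illustrative)
sigma_in = 0.1   # std of Gaussian input noise; set 0.0 to disable
sigma_w = 0.0    # std of Gaussian weight noise; set > 0.0 to enable

for epoch in range(2000):
    # Fresh noise is drawn on every pass, as in noise-injection training.
    Xn = X + sigma_in * rng.standard_normal(X.shape)
    W1n = W1 + sigma_w * rng.standard_normal(W1.shape)
    W2n = W2 + sigma_w * rng.standard_normal(W2.shape)

    # Forward pass through the perturbed network.
    h = np.tanh(Xn @ W1n + b1)
    out = h @ W2n + b2

    # Backprop of squared error; gradients are evaluated at the noisy
    # weights but applied to the clean stored weights.
    err = out - y                        # dL/d(out), up to a constant
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dh = (err @ W2n.T) * (1 - h**2)      # backprop through tanh
    gW1 = Xn.T @ dh / len(X)
    gb1 = dh.mean(axis=0)

    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
```

Under this reading, sigma_in > 0 with sigma_w = 0 corresponds to the input-noise condition the paper finds effective on both test problems, and sigma_w > 0 to the weight-noise condition, which helped only on the classification problem.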