Unbiased Likelihood Backpropagation Learning

  • Authors:
  • Masashi Sekino; Katsumi Nitta

  • Affiliations:
  • Tokyo Institute of Technology, Japan; Tokyo Institute of Technology, Japan

  • Venue:
  • Neural Information Processing
  • Year:
  • 2007

Abstract

Error backpropagation is one of the most popular methods for training an artificial neural network. When error backpropagation is used to train an artificial neural network, overfitting occurs in the latter half of training. This paper explains why this overfitting occurs, from the viewpoint of the model selection framework. The explanation leads to a new method for training an artificial neural network, Unbiased Likelihood Backpropagation Learning. Several results are shown.
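
As background for the baseline the abstract refers to, the following is a minimal sketch (not the paper's proposed method) of standard error backpropagation on a small regression task, tracking training and validation error so that the overfitting in the latter half of training becomes visible. The network size, synthetic data, and learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D regression data with noise (assumed setup, not the paper's).
def make_data(n):
    x = rng.uniform(-3.0, 3.0, size=(n, 1))
    y = np.sin(x) + 0.3 * rng.standard_normal((n, 1))
    return x, y

x_train, y_train = make_data(30)   # small training set, prone to overfitting
x_val, y_val = make_data(200)

# One-hidden-layer MLP: y_hat = W2 @ tanh(W1 x + b1) + b2
hidden = 50
W1 = rng.standard_normal((hidden, 1)) * 0.5
b1 = np.zeros((hidden, 1))
W2 = rng.standard_normal((1, hidden)) * 0.5
b2 = np.zeros((1, 1))
lr = 0.05

def forward(x):
    a = np.tanh(W1 @ x.T + b1)          # hidden activations, shape (hidden, n)
    return (W2 @ a + b2).T, a           # predictions, shape (n, 1)

def mse(pred, y):
    return float(np.mean((pred - y) ** 2))

for epoch in range(5001):
    pred, a = forward(x_train)
    err = pred - y_train                # (n, 1)
    n = x_train.shape[0]

    # Backpropagate the squared-error gradient through the network.
    dW2 = (2.0 / n) * err.T @ a.T                     # (1, hidden)
    db2 = (2.0 / n) * err.sum(axis=0, keepdims=True).T
    d_hidden = (W2.T @ err.T) * (1.0 - a ** 2)        # (hidden, n)
    dW1 = (2.0 / n) * d_hidden @ x_train
    db1 = (2.0 / n) * d_hidden.sum(axis=1, keepdims=True)

    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

    if epoch % 1000 == 0:
        train_err = mse(pred, y_train)
        val_err = mse(forward(x_val)[0], y_val)
        # Training error keeps falling while validation error eventually rises:
        # the overfitting in the latter half of training.
        print(f"epoch {epoch:5d}  train MSE {train_err:.4f}  val MSE {val_err:.4f}")
```

The proposed Unbiased Likelihood Backpropagation Learning modifies this training procedure; the details are given in the paper itself and are not reproduced here.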