Learning internal representations by error propagation
Parallel distributed processing: explorations in the microstructure of cognition, vol. 1
Error backpropagation is one of the most popular methods for training an artificial neural network. When it is used, however, overfitting tends to occur in the latter half of training. This paper explains why overfitting occurs, using the model selection framework, and the explanation leads to a new method for training an artificial neural network, Unbiased Likelihood Backpropagation Learning. Several results are presented.
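To make the phenomenon concrete, the following minimal sketch (not the paper's method; all names and hyperparameters are illustrative) trains a small one-hidden-layer network by error backpropagation on noisy 1-D data and records the training and held-out errors over time. With an over-parameterized network, the held-out error typically stops improving and may rise late in training, which is the overfitting the abstract refers to:

```python
# Hypothetical demo: backpropagation on noisy data, tracking train vs. held-out error.
import math, random

random.seed(0)

def make_data(n):
    # Target is sin(x) corrupted by Gaussian noise.
    xs = [random.uniform(-3, 3) for _ in range(n)]
    return [(x, math.sin(x) + random.gauss(0, 0.3)) for x in xs]

train, valid = make_data(20), make_data(200)

H = 30  # hidden units; deliberately large so the network can overfit
w1 = [random.gauss(0, 1) for _ in range(H)]
b1 = [random.gauss(0, 1) for _ in range(H)]
w2 = [random.gauss(0, 0.1) for _ in range(H)]
b2 = 0.0

def forward(x):
    # One tanh hidden layer, linear output.
    h = [math.tanh(w1[i] * x + b1[i]) for i in range(H)]
    return h, b2 + sum(w2[i] * h[i] for i in range(H))

def mse(data):
    return sum((forward(x)[1] - y) ** 2 for x, y in data) / len(data)

lr = 0.01
errs = []  # (train error, held-out error) snapshots
for epoch in range(2000):
    for x, y in train:
        h, out = forward(x)
        d_out = 2 * (out - y)                      # dE/d(out) for squared error
        for i in range(H):
            d_h = d_out * w2[i] * (1 - h[i] ** 2)  # backprop through tanh
            w2[i] -= lr * d_out * h[i]
            w1[i] -= lr * d_h * x
            b1[i] -= lr * d_h
        b2 -= lr * d_out
    if epoch % 100 == 0:
        errs.append((mse(train), mse(valid)))
```

After training, the training error has dropped well below its initial value, while the held-out error can be inspected in `errs` to see where it stops tracking the training error.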