Improving the error backpropagation algorithm with a modified error function

  • Authors:
  • Sang-Hoon Oh

  • Affiliations:
  • Res. Dept., Electron. & Telecommun. Res. Inst., Taejon

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 1997

Abstract

This letter proposes a modified error function to improve the error backpropagation (EBP) algorithm for multilayer perceptrons (MLPs), which suffers from slow learning speed. To accelerate learning, the proposed method reduces the probability that output nodes saturate near the wrong extreme of the sigmoid activation function. This is achieved by applying a strong error signal to an incorrectly saturated output node and a weak error signal to a correctly saturated one. The weak error signal for correctly saturated output nodes also prevents overspecialization to the training patterns. The effectiveness of the proposed method is demonstrated on a handwritten digit recognition task.
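The idea behind the strong/weak error signals can be illustrated with a minimal sketch. This is not the paper's exact error function; it is a hypothetical illustration of the general mechanism, assuming a sigmoid output node with target t in {0, 1}: the standard EBP error signal carries the sigmoid-derivative factor y(1-y), which vanishes as the node saturates at either extreme, while the modified signal drops that factor precisely when the node is saturated at the *wrong* extreme.

```python
def standard_error_signal(y, t):
    # Standard EBP delta for a sigmoid output node.
    # The factor y * (1 - y) vanishes as y -> 0 or y -> 1,
    # even when the node is saturated at the wrong extreme,
    # which is one cause of slow learning.
    return (t - y) * y * (1.0 - y)

def modified_error_signal(y, t):
    # Illustrative modification (NOT the paper's exact function):
    # if the node is incorrectly saturated (output far from target),
    # keep a strong error signal by dropping the derivative factor;
    # otherwise keep the weak, derivative-scaled signal, which also
    # limits overspecialization on already-correct outputs.
    if abs(t - y) > 0.5:          # incorrectly saturated side
        return t - y              # strong error signal
    return (t - y) * y * (1.0 - y)  # weak error signal

# Example: output y = 0.99 while the target is 0.
# The standard signal is tiny; the modified one stays large.
print(standard_error_signal(0.99, 0.0))   # near zero
print(modified_error_signal(0.99, 0.0))   # close to -1
```

For a correctly saturated node (e.g. y = 0.99 with target 1), both functions return a weak signal, so training pressure concentrates on the nodes that are saturated on the wrong side.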