A Method on Improvement of the Online Mode Error Backpropagation Algorithm for Pattern Recognition

  • Authors:
  • Tae-Seung Lee, Ho-Jin Choi, Young-Kil Kwag, Byong-Won Hwang

  • Venue:
  • PRICAI '02 Proceedings of the 7th Pacific Rim International Conference on Artificial Intelligence: Trends in Artificial Intelligence
  • Year:
  • 2002

Abstract

Owing to several advantages over other pattern recognition techniques, the multilayer perceptron (MLP) has been used in many applications. However, the error backpropagation (EBP) algorithm with which the MLP is trained is known to require a relatively long learning time. Because learning data for pattern recognition contain high redundancy, online learning methods, which update the MLP's weight vector pattern by pattern, are very effective for increasing learning speed. A typical online EBP algorithm applies a fixed learning rate to every update of the weight vector. Although a large speedup can be obtained with online EBP simply by choosing an appropriate fixed rate, fixing the rate does not fully exploit the instant updates of the online mode. Moreover, although the patterns become divided into learned and unlearned ones as training proceeds, and the learned patterns no longer need to go through the learning computation, the existing online EBP uniformly processes all patterns in every epoch. To remedy these inefficiencies, this paper proposes the Changing rate and Omitting patterns in Instant Learning (COIL) method, which applies an appropriate rate to each pattern and submits only the unlearned patterns to learning. To verify the efficiency of COIL, experiments are conducted on speaker verification and speech recognition as applications of pattern recognition, and the results are presented.
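
The sketch below illustrates the general idea described in the abstract: online (pattern-by-pattern) EBP in which the learning rate is chosen per pattern and patterns whose error is already small are omitted from further learning. The particular rate heuristic (scaling the base rate by the pattern's error) and the skip threshold are illustrative assumptions, not the authors' actual COIL rule.

```python
# Minimal sketch of online-mode error backpropagation with two heuristics in the
# spirit of the abstract: a per-pattern learning rate and omission of patterns
# that are already "learned". The rate schedule and skip threshold used here are
# assumed for illustration only; the paper's COIL method defines its own rules.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class OnlineMLP:
    def __init__(self, n_in, n_hidden, n_out):
        self.W1 = rng.normal(0.0, 0.5, (n_hidden, n_in))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_out, n_hidden))
        self.b2 = np.zeros(n_out)

    def forward(self, x):
        h = sigmoid(self.W1 @ x + self.b1)
        y = sigmoid(self.W2 @ h + self.b2)
        return h, y

    def train_epoch(self, X, T, base_rate=0.5, skip_threshold=0.01):
        """One epoch of pattern-by-pattern (online) EBP."""
        n_updates = 0
        for x, t in zip(X, T):
            h, y = self.forward(x)
            err = t - y
            sse = 0.5 * np.sum(err ** 2)
            if sse < skip_threshold:       # pattern counts as learned: omit it
                continue
            eta = base_rate * (1.0 + sse)  # per-pattern rate (assumed heuristic)
            delta_out = err * y * (1.0 - y)            # output-layer deltas
            delta_hid = (self.W2.T @ delta_out) * h * (1.0 - h)
            self.W2 += eta * np.outer(delta_out, h)    # instant (online) update
            self.b2 += eta * delta_out
            self.W1 += eta * np.outer(delta_hid, x)
            self.b1 += eta * delta_hid
            n_updates += 1
        return n_updates

if __name__ == "__main__":
    # Toy two-class problem (XOR), only to show that the number of per-pattern
    # updates shrinks as more patterns fall below the skip threshold.
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    T = np.array([[0.], [1.], [1.], [0.]])
    net = OnlineMLP(2, 4, 1)
    for epoch in range(2000):
        updated = net.train_epoch(X, T)
        if updated == 0:                   # every pattern below threshold
            print("all patterns learned at epoch", epoch)
            break
```

In this toy setup the skip test saves little work, but for the large, highly redundant learning sets mentioned in the abstract, omitting already-learned patterns avoids most of the per-epoch computation.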