Hierarchical Incremental Class Learning with Reduced Pattern Training

  • Authors:
  • Sheng-Uei Guan;Chunyu Bao;Ru-Tian Sun

  • Affiliations:
  • School of Engineering and Design, Brunel University, Uxbridge, UK UB8 3PH;Department of Electrical and Computer Engineering, National University of Singapore, Singapore, Singapore 119260;Department of Electrical and Computer Engineering, National University of Singapore, Singapore, Singapore 119260

  • Venue:
  • Neural Processing Letters
  • Year:
  • 2006

Abstract

Hierarchical Incremental Class Learning (HICL) is a new task decomposition method that addresses the pattern classification problem. HICL has been shown to be a good classifier, but closer examination reveals areas for potential improvement. This paper proposes a theoretical model to evaluate the performance of HICL and presents an approach to improve the classification accuracy of HICL by applying the concept of Reduced Pattern Training (RPT). The theoretical analysis shows that HICL can achieve better classification accuracy than Output Parallelism [Guan and Li: IEEE Transactions on Neural Networks, 13 (2002), 542–550]. The procedure for RPT is described and compared with the original training procedure. RPT systematically reduces the size of the training data set based on the order in which the sub-networks are built. Results from four benchmark classification problems show much promise for the improved model.
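The sketch below illustrates the RPT idea as stated in the abstract: sub-networks are built one class at a time, and the training set is pruned after each step so that later sub-networks train on fewer patterns. The function and parameter names (train_subnetwork, class_order) are illustrative assumptions for this note, not the authors' implementation.

```python
def reduced_pattern_training(patterns, labels, class_order, train_subnetwork):
    """Minimal sketch of Reduced Pattern Training (RPT), assuming one
    sub-network is trained per class in a fixed order.

    patterns, labels  -- the full training set
    class_order       -- order in which sub-networks are built (assumed input)
    train_subnetwork  -- hypothetical callable that trains one sub-network
    """
    remaining = list(zip(patterns, labels))
    subnetworks = []
    for cls in class_order:
        xs = [x for x, _ in remaining]
        ys = [y for _, y in remaining]
        subnetworks.append(train_subnetwork(xs, ys, target_class=cls))
        # RPT step: patterns of the class just learned are dropped, so each
        # subsequent sub-network is trained on a systematically reduced set.
        remaining = [(x, y) for x, y in remaining if y != cls]
    return subnetworks
```

Under this reading, the reduction depends directly on the order of sub-networks built: classes learned early are removed from the training data seen by all later sub-networks, which is what the abstract credits for the improved accuracy.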