References
- Divide and conquer neural networks. Neural Networks.
- Modeling with constructive backpropagation. Neural Networks.
- A class decomposition approach for GA-based classifiers. Engineering Applications of Artificial Intelligence.
- Class decomposition for GA-based classifier agents - a Pitt approach. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics.
- Parallel growing and training of neural networks using output parallelism. IEEE Transactions on Neural Networks.
- Efficient classification for multiclass problems using modular neural networks. IEEE Transactions on Neural Networks.
Hierarchical Incremental Class Learning (HICL) is a task decomposition method for pattern classification. HICL has proven to be a good classifier, but closer examination reveals room for improvement. This paper proposes a theoretical model to evaluate the performance of HICL and presents an approach that improves its classification accuracy by applying the concept of Reduced Pattern Training (RPT). The theoretical analysis shows that HICL can achieve better classification accuracy than Output Parallelism [Guan and Li: IEEE Transactions on Neural Networks, 13 (2002), 542-550]. The RPT procedure is described and compared with the original training procedure: RPT systematically reduces the size of the training data set based on the order in which the sub-networks are built. Results on four benchmark classification problems show much promise for the improved model.
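The core idea of RPT - shrinking the training set as successive sub-networks are built - can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function name, the class-ordering interface, and the exact reduction rule (drop patterns of classes already handled by earlier sub-networks) are assumptions made for illustration.

```python
# Hypothetical sketch of Reduced Pattern Training (RPT).
# Assumption: each sub-network handles one class, and patterns belonging
# to classes already learned by earlier sub-networks are removed from
# the training set of later sub-networks.
def reduced_training_sets(patterns, labels, class_order):
    """Yield (class, training_subset) for each sub-network in build order.

    The training subset for sub-network k excludes all patterns whose
    class was assigned to an earlier sub-network, so the data set
    shrinks monotonically as sub-networks are added.
    """
    learned = set()
    for cls in class_order:
        subset = [(x, y) for x, y in zip(patterns, labels) if y not in learned]
        yield cls, subset
        learned.add(cls)

# Usage: three classes, sub-networks built in the order 0, 1, 2.
X = ["a", "b", "c", "d", "e", "f"]
y = [0, 0, 1, 1, 2, 2]
sizes = [len(subset) for _, subset in reduced_training_sets(X, y, [0, 1, 2])]
# Training-set size shrinks with each sub-network: [6, 4, 2]
```

Note that the later a sub-network appears in the build order, the smaller its training set, which is the source of RPT's training-cost savings relative to the original procedure.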