Multiple classifier systems (MCS) have become popular during the last decade. The self-generating neural tree (SGNT) is a suitable base classifier for an MCS because of its simple setup and fast learning. However, the computation cost of the MCS increases in proportion to the number of SGNTs. In an earlier paper, we proposed a pruning method for the structure of the SGNT within the MCS to reduce the computation cost. In this paper, we propose a novel pruning method for more effective processing and call the resulting model the self-organizing neural grove (SONG). The pruning method combines an on-line pruning method with an off-line pruning method. Experiments were conducted to compare the SONG with an unpruned MCS based on SGNTs, an MCS based on C4.5, and the k-nearest neighbor method. The results show that the SONG improves classification accuracy while reducing the computation cost.
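The abstract does not describe the SGNT construction or the pruning algorithms themselves, so the following is only a minimal sketch of the general idea: a bagging-style multiple classifier system whose members are later pruned off-line against a validation set. DecisionTreeClassifier stands in for the SGNT, and the function names (majority_vote, prune_offline) are hypothetical, not the paper's API.

```python
# Minimal sketch (not the paper's algorithm): a multiple classifier system
# built by bootstrap training, followed by a simple off-line pruning pass
# that drops members whose removal does not hurt validation accuracy.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier  # stand-in for the SGNT


def majority_vote(members, X):
    # Each member votes; the most frequent class label wins.
    votes = np.stack([m.predict(X) for m in members])
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)


def prune_offline(members, X_val, y_val):
    # Greedily drop members whose removal keeps validation accuracy
    # at least as high as the full ensemble's accuracy.
    kept = list(members)
    base = np.mean(majority_vote(kept, X_val) == y_val)
    for m in list(kept):
        if len(kept) == 1:
            break
        trial = [c for c in kept if c is not m]
        if np.mean(majority_vote(trial, X_val) == y_val) >= base:
            kept = trial
    return kept


X, y = load_iris(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

rng = np.random.default_rng(0)
members = []
for _ in range(25):  # 25 bootstrap-trained base classifiers
    idx = rng.integers(0, len(X_tr), len(X_tr))
    members.append(DecisionTreeClassifier(max_depth=3).fit(X_tr[idx], y_tr[idx]))

pruned = prune_offline(members, X_val, y_val)
print(len(members), "->", len(pruned), "members after pruning")
print("validation accuracy:", np.mean(majority_vote(pruned, X_val) == y_val))
```

The sketch only illustrates why pruning reduces computation cost: fewer retained members means fewer predictions per query, while accuracy is checked so that pruning does not degrade it. The paper's on-line pruning, which acts during SGNT generation, is not represented here.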