C4.5: programs for machine learning
Machine Learning
Self-organizing maps
Data mining: concepts and techniques
Machine Learning
Neural Networks for Pattern Recognition
Pattern Classification (2nd Edition)
AAAI'96 Proceedings of the thirteenth national conference on Artificial intelligence - Volume 1
Efficient Incremental Learning Using Self-Organizing Neural Grove
Neural Information Processing
Negative correlation in incremental learning
Natural Computing: an international journal
Recently, multiple classifier systems (MCS) have been used in practical applications to improve classification accuracy. Self-generating neural networks (SGNN) are suitable base classifiers for MCS because of their simple configuration and fast learning. However, the computational cost of an MCS increases in proportion to the number of SGNNs. In this paper, we propose a novel method for pruning the structure of the SGNNs in an MCS. Experiments were conducted to compare the pruned MCS with an unpruned MCS, an MCS based on C4.5, and the k-nearest neighbor method. The results show that the pruned MCS improves classification accuracy while reducing computational cost.
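The idea of the abstract can be illustrated with a minimal sketch: an ensemble that classifies by majority vote, followed by a greedy pruning pass that drops members whose removal does not hurt validation accuracy. This is an assumption-laden toy, not the paper's method: a 1-nearest-neighbor classifier trained on a bootstrap sample stands in for each SGNN base classifier, and the pruning criterion shown here (validation accuracy) is only one plausible choice.

```python
import random

def one_nn_predict(member, x):
    """Label of the member's training point closest to x (squared Euclidean).
    Each member is a list of (features, label) pairs; a stand-in for an SGNN."""
    return min(member, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))[1]

def majority_vote(members, x):
    """Combine the base classifiers' outputs by plurality vote."""
    votes = [one_nn_predict(m, x) for m in members]
    return max(set(votes), key=votes.count)

def ensemble_accuracy(members, data):
    """Fraction of (x, y) pairs in data that the voted ensemble gets right."""
    return sum(majority_vote(members, x) == y for x, y in data) / len(data)

def prune(members, val):
    """Greedily drop members whose removal does not reduce validation accuracy.
    By construction the final ensemble is never worse on val than the input."""
    kept = list(members)
    for m in list(kept):
        if len(kept) > 1:
            trial = [k for k in kept if k is not m]
            if ensemble_accuracy(trial, val) >= ensemble_accuracy(kept, val):
                kept = trial
    return kept
```

A usage example on synthetic two-class data: train seven bootstrap members, then prune against a held-out validation split. The pruned ensemble is at least as accurate on the validation set, usually with fewer members — the cost/accuracy trade the abstract describes.

```python
random.seed(0)
data = ([((random.random(), random.random()), 0) for _ in range(20)] +
        [((random.random() + 2, random.random() + 2), 1) for _ in range(20)])
random.shuffle(data)
train, val = data[:30], data[30:]
members = [[random.choice(train) for _ in range(len(train))] for _ in range(7)]
pruned = prune(members, val)
```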