C4.5: Programs for Machine Learning
Machine Learning
Data Mining: Concepts and Techniques
Neural Networks for Pattern Recognition
Self-Organizing Maps
Efficient Mining from Large Databases by Query Learning. In ICML '00: Proceedings of the Seventeenth International Conference on Machine Learning
Improving Generalization Ability of Self-Generating Neural Networks Through Ensemble Averaging. In PADKK '00: Proceedings of the 4th Pacific-Asia Conference on Knowledge Discovery and Data Mining, Current Issues and New Applications
Pattern Classification (2nd Edition)
AAAI '96: Proceedings of the Thirteenth National Conference on Artificial Intelligence, Volume 1
ICANN '08: Proceedings of the 18th International Conference on Artificial Neural Networks, Part I
Fusers Based on Classifier Response and Discriminant Function: Comparative Study. In HAIS '08: Proceedings of the 3rd International Workshop on Hybrid Artificial Intelligence Systems
Some Remarks on Chosen Methods of Classifier Fusion Based on Weighted Voting. In HAIS '09: Proceedings of the 4th International Conference on Hybrid Artificial Intelligence Systems
Improving Performance of a Multiple Classifier System Using Self-Generating Neural Networks. In MCS '03: Proceedings of the 4th International Conference on Multiple Classifier Systems
Modification of Nested Hyperrectangle Exemplar as a Proposition of Information Fusion Method. In IDEAL '09: Proceedings of the 10th International Conference on Intelligent Data Engineering and Automated Learning
ICCOMP '06: Proceedings of the 10th WSEAS International Conference on Computers
A Survey of Multiple Classifier Systems as Hybrid Systems. Information Fusion
Recently, multiple classifier systems (MCS) have been applied in practice to improve classification accuracy. Self-generating neural networks (SGNN) are suitable base classifiers for an MCS because they require little parameter setting and learn quickly. However, the computation cost of an MCS grows in proportion to the number of SGNNs. In this paper, we propose a novel method for optimizing the structure of the SGNNs in an MCS, and we evaluate the optimized MCS with two sampling methods. Experiments compare the optimized MCS with an unoptimized MCS, an MCS based on C4.5, and k-nearest neighbor. The results show that the optimized MCS improves classification accuracy while reducing computation cost.
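The abstract describes an MCS that combines the outputs of many base classifiers, with cost proportional to the number of members. As a rough illustration of the combining step only (not the paper's SGNN base learner; here each member is a hypothetical 1-NN classifier trained on a bootstrap sample, and all names are invented for this sketch), a majority-vote ensemble can be written as:

```python
import random
from collections import Counter

def nn1_predict(train, x):
    """Classify x by its single nearest training example (squared Euclidean)."""
    return min(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))[1]

def bagged_ensemble(train, n_members=7, seed=0):
    """Draw n_members bootstrap samples; each sample acts as a 1-NN base classifier."""
    rng = random.Random(seed)
    return [[rng.choice(train) for _ in train] for _ in range(n_members)]

def vote_predict(ensemble, x):
    """Fuse the base classifiers' outputs by majority vote."""
    votes = Counter(nn1_predict(member, x) for member in ensemble)
    return votes.most_common(1)[0][0]

# Toy 2-D data: class 0 near the origin, class 1 near (5, 5).
train = [((0.1, 0.2), 0), ((0.3, 0.1), 0), ((0.2, 0.4), 0),
         ((5.1, 4.9), 1), ((4.8, 5.2), 1), ((5.0, 5.1), 1)]
ensemble = bagged_ensemble(train)
print(vote_predict(ensemble, (0.2, 0.3)))
print(vote_predict(ensemble, (5.0, 5.0)))
```

The sketch also makes the abstract's cost argument concrete: both training data and prediction time scale linearly with `n_members`, which is why pruning or optimizing the ensemble structure pays off.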