Connectionist learning procedures. Artificial Intelligence.
Parallel distributed processing: explorations in the microstructure of cognition, vol. 1: foundations
C4.5: programs for machine learning
Automated knowledge acquisition
Structural learning with forgetting. Neural Networks.
Machine Learning
Boosting the margin: A new explanation for the effectiveness of voting methods. ICML '97 Proceedings of the Fourteenth International Conference on Machine Learning.
Performance analysis of pattern classifier combination by plurality voting. Pattern Recognition Letters.
Ensemble of Evolving Neural Networks in Classification. Neural Processing Letters.
Ensemble of Genetic Programming Models for Designing Reactive Power Controllers. HIS '05 Proceedings of the Fifth International Conference on Hybrid Intelligent Systems.
CIMCA '05 Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation and International Conference on Intelligent Agents, Web Technologies and Internet Commerce Vol-1 (CIMCA-IAWTIC'06) - Volume 01
An ensemble method in hybrid real-coded genetic algorithm with pruning for data classification. AIAP'07 Proceedings of the 25th IASTED International Multi-Conference: Artificial Intelligence and Applications.
A new evolutionary system for evolving artificial neural networks. IEEE Transactions on Neural Networks.
We previously proposed an ensemble method using a hybrid real-coded genetic algorithm with pruning (HRGA/P) to achieve superior generalization ability in classification. To further improve its performance, this paper proposes a novel hybrid real-coded genetic algorithm with pruning (HRGA/Pr) in place of HRGA/P for training the classifiers. The crucial idea is to replace the evaluation of the entire classifier by Rumelhart's original regularizer with an evaluation of each unit, added as a criterion term that penalizes complexity. This additional criterion efficiently steers the search toward structurally simple classifiers and thereby improves their generalization ability. Accordingly, the resulting classifiers are expected to be structurally simple and to generalize well. Application of the proposed method to the iris classification problem demonstrates its effectiveness: on test data it achieves a higher classification rate (98.3%) than conventional algorithms such as backpropagation (94.1%) and structural learning with forgetting (95.0%).
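The per-unit regularization idea can be sketched as follows. This is a minimal illustration, not the authors' implementation: the weight layout (one row of incoming weights per hidden unit), the coefficient `lam`, and the way the penalty is combined with classification error into a fitness value are all assumptions made for the sketch.

```python
import numpy as np

def rumelhart_penalty(weights):
    """Rumelhart's regularizer: sum of w^2 / (1 + w^2) over the
    given weights. Small weights contribute ~w^2; large weights
    saturate toward 1, so the penalty roughly counts active weights."""
    w2 = np.asarray(weights) ** 2
    return float(np.sum(w2 / (1.0 + w2)))

def per_unit_penalties(W_in):
    """Evaluate the regularizer separately for each hidden unit
    (one row of incoming weights per unit), rather than once for
    the whole network. Units with a near-zero penalty are natural
    pruning candidates."""
    return np.array([rumelhart_penalty(row) for row in W_in])

def fitness(error, W_in, lam=0.01):
    """Hypothetical fitness for the genetic search (lower is better):
    classification error plus the summed per-unit complexity terms."""
    return error + lam * per_unit_penalties(W_in).sum()
```

In a genetic algorithm, such a fitness would let selection favor individuals that both classify well and keep each hidden unit simple, so that pruning removes units whose penalty contribution is negligible.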