In decision tree research, many efforts have focused on improving predictive accuracy, yet substantial gains were rarely achieved until the introduction of ensemble classifiers. In this paper, we propose an Evolutionary Attribute-Oriented Ensemble Classifier (EAOEC) that improves the accuracy of its sub-classifiers while maintaining diversity among them. EAOEC applies the idea of evolution to select a suitable attribute subset for building each sub-classifier. To avoid the heavy computational cost of evolution, EAOEC uses the Gini values obtained during the construction of one sub-tree as the evolutionary basis for building the next. Finally, EAOEC combines all sub-classifiers by uniform-weight voting, and experiments show that it efficiently improves predictive accuracy.
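The abstract's core idea can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration, not the authors' exact EAOEC algorithm: the function names (`build_ensemble`, `predict_ensemble`), the subset size, and the specific subset-update rule (keep the attribute with the highest Gini-based importance from the previous tree, replace the rest at random) are all assumptions introduced for illustration, using scikit-learn's Gini-criterion decision trees.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

def build_ensemble(X, y, n_trees=5, subset_size=2, seed=0):
    """Build sub-trees on evolving attribute subsets (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Start from a random attribute subset.
    subset = rng.choice(n_features, size=subset_size, replace=False)
    ensemble = []
    for _ in range(n_trees):
        tree = DecisionTreeClassifier(criterion="gini", random_state=0)
        tree.fit(X[:, subset], y)
        ensemble.append((tree, subset))
        # "Evolve" the next subset from Gini information gathered while
        # building this tree: keep the most important attribute, pair it
        # with randomly chosen new attributes (assumed update rule).
        best = subset[np.argmax(tree.feature_importances_)]
        others = [f for f in range(n_features) if f != best]
        rest = rng.choice(others, size=subset_size - 1, replace=False)
        subset = np.concatenate(([best], rest))
    return ensemble

def predict_ensemble(ensemble, X):
    """Combine all sub-classifiers with uniform-weight voting."""
    votes = np.stack([tree.predict(X[:, subset]) for tree, subset in ensemble])
    return np.array([np.bincount(col.astype(int)).argmax() for col in votes.T])

X, y = load_iris(return_X_y=True)
ens = build_ensemble(X, y)
acc = (predict_ensemble(ens, X) == y).mean()
```

Each sub-tree sees only its own attribute subset, which keeps the sub-classifiers diverse, while reusing the previous tree's Gini-based importances avoids re-running a full evolutionary search per tree.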