Abstract. Meta decision trees (MDTs) are a method for combining multiple classifiers. We present an integration of the algorithm MLC4.5 for learning MDTs into the Weka data mining suite. We compare classifier ensembles combined with MDTs to bagged and boosted decision trees, and to classifier ensembles combined with other methods: voting, and stacking with three different meta-level classifiers (ordinary decision trees, naive Bayes, and multi-response linear regression, MLR).
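The comparison described above can be illustrated with a small sketch. This is not the paper's MLC4.5/Weka implementation; it is an illustrative analogue in scikit-learn, where `LogisticRegression` stands in for MLR at the meta level and `AdaBoostClassifier`/`BaggingClassifier` play the roles of boosted and bagged trees. All estimator choices here are assumptions for illustration only.

```python
# Sketch: comparing the ensemble schemes mentioned in the abstract
# (voting, stacking with three meta-level learners, bagging, boosting)
# on a toy dataset. scikit-learn stand-ins, not the paper's Weka/MLC4.5 code.
from sklearn.datasets import load_iris
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              StackingClassifier, VotingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Heterogeneous base-level classifiers to be combined.
base = [("tree", DecisionTreeClassifier(random_state=0)),
        ("nb", GaussianNB()),
        ("lr", LogisticRegression(max_iter=1000))]

ensembles = {
    "voting": VotingClassifier(estimators=base),
    # Stacking with three different meta-level classifiers,
    # mirroring the abstract's decision tree / naive Bayes / MLR variants.
    "stack-tree": StackingClassifier(base, final_estimator=DecisionTreeClassifier(random_state=0)),
    "stack-nb": StackingClassifier(base, final_estimator=GaussianNB()),
    "stack-mlr": StackingClassifier(base, final_estimator=LogisticRegression(max_iter=1000)),
    # Homogeneous baselines: bagged and boosted trees.
    "bagging": BaggingClassifier(n_estimators=25, random_state=0),
    "boosting": AdaBoostClassifier(n_estimators=25, random_state=0),
}

for name, clf in ensembles.items():
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {acc:.3f}")
```

A real replication of the study would use Weka's `weka.classifiers.meta.Stacking`, `Bagging`, `AdaBoostM1`, and `Vote` meta-classifiers; the MDT combiner itself is the paper's own contribution and has no off-the-shelf scikit-learn equivalent.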