Even though many ensemble techniques have been proposed, there is as yet no clear picture of which method is best. In this study, we propose a technique that trains different learners, rather than multiple instances of the same learner, on different subsets of the same training dataset and combines their predictions by voting (for classification problems) or averaging (for regression problems). We compared the proposed ensemble with other well-known ensembles that use the same base learners, and the proposed technique achieved better accuracy in most cases.
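
The following is a minimal sketch of the kind of ensemble the abstract describes, assuming scikit-learn as the toolkit. The subset size, the without-replacement sampling scheme, and the particular base learners are illustrative assumptions, not the paper's exact specification.

    # Sketch only: subset sampling and learner choices are assumptions.
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.naive_bayes import GaussianNB
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Heterogeneous base learners, each fit on a different random subset
    # of the same training set (here, half of it, without replacement).
    learners = [DecisionTreeClassifier(random_state=0),
                GaussianNB(),
                KNeighborsClassifier()]
    for clf in learners:
        idx = rng.choice(len(X_train), size=len(X_train) // 2, replace=False)
        clf.fit(X_train[idx], y_train[idx])

    # Classification: combine the learners' predictions by majority vote.
    votes = np.stack([clf.predict(X_test) for clf in learners])
    y_pred = np.apply_along_axis(lambda v: np.bincount(v).argmax(),
                                 axis=0, arr=votes)
    print("voting accuracy:", np.mean(y_pred == y_test))

    # Regression: the same structure applies with regressors, replacing
    # the vote with np.mean over the stacked predictions.

The intent of training each learner on a different subset is to decorrelate the ensemble members; combined with heterogeneous learners, the vote (or average) can then correct errors that any single learner makes on its own.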