Improved Boosting Algorithms Using Confidence-rated Predictions
Machine Learning - The Eleventh Annual Conference on Computational Learning Theory
Combining Artificial Neural Nets: Ensemble and Modular Multi-Net Systems
Combining Pattern Classifiers: Methods and Algorithms
Efficient Margin Maximizing with Boosting
The Journal of Machine Learning Research
Boosting by weighting critical and erroneous samples
Neurocomputing
Moderating the outputs of support vector machine classifiers
IEEE Transactions on Neural Networks
A Dynamically Adjusted Mixed Emphasis Method for Building Boosting Ensembles
IEEE Transactions on Neural Networks
Multi-label ensemble based on variable pairwise constraint projection
Information Sciences: an International Journal
Double-base asymmetric AdaBoost
Neurocomputing
Real AdaBoost ensembles with weighted emphasis (RA-we) on erroneous and critical samples (those near the classification boundary) have recently been proposed and improve performance when an adequate combination of these two emphasis terms is selected. However, finding the optimal emphasis adjustment is not an easy task. In this paper, we propose fusing the outputs of RA-we ensembles trained with different emphasis adjustments by means of a generalized voting scheme. The resulting committee of RA-we ensembles retains the performance of its best component and can occasionally improve on it. Additionally, we present an ensemble selection strategy that removes from the committee those RA-we ensembles with very poor performance. Experimental results show that these committees frequently outperform both RA and RA-we with cross-validated emphasis.
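The fusion step described above can be sketched as follows. This is a minimal illustration, not the paper's exact method: it assumes each trained RA-we ensemble produces a real-valued score per sample (whose sign is the predicted class), weights each ensemble's vote by a held-out validation accuracy, and drops very poor ensembles via a simple accuracy threshold as a stand-in for the selection strategy. The function name, the threshold parameter, and the accuracy-proportional weighting are all assumptions introduced for this sketch.

```python
def committee_fuse(ensemble_outputs, val_accuracies, acc_threshold=0.5):
    """Fuse real-valued outputs of several ensembles by weighted voting.

    ensemble_outputs -- list of K lists, each holding the real-valued
        scores of one ensemble on the same N samples (sign = class).
    val_accuracies   -- K validation accuracies, used both to select
        committee members and to weight their votes (an assumption of
        this sketch, not necessarily the paper's weighting).
    acc_threshold    -- ensembles at or below this accuracy are removed,
        a simple stand-in for the selection strategy.
    Returns the committee's predicted labels in {-1, +1}.
    """
    kept = [(acc, out) for acc, out in zip(val_accuracies, ensemble_outputs)
            if acc > acc_threshold]
    if not kept:
        # Fall back to the single best ensemble if all were filtered out.
        kept = [max(zip(val_accuracies, ensemble_outputs),
                    key=lambda pair: pair[0])]

    total = sum(acc for acc, _ in kept)          # normalizer for the weights
    n_samples = len(kept[0][1])
    fused = [sum((acc / total) * out[i] for acc, out in kept)
             for i in range(n_samples)]          # weighted committee score
    return [1 if score >= 0 else -1 for score in fused]


# Three hypothetical ensembles scored on three samples; the third
# ensemble's poor validation accuracy (0.45) causes it to be dropped.
scores = [[+0.9, -0.2, +0.4],
          [+0.1, -0.8, -0.3],
          [-0.6, +0.7, +0.5]]
print(committee_fuse(scores, [0.85, 0.80, 0.45]))  # -> [1, -1, 1]
```

Weighting votes by validation accuracy is one reasonable generalized voting scheme; the key property claimed in the abstract, that the committee tracks its best member, comes from the poor ensembles being either down-weighted or removed before fusion.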