This paper introduces cw-AdaBoost, a robust variant of AdaBoost that perturbs instance weights to reduce variance error. It is particularly effective on datasets, such as microarray data, that have a large number of features and a small number of instances. The algorithm is compared with AdaBoost, Arcing, and MultiBoost on twelve gene expression datasets using 10-fold cross-validation, and it consistently achieves higher classification accuracy across all of them. Unlike other AdaBoost variants, cw-AdaBoost is not susceptible to the problems that arise when a base classifier achieves zero training error.
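The abstract does not spell out the perturbation scheme, so the following is only a minimal illustrative sketch of the general idea: standard AdaBoost with decision stumps, where the weight vector handed to each base learner is multiplied by random noise before training. The names `cw_adaboost_sketch`, the `noise` parameter, and the choice of lognormal perturbation are all assumptions for illustration, not the authors' actual method; clamping the weighted error away from zero is likewise one common way to keep a zero-error base classifier from producing an infinite vote.

```python
import numpy as np

def fit_stump(X, y, w):
    """Best weighted decision stump (feature, threshold, polarity, error)."""
    best = (0, 0.0, 1, np.inf)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (j, t, pol, err)
    return best

def cw_adaboost_sketch(X, y, n_rounds=10, noise=0.1, rng=None):
    """AdaBoost with per-round weight perturbation (illustrative stand-in).

    y must be in {-1, +1}. Each round, the current weights are multiplied
    by lognormal noise (an assumed scheme) before fitting the stump.
    """
    rng = np.random.default_rng(rng)
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(n_rounds):
        wp = w * rng.lognormal(0.0, noise, size=n)  # perturb weights
        wp /= wp.sum()
        j, t, pol, err = fit_stump(X, y, wp)
        err = max(err, 1e-10)  # avoid infinite alpha on zero-error stumps
        if err >= 0.5:
            break
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred)  # standard AdaBoost reweighting
        w /= w.sum()
        ensemble.append((alpha, j, t, pol))
    return ensemble

def predict(ensemble, X):
    """Sign of the weighted vote of all stumps."""
    score = np.zeros(len(X))
    for alpha, j, t, pol in ensemble:
        score += alpha * np.where(pol * (X[:, j] - t) >= 0, 1, -1)
    return np.where(score >= 0, 1, -1)
```

On a separable toy problem the perturbed ensemble still fits the training data, while the noise injected each round is what (in the paper's framing) decorrelates the base classifiers and reduces variance.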