Based on the use of different exponential bases to define class-dependent error bounds, we propose a new and highly efficient asymmetric boosting scheme, named AdaBoostDB (Double-Base). Unlike most other approaches in the literature, it is supported by a fully theoretical derivation procedure, so our algorithm preserves all the formal guarantees and properties of the original (cost-insensitive) AdaBoost, as does the state-of-the-art Cost-Sensitive AdaBoost algorithm. The key advantage of AdaBoostDB, however, is that our derivation scheme enables an extremely efficient conditional search procedure, dramatically simplifying and speeding up the training phase of the algorithm. Experiments on both synthetic and real datasets show that AdaBoostDB reduces training time by over 99% relative to Cost-Sensitive AdaBoost while producing the same cost-sensitive results. This computational advantage can make a decisive difference in problems involving huge pools of weak classifiers, where boosting techniques are commonly applied.
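To make the core idea concrete, the following is a minimal, hypothetical sketch (not the actual AdaBoostDB algorithm or its efficient conditional search) of a boosting loop in which the weight update uses a different exponential base per class, so that errors on one class are penalized more heavily than on the other. The stump learner, parameter names, and the choice of bases are all illustrative assumptions:

```python
import numpy as np

def stump_predict(X, feat, thresh, polarity):
    # +/-1 prediction of a decision stump on a single feature
    return polarity * np.where(X[:, feat] <= thresh, 1.0, -1.0)

def fit_stump(X, y, w):
    # Exhaustive search for the stump minimizing weighted error
    best = (0, 0.0, 1.0, np.inf)
    for feat in range(X.shape[1]):
        for thresh in np.unique(X[:, feat]):
            for pol in (1.0, -1.0):
                pred = stump_predict(X, feat, thresh, pol)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (feat, thresh, pol, err)
    return best

def double_base_boost(X, y, T=10, base_pos=np.e, base_neg=np.e):
    # Illustrative sketch: per-class exponential bases act as a
    # class-dependent cost. base_pos > base_neg makes mistakes on
    # positives costlier (equivalent to scaling alpha per class).
    n = len(y)
    w = np.ones(n) / n
    log_base = np.where(y > 0, np.log(base_pos), np.log(base_neg))
    ensemble = []
    for _ in range(T):
        feat, thresh, pol, err = fit_stump(X, y, w)
        err = min(max(err, 1e-12), 1 - 1e-12)  # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(X, feat, thresh, pol)
        # Class-dependent base in the exponential weight update
        w *= np.exp(-alpha * y * pred * log_base)
        w /= w.sum()
        ensemble.append((alpha, feat, thresh, pol))

    def predict(Xq):
        score = sum(a * stump_predict(Xq, f, t, p)
                    for a, f, t, p in ensemble)
        return np.sign(score)
    return predict
```

With `base_pos == base_neg == e` this reduces to standard AdaBoost's weight update; choosing unequal bases skews the implicit error bound toward one class, which is the asymmetry the double-base formulation exploits.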