The Strength of Weak Learnability. Machine Learning.
Original Contribution: Stacked generalization. Neural Networks.
Cryptographic limitations on learning Boolean formulae and finite automata. Journal of the ACM (JACM).
The weighted majority algorithm. Information and Computation.
Boosting a weak learning algorithm by majority. Information and Computation.
Boosting classifiers regionally. AAAI '98/IAAI '98 Proceedings of the Fifteenth National/Tenth Conference on Artificial Intelligence/Innovative Applications of Artificial Intelligence.
Combining Classifiers with Meta Decision Trees. Machine Learning.
Boosting the margin: A new explanation for the effectiveness of voting methods. ICML '97 Proceedings of the Fourteenth International Conference on Machine Learning.
Learning When to Trust Which Experts. EuroCOLT '97 Proceedings of the Third European Conference on Computational Learning Theory.
How boosting the margin can also boost classifier complexity. ICML '06 Proceedings of the 23rd International Conference on Machine Learning.
AdaBoost is a well-recognized ensemble method for improving prediction accuracy over a base learning algorithm. However, it is prone to overfitting the training instances [18]. Freund, Mansour and Schapire [5] established that an exponential weighting scheme for combining classifiers reduces the problem of overfitting. Helmbold, Kwek and Pitt [7] further showed, in the framework of prediction using a pool of experts, that an instance-based weighting scheme improves performance. Motivated by these results, we propose an instance-based exponential weighting scheme in which the weights of the base classifiers are adjusted according to the test instance x. A competency classifier c_i is constructed for each base classifier h_i to predict whether the base classifier's guess of x's label can be trusted, and the weight of h_i is adjusted accordingly. We show that this instance-based exponential weighting scheme enhances the performance of AdaBoost.
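The combination rule described above lends itself to a short sketch. The following Python snippet is a minimal illustration, not the authors' implementation: it trains a standard AdaBoost ensemble, fits one competency classifier c_i per base classifier h_i to predict whether h_i labels a given instance correctly, and at prediction time scales each base weight alpha_i by exp(+beta) or exp(-beta) depending on whether c_i trusts h_i on the test instance x. The exp(+/-beta) adjustment, the choice of depth-3 decision trees for the c_i, and the parameter beta are illustrative assumptions.

```python
# Sketch of instance-based exponential weighting on top of AdaBoost.
# The exact update rule of the paper is not reproduced here; the
# exp(+/-beta) scaling below is an assumed, illustrative variant.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, random_state=0)
X_train, X_test = X[:400], X[400:]
y_train, y_test = y[:400], y[400:]

# Standard AdaBoost: base classifiers h_i with fixed weights alpha_i.
ada = AdaBoostClassifier(n_estimators=25, random_state=0).fit(X_train, y_train)
alphas = ada.estimator_weights_[: len(ada.estimators_)]

# One competency classifier c_i per base classifier h_i, trained to
# predict whether h_i labels an instance correctly (1 = trust).
competency = []
for h in ada.estimators_:
    trust = (h.predict(X_train) == y_train).astype(int)
    c = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, trust)
    competency.append(c)

def predict_instance_weighted(X, beta=0.5):
    """Weighted-majority vote where h_i's weight on each instance is
    alpha_i * exp(+beta) if c_i trusts h_i there, alpha_i * exp(-beta)
    otherwise. beta is a free parameter of this sketch."""
    votes = np.zeros(len(X))
    for h, c, alpha in zip(ada.estimators_, competency, alphas):
        pred = np.where(h.predict(X) == 1, 1.0, -1.0)  # binary labels -> +/-1
        trust = c.predict(X)                           # 1 = trust h_i on x
        w = alpha * np.exp(np.where(trust == 1, beta, -beta))
        votes += w * pred
    return (votes > 0).astype(int)

print("plain AdaBoost accuracy:", ada.score(X_test, y_test))
print("instance-weighted accuracy:",
      (predict_instance_weighted(X_test) == y_test).mean())
```

In a fuller treatment the competency classifiers would be trained on held-out data rather than on the boosting sample, since h_i's training-set mistakes understate its test-time errors.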