A training algorithm for optimal margin classifiers
COLT '92 Proceedings of the fifth annual workshop on Computational learning theory
Improving regression estimation: Averaging methods for variance reduction with extensions to general convex measure optimization
An introduction to computational learning theory
Machine Learning
Averaging regularized estimators
Neural Computation
A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th annual ACM symposium on the theory of computing (STOC '94), May 23–25, 1994, and second annual European conference on computational learning theory (EuroCOLT '95), March 13–15, 1995
Boosting in the limit: maximizing the margin of learned ensembles
AAAI '98/IAAI '98 Proceedings of the fifteenth national/tenth conference on Artificial intelligence/Innovative applications of artificial intelligence
Improved Boosting Algorithms Using Confidence-rated Predictions
Machine Learning - The Eleventh Annual Conference on Computational Learning Theory
A Tutorial on Support Vector Machines for Pattern Recognition
Data Mining and Knowledge Discovery
Logistic Regression, AdaBoost and Bregman Distances
Machine Learning
Incorporating Prior Knowledge into Boosting
ICML '02 Proceedings of the Nineteenth International Conference on Machine Learning
Boosting the margin: A new explanation for the effectiveness of voting methods
ICML '97 Proceedings of the Fourteenth International Conference on Machine Learning
A Unified Bias-Variance Decomposition for Zero-One and Squared Loss
Proceedings of the Seventeenth National Conference on Artificial Intelligence and Twelfth Conference on Innovative Applications of Artificial Intelligence
MadaBoost: A Modification of AdaBoost
COLT '00 Proceedings of the Thirteenth Annual Conference on Computational Learning Theory
Leveraging the margin more carefully
ICML '04 Proceedings of the twenty-first international conference on Machine learning
Robust boosting and its relation to bagging
Proceedings of the eleventh ACM SIGKDD international conference on Knowledge discovery in data mining
An empirical comparison of supervised learning algorithms
ICML '06 Proceedings of the 23rd international conference on Machine learning
How boosting the margin can also boost classifier complexity
ICML '06 Proceedings of the 23rd international conference on Machine learning
Avoiding Boosting Overfitting by Removing Confusing Samples
ECML '07 Proceedings of the 18th European conference on Machine Learning
Boosting methods, while among the best classification methods developed so far, are known to degrade in performance on noisy data and overlapping classes. In this paper we propose a new upper generalization bound for weighted averages of hypotheses, which uses posterior estimates for training objects and is based on reducing the binary classification problem with overlapping classes to a deterministic problem. Given accurate posterior estimates, the proposed bound is lower than the existing bound of Schapire et al. [25]. We design an AdaBoost-like algorithm which optimizes the proposed generalization bound, and show that, combined with good posterior estimates, it performs better than standard AdaBoost on real-world data sets.
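For context, the baseline that the proposed algorithm is compared against is standard AdaBoost (Freund & Schapire). The sketch below is a minimal NumPy implementation of that standard algorithm with decision stumps as weak learners; it is an illustration of the baseline, not of the paper's modified, bound-optimizing algorithm, and the stump learner is an assumed placeholder for any weighted base learner.

```python
import numpy as np

def stump_learner(X, y, w):
    """Weighted decision stump: best single-feature threshold split."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = s * np.where(X[:, j] <= t, 1, -1)
                err = np.sum(w * (pred != y))
                if best is None or err < best[0]:
                    best = (err, j, t, s)
    _, j, t, s = best
    return lambda Z: s * np.where(Z[:, j] <= t, 1, -1)

def adaboost(X, y, n_rounds=20):
    """Standard AdaBoost for labels y in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                # uniform initial weights
    alphas, hyps = [], []
    for _ in range(n_rounds):
        h = stump_learner(X, y, w)
        pred = h(X)
        err = max(np.sum(w * (pred != y)), 1e-12)
        if err >= 0.5:                     # weak learner no better than chance
            break
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)     # up-weight misclassified points
        w /= w.sum()
        alphas.append(alpha)
        hyps.append(h)
    # Final classifier: sign of the weighted vote of weak hypotheses
    return lambda Z: np.sign(sum(a * h(Z) for a, h in zip(alphas, hyps)))
```

The exponential reweighting step is exactly what makes AdaBoost sensitive to noisy and overlapping-class data: mislabeled or ambiguous points keep being up-weighted, which is the failure mode the abstract's posterior-based bound is designed to address.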