It is well known that mainstream boosting algorithms such as AdaBoost perform poorly at estimating class conditional probabilities. In this paper, we analyze, in light of this problem, a recent algorithm, UNN, which leverages nearest neighbors while minimizing a convex loss. Our contribution is threefold. First, we show that there exists a subclass of surrogate losses, elsewhere called balanced, whose minimization yields simple and statistically efficient estimators of Bayes posteriors. Second, we establish explicit convergence rates of UNN towards these estimators, for any such surrogate loss, under a Weak Learning Assumption that parallels that of classical boosting results. Third and last, we provide experiments and comparisons on synthetic and real datasets, including the challenging SUN computer vision database. The results clearly show that boosting nearest neighbors can yield highly accurate estimators, sometimes more than a hundred times more accurate than those of other contenders such as support vector machines.
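To make the idea concrete, the following is a minimal sketch, not the authors' exact algorithm, of a leveraged nearest-neighbor posterior estimator in the spirit described above: each training example receives a leveraging coefficient fit by gradient descent on a convex surrogate (here the logistic loss, one member of the balanced family), and posteriors are read off through the loss's inverse link. All function names, the update rule, and the hyperparameters are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch of a UNN-style leveraged nearest-neighbor
# posterior estimator. Labels y are in {-1, +1}. The score of a
# point is the alpha-weighted vote of its k nearest training
# examples; alphas are fit by gradient descent on the logistic
# surrogate loss, and P(y=+1|x) is recovered via the sigmoid link.

def knn_indices(X, x, k):
    """Indices of the k nearest training points to x (Euclidean)."""
    d = np.linalg.norm(X - x, axis=1)
    return np.argsort(d)[:k]

def fit_leveraging(X, y, k=5, T=50, eta=0.1):
    """Fit one leveraging coefficient per training example."""
    n = len(X)
    alpha = np.zeros(n)
    neighbors = [knn_indices(X, X[i], k) for i in range(n)]
    for _ in range(T):
        # Leveraged score of each training point from its neighbors.
        H = np.array([alpha[nb] @ y[nb] for nb in neighbors])
        # Example weights: derivative of the logistic loss at y*H.
        w = 1.0 / (1.0 + np.exp(y * H))
        # Accumulate the gradient of the loss w.r.t. each alpha_j.
        grad = np.zeros(n)
        for i, nb in enumerate(neighbors):
            grad[nb] += -w[i] * y[i] * y[nb]
        alpha -= eta * grad
    return alpha

def posterior(X, y, alpha, x, k=5):
    """Estimate P(y = +1 | x) through the logistic link."""
    nb = knn_indices(X, x, k)
    return 1.0 / (1.0 + np.exp(-(alpha[nb] @ y[nb])))
```

On well-separated data this sketch pushes posteriors towards 0 or 1 in the two class regions; the paper's point is that, for balanced surrogates, such link-based estimates are statistically well behaved, unlike raw AdaBoost scores.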