A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, special issue on the 26th Annual ACM Symposium on Theory of Computing (STOC '94, May 23–25, 1994) and the Second Annual European Conference on Computational Learning Theory (EuroCOLT '95, March 13–15, 1995).
Improved boosting algorithms using confidence-rated predictions. COLT '98: Proceedings of the Eleventh Annual Conference on Computational Learning Theory.
Reduction Techniques for Instance-Based Learning Algorithms. Machine Learning.
An improved bound on the finite-sample risk of the nearest neighbor rule. Pattern Recognition Letters.
Boosting Neighborhood-Based Classifiers. ICML '01: Proceedings of the Eighteenth International Conference on Machine Learning.
Instance Pruning as an Information Preserving Problem. ICML '00: Proceedings of the Seventeenth International Conference on Machine Learning.
On Feature Selection: A New Filter Model. Proceedings of the Twelfth International Florida Artificial Intelligence Research Society Conference.
Reducing multiclass to binary: a unifying approach for margin classifiers. The Journal of Machine Learning Research.
Estimation of Dependences Based on Empirical Data (Springer Series in Statistics).
Improved heterogeneous distance functions. Journal of Artificial Intelligence Research.
Solving multiclass learning problems via error-correcting output codes. Journal of Artificial Intelligence Research.
Identifying and eliminating mislabeled training instances. AAAI '96: Proceedings of the Thirteenth National Conference on Artificial Intelligence, Volume 1.
Nonlinear Boosting Projections for Ensemble Construction. The Journal of Machine Learning Research; ECML '07: Proceedings of the 18th European Conference on Machine Learning.
Graph-Based Discrete Differential Geometry for Critical Instance Filtering. ECML PKDD '09: Proceedings of the European Conference on Machine Learning and Knowledge Discovery in Databases, Part II.
Boosting nearest neighbors for the efficient estimation of posteriors. ECML PKDD '12: Proceedings of the 2012 European Conference on Machine Learning and Knowledge Discovery in Databases, Part I.
Bagging and Boosting statistical machine translation systems. Artificial Intelligence.
So far, boosting has been used to improve the quality of moderately accurate learning algorithms by weighting and combining many of their weak hypotheses into a final classifier with theoretically high accuracy. In recent work (Sebban, Nock and Lallich, 2001), we adapted boosting to data reduction techniques. In that context, the objective was not only to improve the success rate, but also to reduce the time and space complexities incurred by the storage requirements of costly learning algorithms, such as nearest-neighbor classifiers. In that framework, each weak hypothesis, which is usually built and weighted from the learning set, is replaced by a single learning instance. The weight given by boosting then defines the relevance of the instance, and a statistical test decides whether the instance can be discarded without damaging further classification tasks. In Sebban, Nock and Lallich (2001), we addressed two-class problems. The aim of the present paper is to relax this constraint and extend our contribution to multiclass problems. Beyond data reduction, experimental results on twenty-three datasets show the benefits that our boosting-derived weighting rule brings to weighted nearest-neighbor classifiers.
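The idea of replacing each weak hypothesis with a single learning instance can be pictured with a small sketch. The Python snippet below is an illustrative toy only, not the paper's exact algorithm: the function name `relevance_weights`, the choice of `k`, and the AdaBoost-style multiplicative update are all assumptions. It reweights training instances by how hard they are for a distribution-weighted, leave-one-out k-nearest-neighbor vote to classify, so the final weight can serve as a per-instance relevance score:

```python
import numpy as np

def relevance_weights(X, y, rounds=20, k=3):
    """Toy boosting-style relevance weights over training instances.

    Illustrative sketch (hypothetical helper, not the authors' exact rule):
    each round classifies every instance by a leave-one-out k-nearest-neighbor
    vote weighted by the current boosting distribution, then reweights the
    instances AdaBoost-style, so hard-to-classify instances gain weight.
    """
    n = len(X)
    d = np.full(n, 1.0 / n)                # boosting distribution over instances
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(D, np.inf)            # leave-one-out: never vote for yourself
    knn = np.argsort(D, axis=1)[:, :k]     # each instance's k nearest neighbors
    for _ in range(rounds):
        # distribution-weighted k-NN vote for each instance
        pred = np.empty(n, dtype=y.dtype)
        for i in range(n):
            classes = np.unique(y[knn[i]])
            scores = [d[knn[i]][y[knn[i]] == c].sum() for c in classes]
            pred[i] = classes[int(np.argmax(scores))]
        miss = pred != y
        eps = d[miss].sum()                # weighted error of this round's vote
        if eps <= 0.0 or eps >= 0.5:       # weak-learning condition violated: stop
            break
        alpha = 0.5 * np.log((1.0 - eps) / eps)
        d *= np.exp(np.where(miss, alpha, -alpha))  # AdaBoost-style reweighting
        d /= d.sum()
    return d
```

Under this toy update, a mislabeled point surrounded by instances of another class accumulates weight quickly, which is the kind of signal that a statistical test, as described above, could act on when deciding which instances to keep or discard.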