The k-nearest neighbors (k-NN) classifier is one of the most widely used classification methods thanks to several appealing properties, such as good generalization and easy implementation. Despite its simplicity, it usually matches, and even beats, more sophisticated methods. However, no successful method for applying boosting to k-NN has been reported so far. Because boosting has proved very effective at improving the generalization of many classification algorithms, an appropriate application of boosting to k-nearest neighbors is of great interest. Ensemble methods rely on the instability of the base classifiers to improve performance; since k-NN is fairly stable with respect to resampling, such methods fail to improve the k-NN classifier. On the other hand, k-NN is very sensitive to input selection, so ensembles based on subspace methods are able to improve on a single k-NN classifier. In this paper we exploit this sensitivity of k-NN to the input space to develop two methods for boosting k-NN. Both approaches modify the view of the data that each classifier receives so that the accurate classification of difficult instances is favored. Compared with the single classifier, bagging, and the random subspace method, the two approaches achieve a marked and statistically significant improvement in generalization error. The comparison is performed on a large test bed of 45 problems from the UCI Machine Learning Repository. A further study of noise tolerance shows that the proposed methods are less affected by class label noise than the standard methods.
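The abstract only outlines the idea of combining boosting with input-space perturbation. As a rough illustration, a minimal sketch is given below, assuming an AdaBoost-style instance reweighting paired with random feature subspaces; the paper's two actual algorithms are not specified in this abstract, and every function name, parameter, and design choice here (e.g. `boosted_subspace_knn`, `subspace_frac`) is a hypothetical assumption, not the authors' method.

```python
# Hypothetical sketch: boosting k-NN by perturbing the input space rather than
# relying on resampling alone. Each round draws a random feature subspace (the
# source of diversity, since k-NN is stable under resampling) and emphasizes
# currently difficult instances via AdaBoost-style weights. Illustrative only;
# this is NOT the paper's algorithm.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def boosted_subspace_knn(X, y, n_rounds=10, subspace_frac=0.5, k=3, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.full(n, 1.0 / n)           # instance weights, uniform at the start
    members = []                      # (feature subset, fitted k-NN, vote weight)
    for _ in range(n_rounds):
        # Modify the "view" of the data: draw a random feature subspace.
        feats = rng.choice(d, size=max(1, int(subspace_frac * d)), replace=False)
        # Favor difficult instances by sampling the training set according to w.
        idx = rng.choice(n, size=n, replace=True, p=w)
        clf = KNeighborsClassifier(n_neighbors=k).fit(X[idx][:, feats], y[idx])
        pred = clf.predict(X[:, feats])
        err = np.sum(w * (pred != y))
        if err >= 0.5:                # too weak under the current weighting
            continue
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-10))
        members.append((feats, clf, alpha))
        # Misclassified instances gain weight for the next round.
        w *= np.exp(alpha * np.where(pred != y, 1.0, -1.0))
        w /= w.sum()
    return members

def ensemble_predict(members, X, classes):
    # Weighted vote over the ensemble members, each using its own subspace.
    votes = np.zeros((X.shape[0], len(classes)))
    for feats, clf, alpha in members:
        pred = clf.predict(X[:, feats])
        for ci, c in enumerate(classes):
            votes[:, ci] += alpha * (pred == c)
    return classes[np.argmax(votes, axis=1)]
```

Under these assumptions, usage would be `members = boosted_subspace_knn(X_train, y_train)` followed by `ensemble_predict(members, X_test, np.unique(y_train))`; the key point the sketch tries to convey is that the diversity comes from the per-round feature subset, while the boosting weights steer each view toward the hard instances.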