Though the k-nearest neighbor (k-NN) pattern classifier is an effective learning algorithm, it can produce large models. To compensate, a number of variant algorithms have been developed that condense the k-NN model at the expense of accuracy. To recover the accuracy lost by these condensed models, we present a direct boosting algorithm for the k-NN classifier that creates an ensemble of models with locally modified distance weighting. An empirical study on 10 standard databases from the UCI repository shows that this Boosted k-NN algorithm improves generalization accuracy on the majority of the datasets and never performs worse than standard k-NN.
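The abstract does not spell out the exact form of the locally modified distance weighting, so the sketch below substitutes a generic boosting-style scheme as a stand-in: each round runs a leave-one-out weighted k-NN pass over the training set, up-weights misclassified points, and stores one per-point weight vector per round; the ensemble then predicts by majority vote. All function names and parameters here are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def weighted_knn_predict(X_tr, y_tr, w, X, k=3):
    # Each of the k nearest training points votes for its class
    # with strength w[i]; the highest-scoring class wins.
    preds = np.empty(len(X), dtype=y_tr.dtype)
    classes = np.unique(y_tr)
    for j, x in enumerate(X):
        d = np.linalg.norm(X_tr - x, axis=1)
        nn = np.argsort(d)[:k]
        scores = [w[nn][y_tr[nn] == c].sum() for c in classes]
        preds[j] = classes[int(np.argmax(scores))]
    return preds

def boosted_knn_fit(X, y, k=3, rounds=5, factor=2.0):
    # Each round: leave-one-out prediction with the current point
    # weights, then multiply the weights of misclassified points by
    # `factor`. One weight vector is kept per round (one ensemble member).
    n = len(X)
    w = np.ones(n)
    ensemble = []
    for _ in range(rounds):
        loo = np.empty(n, dtype=y.dtype)
        for i in range(n):
            mask = np.arange(n) != i
            loo[i] = weighted_knn_predict(
                X[mask], y[mask], w[mask], X[i:i + 1], k)[0]
        ensemble.append(w.copy())
        w = np.where(loo != y, w * factor, w)
    return ensemble

def boosted_knn_predict(X_tr, y_tr, ensemble, X, k=3):
    # Final prediction is a majority vote across ensemble members.
    all_preds = np.stack(
        [weighted_knn_predict(X_tr, y_tr, w, X, k) for w in ensemble])
    final = []
    for col in all_preds.T:
        vals, counts = np.unique(col, return_counts=True)
        final.append(vals[np.argmax(counts)])
    return np.array(final)
```

Because every member shares the same stored training set and differs only in its weight vector, the ensemble adds no memory beyond one vector per round, which is in the spirit of boosting a lazy learner without growing the model.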