The performance of the Nearest Neighbor (NN) classifier is known to be sensitive to the distance (or similarity) function used to classify a test instance. Another major disadvantage of NN is that it uses all training instances in the generalization phase, which can lead to slow execution and high storage requirements on large datasets. In past research, many solutions have been proposed to handle one or both of these problems. The scheme proposed in this paper tackles both by assigning a weight to each training instance; the weight is used in the generalization phase when computing the distance (or similarity) of a query pattern to that instance. The basic NN classifier can be viewed as a special case of this scheme in which all training instances receive equal weight. Using this weighted similarity measure, we propose a learning algorithm that adjusts the instance weights so as to maximize the leave-one-out (LV1) classification rate of the NN rule. At the same time, the algorithm reduces the size of the training set and can therefore be viewed as a powerful instance reduction technique: an instance with zero weight plays no part in the generalization phase and can effectively be removed from the training set. We show that our scheme performs comparably to or better than several recent methods proposed in the literature for learning the distance function and/or for prototype reduction.
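As a rough illustration of the generalization phase described above, the sketch below implements a weighted 1-NN rule in which each training instance's Euclidean distance to the query is divided by that instance's weight. This particular weighting formula, and the function and variable names, are illustrative assumptions rather than the paper's exact definition; they do, however, reproduce the two properties the abstract states: equal weights recover the plain NN rule, and a zero-weight instance is effectively removed from the training set.

```python
import numpy as np

def weighted_nn_predict(X_train, y_train, weights, query):
    """Weighted 1-NN sketch (illustrative, not the paper's exact formula).

    Each instance's Euclidean distance to the query is divided by its
    weight, so larger weights make an instance 'closer'.  Setting all
    weights to 1 recovers the basic NN rule; a weight of 0 excludes the
    instance from the generalization phase entirely.
    """
    active = weights > 0                     # zero-weight instances are ignored
    dists = np.linalg.norm(X_train[active] - query, axis=1) / weights[active]
    return y_train[active][np.argmin(dists)]

# Toy example: three training instances, two classes.
X = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
y = np.array([0, 0, 1])
q = np.array([4.0, 4.0])

print(weighted_nn_predict(X, y, np.array([1.0, 1.0, 1.0]), q))  # plain NN -> 1
print(weighted_nn_predict(X, y, np.array([1.0, 1.0, 0.0]), q))  # [5,5] removed -> 0
```

Under this convention, learning the weights (e.g., by maximizing the LV1 rate) and pruning the training set become the same operation: driving a weight to zero both reshapes the decision boundary and shrinks the stored instance set.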