Discriminant Adaptive Nearest Neighbor Classification. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Artificial Intelligence Review - Special Issue on Lazy Learning.
A re-examination of text categorization methods. Proceedings of the 22nd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval.
Unsupervised Feature Selection Using Feature Similarity. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Density-Based Multiscale Data Condensation. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Locally Adaptive Metric Nearest-Neighbor Classification. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Pattern Classification (2nd Edition).
Formulating distance functions via the kernel trick. Proceedings of the Eleventh ACM SIGKDD International Conference on Knowledge Discovery in Data Mining.
Improving nearest neighbor rule with a simple adaptive distance measure. Pattern Recognition Letters.
Information-theoretic metric learning. Proceedings of the 24th International Conference on Machine Learning.
Creating diverse nearest-neighbour ensembles using simultaneous metaheuristic feature selection. Pattern Recognition Letters.
Similarity Search: The Metric Space Approach.
k-nearest-neighbor Bayes-risk estimation. IEEE Transactions on Information Theory.
A learning scheme for a fuzzy k-NN rule. Pattern Recognition Letters.
Expert Systems with Applications: An International Journal.
In this paper, we propose a modified version of the k-nearest neighbor (kNN) algorithm. We first introduce a new affinity function for the distance measure between a test point and a training point, based on a local-learning approach. A new similarity function built on this affinity function is then proposed for classifying the test patterns. The widely used convention for choosing k, namely k = [√N], is employed, where N is the number of data points used for training. The proposed modified kNN algorithm is applied to fifteen numerical datasets from the UCI machine learning repository, using both 5-fold and 10-fold cross-validation. The average classification accuracy obtained by our method is found to exceed that of several well-known classification algorithms.
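The k = [√N] convention mentioned above can be sketched in code. The paper's affinity and similarity functions are not detailed in this abstract, so the sketch below is a plain kNN classifier with ordinary Euclidean distance and majority voting, with k defaulting to the floor of √N; all function names are illustrative, not taken from the paper:

```python
import math
from collections import Counter

def euclidean(a, b):
    """Euclidean distance between two equal-length numeric tuples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train_X, train_y, query, k=None):
    """Classify `query` by majority vote among its k nearest training points.

    If k is not given, use the convention from the abstract: k = floor(sqrt(N)),
    where N is the number of training points.
    """
    if k is None:
        k = max(1, math.isqrt(len(train_X)))
    # Indices of the k training points closest to the query.
    nearest = sorted(range(len(train_X)),
                     key=lambda i: euclidean(train_X[i], query))[:k]
    votes = Counter(train_y[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Tiny illustrative dataset: two well-separated clusters (N = 6, so k = 2).
X = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
y = ['a', 'a', 'a', 'b', 'b', 'b']
print(knn_predict(X, y, (0.2, 0.2)))  # → a
print(knn_predict(X, y, (5.5, 5.5)))  # → b
```

Replacing `euclidean` with the paper's affinity-based measure, and the majority vote with its similarity function, would recover the proposed method from this generic skeleton.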