The nearest neighbor (NN) rule is one of the simplest and most important methods in pattern recognition. In this paper, we propose a kernel difference-weighted k-nearest neighbor (KDF-KNN) method for pattern classification. The proposed method defines the weighted KNN rule as a constrained optimization problem, and we then propose an efficient solution for computing the weights of the different nearest neighbors. Unlike traditional distance-weighted KNN, which assigns weights to the nearest neighbors according to their distance from the unclassified sample, difference-weighted KNN weighs the nearest neighbors using the correlation of the differences between the unclassified sample and its nearest neighbors. To take nonlinear structure information into account, we further extend difference-weighted KNN to its kernel version, KDF-KNN. Our experimental results indicate that KDF-KNN performs much better than the original KNN and distance-weighted KNN methods, and is comparable to or better than several state-of-the-art methods in terms of classification accuracy.
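The abstract only states that the neighbor weights are obtained from a constrained optimization problem built on the differences between the unclassified sample and its nearest neighbors. The following is a minimal sketch of one plausible reading of that idea, assuming the weights come from a regularized least-squares reconstruction of the query from its neighbors with the weights constrained to sum to one; the function name, the regularization term, and the weighted voting step are illustrative choices, not the authors' exact formulation.

import numpy as np

def difference_weighted_knn_predict(X_train, y_train, x_query, k=5, reg=1e-3):
    """Sketch of a difference-weighted k-NN classifier (illustrative, not the
    authors' exact method). Neighbor weights are computed from the Gram
    (correlation) matrix of the differences between the query and its k
    nearest neighbors, assuming a sum-to-one least-squares formulation."""
    # Find the k nearest neighbors of the query in Euclidean distance.
    dists = np.linalg.norm(X_train - x_query, axis=1)
    nn_idx = np.argsort(dists)[:k]
    neighbors = X_train[nn_idx]

    # Gram matrix of the neighbor-query differences (their correlations).
    D = neighbors - x_query                          # k x d difference matrix
    G = D @ D.T                                      # k x k Gram matrix
    G = G + reg * (np.trace(G) / k + 1e-12) * np.eye(k)  # regularize for stability

    # Weights from the constrained least-squares problem, normalized to sum to one.
    w = np.linalg.solve(G, np.ones(k))
    w /= w.sum()

    # The class with the largest accumulated neighbor weight wins.
    labels = y_train[nn_idx]
    classes = np.unique(labels)
    scores = np.array([w[labels == c].sum() for c in classes])
    return classes[np.argmax(scores)]

# Toy usage example with a two-class dataset.
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1], [0.2, 0.1]])
y = np.array([0, 0, 1, 1, 0])
print(difference_weighted_knn_predict(X, y, np.array([0.15, 0.15]), k=3))

Under the same assumptions, the kernel version would replace the inner products in the difference Gram matrix with kernel evaluations, i.e. G[i, j] = k(x_i, x_j) - k(x_i, x) - k(x, x_j) + k(x, x), so that the weights reflect nonlinear structure in the induced feature space.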