Artificial Intelligence Review - Special issue on lazy learning
The literature offers a variety of feature-weighting and k-value-selection approaches for improving the nearest-neighbour technique. In this work, we present an evolutionary approach called k-Label Dependent Evolutionary Distance Weighting (kLDEDW), which computes a set of local weights for each class together with an optimal value of k. We thus attempt two improvements simultaneously: we locally transform the feature space to increase the accuracy of the k-nearest-neighbour rule, while we search the training data for the best value of k. Rigorous statistical tests show that our approach outperforms both the standard k-nearest-neighbour rule and several approaches based on local weighting.
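To illustrate the core idea of a label-dependent weighted distance, the sketch below shows a k-NN classification rule in which each class has its own feature-weight vector: the distance from a query to a training prototype is weighted according to that prototype's class. This is a minimal, hypothetical illustration only; in kLDEDW the weight vectors and the value of k are found by an evolutionary search over the training data, which is not reproduced here, and the function and variable names are our own.

```python
import numpy as np

def label_dependent_knn_predict(X_train, y_train, class_weights, x_query, k=3):
    """Classify x_query with a k-NN rule whose metric depends on the class
    of each training prototype (illustrative sketch, not the authors' code).

    class_weights maps each class label to a per-feature weight vector.
    """
    # Weighted Euclidean distance to every training point, using the
    # weight vector associated with that point's class label.
    dists = np.array([
        np.sqrt(np.sum(class_weights[y] * (x_query - x) ** 2))
        for x, y in zip(X_train, y_train)
    ])
    nearest = np.argsort(dists)[:k]            # indices of the k nearest prototypes
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]           # majority vote among the k neighbours

# Toy example: two well-separated classes, uniform weights per class.
X_train = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [5.0, 6.0]])
y_train = np.array([0, 0, 1, 1])
class_weights = {0: np.array([1.0, 1.0]), 1: np.array([1.0, 1.0])}
print(label_dependent_knn_predict(X_train, y_train, class_weights,
                                  np.array([0.2, 0.5]), k=3))
```

Because the weights multiply squared per-feature differences, setting a class's weight for an irrelevant feature close to zero effectively removes that feature when measuring distances to prototypes of that class, which is what the evolutionary search exploits.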