In this paper, a novel prototype reduction algorithm is proposed which aims to reduce the storage requirement and improve the online speed of a K-nearest neighbor (KNN) classifier while retaining the same level of accuracy. To achieve this goal, our proposed algorithm learns a weighted similarity function for the KNN classifier by maximizing the leave-one-out cross-validation accuracy. Unlike the classical methods PW, LPD, and WDNN, which only work with K=1, our algorithm works with K≥1. This flexibility gives our learning algorithm superior classification accuracy and noise robustness. The proposed approach is assessed through experiments on twenty real-world benchmark data sets. In all of these experiments, the proposed approach dramatically reduces the storage requirement and online time of KNN while achieving equal or better accuracy, and its results are comparable to those of several prototype reduction methods proposed in the literature.
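To make the weight-learning component concrete, below is a minimal sketch of the general idea, not the authors' method: since the paper's exact formulation is not given here, the smooth neighbor-agreement surrogate for leave-one-out (LOO) accuracy, the Nelder-Mead optimizer, and all function and parameter names (weighted_dists, soft_loo_loss, beta, k) are assumptions made for illustration. The sketch learns per-feature weights for a weighted Euclidean distance so that LOO KNN accuracy improves; it does not implement the prototype reduction step.

```python
# Illustrative sketch only (not the paper's algorithm): learn per-feature
# weights for a KNN distance by maximizing a smoothed LOO accuracy.
import numpy as np
from scipy.optimize import minimize

def weighted_dists(X, w):
    # Pairwise squared Euclidean distances under per-feature weights w >= 0.
    Xw = X * np.sqrt(np.maximum(w, 0.0))
    sq = np.sum(Xw ** 2, axis=1)
    d = sq[:, None] + sq[None, :] - 2.0 * Xw @ Xw.T
    np.fill_diagonal(d, np.inf)  # exclude each point from its own neighbors (LOO)
    return np.maximum(d, 0.0)

def loo_knn_accuracy(w, X, y, k=3):
    # Hard leave-one-out accuracy of weighted KNN (majority vote over k neighbors).
    d = weighted_dists(X, w)
    nn = np.argsort(d, axis=1)[:, :k]
    pred = np.array([np.bincount(y[row]).argmax() for row in nn])
    return float(np.mean(pred == y))

def soft_loo_loss(w, X, y, k=3, beta=5.0):
    # Smooth surrogate for LOO accuracy (an assumption for this sketch):
    # softmax-weighted agreement between each point's label and the labels
    # of its k nearest neighbors under the current weights.
    d = weighted_dists(X, w)
    nn = np.argsort(d, axis=1)[:, :k]
    dk = d[np.arange(len(X))[:, None], nn]
    a = np.exp(-beta * (dk - dk.min(axis=1, keepdims=True)))  # stabilized softmax
    a /= a.sum(axis=1, keepdims=True)
    agree = (y[nn] == y[:, None]).astype(float)
    return -float(np.mean(np.sum(a * agree, axis=1)))  # minimize = maximize agreement

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    # Synthetic data: only feature 0 is informative, so a good weight vector
    # should emphasize it relative to the noise features.
    y = (X[:, 0] + 0.1 * rng.normal(size=200) > 0).astype(int)

    w0 = np.ones(X.shape[1])
    res = minimize(soft_loo_loss, w0, args=(X, y), method="Nelder-Mead")
    w = np.maximum(res.x, 0.0)
    print("LOO accuracy, uniform weights:", loo_knn_accuracy(w0, X, y))
    print("LOO accuracy, learned weights:", loo_knn_accuracy(w, X, y))
```

A derivative-free optimizer is used here because the hard LOO accuracy is piecewise constant in the weights; with the smooth surrogate, a gradient-based optimizer would be an equally reasonable choice. Note that the code works for any k ≥ 1, mirroring the K≥1 flexibility the abstract emphasizes.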