Finding Prototypes For Nearest Neighbor Classifiers
IEEE Transactions on Computers
We have developed two novel methods to improve K-nearest neighbor (K-NN) classification. First, we introduce a new technique that greatly reduces the template size, which significantly improves classification time with no loss of accuracy. Second, we introduce a preprocessing procedure that excludes a large portion of the prototype patterns unlikely to match the unknown pattern, further accelerating the classification procedure considerably. Simulation results on the GSC digit recognizer [1] show that adding the two procedures to K-NN search yields classification 7 times faster than the original, with no decay in classification accuracy.
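The abstract does not specify the template-reduction algorithm, so as a rough illustration of the general idea, the sketch below uses Hart's condensed nearest neighbor rule: it retains only the subset of training points needed for a 1-NN rule to classify the full training set correctly. The function name `condense_prototypes` and the toy two-cluster data are assumptions for this example, not part of the paper.

```python
import numpy as np

def condense_prototypes(X, y):
    """Hart's condensed nearest neighbor: keep a subset of training
    points that still classifies the full training set with 1-NN."""
    keep = [0]  # seed the prototype set with the first training point
    changed = True
    while changed:
        changed = False
        for i in range(len(X)):
            if i in keep:
                continue
            # classify point i by 1-NN against the current prototypes
            d = np.linalg.norm(X[keep] - X[i], axis=1)
            pred = y[keep][np.argmin(d)]
            if pred != y[i]:      # misclassified -> add it as a prototype
                keep.append(i)
                changed = True
    return np.array(keep)

# toy data: two well-separated 2-D clusters
np.random.seed(0)
X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5.0])
y = np.array([0] * 50 + [1] * 50)
idx = condense_prototypes(X, y)
print(len(idx), "prototypes kept out of", len(X))
```

On separable data such as this, the condensed set is typically a small fraction of the original training set, which is what makes the subsequent K-NN search faster; the paper's own reduction method and its candidate-pruning preprocessing step may differ substantially from this sketch.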