Nearest-neighbor-based classifiers have two main drawbacks: high CPU cost when the training set contains many samples, and extreme sensitivity to outliers. Several attempts to overcome these drawbacks have been proposed in the pattern recognition field, aimed at selecting or generating an adequate subset of prototypes from the training set. This paper addresses the comparison of methods for prototype reduction; several methods for finding a good set of prototypes are evaluated: particle swarm optimization, a clustering algorithm, a genetic algorithm, and learning prototypes and distances. Experiments are carried out on several classification problems in order to evaluate the considered approaches in conjunction with different nearest-neighbor-based classifiers: the 1-nearest-neighbor classifier, the 5-nearest-neighbor classifier, a nearest-feature-plane-based classifier, and a nearest-feature-line-based classifier. Moreover, we propose a method for creating an ensemble of classifiers, where each classifier is trained with a different reduced set of prototypes. Since these prototypes are generated using a supervised optimization function, we call our ensemble ''supervised bagging''. The training phase repeats the prototype generation N times; the scores obtained by classifying a test pattern with each set of prototypes are then combined by the ''vote rule''. The reported results show the superiority of this method over the well-known bagging approach for building ensembles of classifiers. Our best results are obtained when the 1-nearest-neighbor classifier is coupled with a ''supervised'' bagging ensemble of learning prototypes and distances. As expected, the prototype-reduction approaches designed for the 1-nearest-neighbor classifier do not work as well when other classifiers are tested. In our experiments, the best prototype-reduction method across different classifiers is the genetic algorithm.
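The supervised bagging scheme described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the random subsampling in `reduce_prototypes` is a hypothetical stand-in for the supervised prototype-generation step (e.g. the genetic algorithm or particle swarm optimization), and all function names are illustrative.

```python
import random
from collections import Counter

def nn1_predict(prototypes, x):
    """Classify x with the 1-nearest-neighbor rule over (point, label) prototypes."""
    return min(prototypes, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))[1]

def reduce_prototypes(train, rng, k):
    """Hypothetical stand-in for a supervised prototype-reduction step
    (the paper uses optimizers such as a GA or PSO); here we simply
    draw k training samples at random."""
    return rng.sample(train, k)

def supervised_bagging_predict(train, x, n_sets=5, k=6, seed=0):
    """Generate n_sets reduced prototype sets, classify x with each,
    and combine the individual decisions by the vote rule."""
    rng = random.Random(seed)
    votes = [nn1_predict(reduce_prototypes(train, rng, k), x) for _ in range(n_sets)]
    return Counter(votes).most_common(1)[0][0]

# Toy two-class data: class 0 near the origin, class 1 near (5, 5).
train = [((0.0, 0.1), 0), ((0.2, 0.0), 0), ((0.1, 0.3), 0), ((0.3, 0.2), 0),
         ((5.0, 5.1), 1), ((5.2, 4.9), 1), ((4.8, 5.0), 1), ((5.1, 5.2), 1)]
print(supervised_bagging_predict(train, (4.9, 5.0)))  # → 1
```

Each of the N reduced sets is built independently, so the ensemble members disagree only through their prototype sets; unlike standard bagging, the diversity comes from the (here simulated) supervised prototype generation rather than from bootstrap resampling of the raw training data.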