Using genetic algorithms to improve pattern classification performance. NIPS-3: Proceedings of the 1990 Conference on Advances in Neural Information Processing Systems 3.
Vector quantization and signal compression
Editing for the k-nearest neighbors rule by a genetic algorithm. Pattern Recognition Letters, special issue on genetic algorithms.
Two soft relatives of learning vector quantization. Neural Networks.
Learning vector quantization with training count (LVQTC). Neural Networks.
Self-Organizing Maps
Fuzzy Models and Algorithms for Pattern Recognition and Image Processing
A Bootstrap Technique for Nearest Neighbor Classifier Design. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Finding Prototypes For Nearest Neighbor Classifiers. IEEE Transactions on Computers.
Multiple-prototype classifier design. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews.
Nearest prototype classification: clustering, genetic algorithms, or random search? IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews.
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Will the real iris data please stand up? IEEE Transactions on Fuzzy Systems.
Repairs to GLVQ: a new family of competitive learning schemes. IEEE Transactions on Neural Networks.
Presupervised and post-supervised prototype classifier design. IEEE Transactions on Neural Networks.
An Empirical Evaluation of Common Vector Based Classification Methods and Some Extensions. SSPR & SPR '08: Proceedings of the 2008 Joint IAPR International Workshop on Structural, Syntactic, and Statistical Pattern Recognition.
Comparisons made in two studies of 21 methods for finding prototypes on which to base the nearest prototype classifier are discussed. The methods are compared by three criteria: (i) whether they select or extract point prototypes; (ii) whether they employ pre- or post-supervision; and (iii) whether they specify the number of prototypes a priori or obtain this number "automatically". Numerical experiments with 5 data sets suggest that pre-supervised extraction methods offer the casual user a better chance of success than post-supervised selection schemes. Our calculations also suggest that methods which find the "best" number of prototypes "automatically" are not superior to user specification of this parameter.
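To make the abstract's terms concrete, below is a minimal sketch of a nearest prototype classifier using one pre-supervised extraction scheme: each class's mean is extracted as its single prototype. This is a deliberately simple stand-in for illustration only; the function names and the choice of class means are ours, not one of the 21 methods compared in the studies.

```python
import numpy as np

def extract_prototypes(X, y):
    """Pre-supervised extraction: use each class label during training and
    extract one point prototype per class (here, simply the class mean)."""
    labels = np.unique(y)
    protos = np.array([X[y == c].mean(axis=0) for c in labels])
    return protos, labels

def nearest_prototype_classify(X, protos, labels):
    """Nearest prototype rule: assign each sample the label of its closest
    prototype under Euclidean distance."""
    # Pairwise distances: shape (n_samples, n_prototypes).
    d = np.linalg.norm(X[:, None, :] - protos[None, :, :], axis=2)
    return labels[np.argmin(d, axis=1)]

# Toy two-class data: points clustered near (0, 0) and near (5, 5).
X = np.array([[0.0, 0.1], [0.2, -0.1], [5.0, 5.1], [4.9, 5.0]])
y = np.array([0, 0, 1, 1])

protos, labels = extract_prototypes(X, y)
pred = nearest_prototype_classify(X, protos, labels)
```

Here the number of prototypes is fixed a priori at one per class; the "automatic" methods discussed in the abstract would instead search for this number as part of the design.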