Prototype classifiers are pattern classifiers in which a number of prototypes are designed for each class so that they act as representatives of the patterns of that class. Prototype classifiers are among the simplest and best-performing approaches to classification problems. However, they require careful positioning of the prototypes to capture the distribution of each class region and/or to define the class boundaries. Standard methods, such as learning vector quantization (LVQ), are sensitive to the initial choice of the number and locations of the prototypes and to the learning rate. In this article, a new prototype classification method is proposed: self-generating prototypes (SGP). The main advantage of this method is that both the number of prototypes and their locations are learned from the training set with little human intervention. The proposed method is compared with other prototype classifiers, such as LVQ, the self-generating neural tree (SGNT), and K-nearest neighbor (K-NN), as well as with Gaussian mixture model (GMM) classifiers. In our experiments, SGP achieved the best performance on several measures, such as training speed and test (classification) speed. In number of prototypes and test classification accuracy, it was considerably better than the other methods and about equal on average to the GMM classifiers. We also evaluated the SGP method on the well-known STATLOG benchmark, where it beat all 21 other methods (prototype and non-prototype) in classification accuracy.
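The abstract does not spell out SGP's prototype-placement procedure, but the decision rule shared by prototype classifiers (including LVQ and SGP) is nearest-prototype assignment: a sample receives the label of the closest prototype. A minimal NumPy sketch of that rule follows; the prototype positions and data points are purely illustrative, not from the paper.

```python
import numpy as np

def nearest_prototype_predict(X, prototypes, labels):
    """Classify each row of X by the label of its nearest prototype.

    X:          (n, d) array of samples
    prototypes: (m, d) array of prototype vectors (two per class here)
    labels:     (m,) array of class labels, one per prototype
    """
    # Euclidean distance from every sample to every prototype -> (n, m)
    dists = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
    # Label of the closest prototype wins
    return labels[np.argmin(dists, axis=1)]

# Hand-placed prototypes for two well-separated classes (illustrative only)
prototypes = np.array([[0.0, 0.0], [1.0, 0.0],   # class 0
                       [5.0, 5.0], [6.0, 5.0]])  # class 1
labels = np.array([0, 0, 1, 1])

X = np.array([[0.2, 0.1],   # near the class-0 prototypes
              [5.5, 4.9]])  # near the class-1 prototypes
print(nearest_prototype_predict(X, prototypes, labels))  # [0 1]
```

Methods such as LVQ and SGP differ only in how the `prototypes` array is chosen and adapted during training; the classification step above is the same, which is why having fewer prototypes directly translates into faster test-time classification.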