Some Notes on Twenty One (21) Nearest Prototype Classifiers
Proceedings of the Joint IAPR International Workshops on Advances in Pattern Recognition
We extend the nearest prototype classifier to a generalized nearest prototype classifier (GNPC). The GNPC uses "soft" labeling of the prototypes in the classes, thereby encompassing a variety of classifiers. Based on how the prototypes are found, we distinguish between presupervised and post-supervised GNPC designs. We derive the conditions for optimality of two designs in which the prototypes represent: 1) the components of the class-conditional mixture densities (presupervised design); or 2) the components of the unconditional mixture density (post-supervised design). An artificial data set and the "satimage" data set from the ELENA database are used to study the two approaches experimentally. A radial basis function network is used as a representative of each GNPC type. Neither the theoretical nor the experimental results indicate clear reasons to prefer one of the approaches. The post-supervised GNPC design tends to be more robust but less accurate than the presupervised one.
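To make the idea concrete, the following is a minimal sketch of a GNPC decision rule with soft-labeled prototypes. The Gaussian (radial basis function) kernel and the toy prototypes and label matrix are illustrative assumptions, not the paper's exact design: a test point accumulates kernel-weighted support from every prototype's soft class memberships and is assigned to the class with the largest total support.

```python
import numpy as np

def gnpc_predict(x, prototypes, soft_labels, gamma=1.0):
    """Classify x by accumulating kernel-weighted soft prototype labels.

    prototypes  : (p, d) array of prototype locations
    soft_labels : (p, c) array; row j holds prototype j's soft
                  membership in each of the c classes
    gamma       : width parameter of the Gaussian kernel (assumed here)
    """
    # Similarity of x to each prototype via a radial basis function
    d2 = np.sum((prototypes - x) ** 2, axis=1)
    k = np.exp(-gamma * d2)            # (p,) prototype activations
    support = k @ soft_labels          # (c,) total support per class
    return int(np.argmax(support))

# Toy example: two soft-labeled prototypes per class in 2-D
prototypes = np.array([[0.0, 0.0], [1.0, 0.0], [4.0, 4.0], [5.0, 4.0]])
soft_labels = np.array([[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]])

print(gnpc_predict(np.array([0.5, 0.1]), prototypes, soft_labels))  # 0
print(gnpc_predict(np.array([4.5, 4.0]), prototypes, soft_labels))  # 1
```

With crisp (0/1) labels and one prototype per class this reduces to the ordinary nearest prototype classifier; the soft labels are what allow the same rule to cover mixture-density and RBF-style designs.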