IEEE Transactions on Pattern Analysis and Machine Intelligence
This paper presents a comprehensive report on the use of order statistics (OS) for parametric pattern recognition (PR) for various distributions within the exponential family. Although the field of parametric PR has been thoroughly studied for over five decades, the use of the OS of the underlying distributions to achieve classification has not previously been reported. The pioneering work on using OS for classification was presented earlier for the uniform distribution and for some members of the exponential family, where it was shown that optimal PR can be achieved in a counter-intuitive manner, diametrically opposed to the Bayesian paradigm, i.e., by comparing the test sample to a few samples distant from the mean rather than to the mean itself. Apart from the results for the Gaussian and doubly exponential distributions, which are merely cited here, the new results include the Rayleigh, Gamma, and certain Beta distributions. The new scheme, referred to as classification by moments of order statistics (CMOS), attains the optimal Bayes' accuracy for symmetric distributions and is otherwise very close to the Bayes' bound, as has been shown both theoretically and by rigorous experimental testing. The results also provide a theoretical foundation for the families of border identification (BI) algorithms reported in the literature.
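To make the counter-intuitive idea concrete, the following is a minimal, hypothetical sketch of the CMOS rule for two one-dimensional Gaussian classes using n = 2 order statistics. It relies only on the standard fact that for two i.i.d. N(mu, sigma^2) samples, E[max] = mu + sigma/sqrt(pi) and E[min] = mu - sigma/sqrt(pi); the function name and interface are illustrative, not from the paper.

```python
import math

def cmos_classify(x, mu1, mu2, sigma):
    """Sketch of CMOS for two Gaussian classes with means mu1 < mu2
    and common standard deviation sigma, using n = 2 order statistics.

    Instead of comparing x to the class means (the Bayesian paradigm),
    CMOS compares x to the two "inward-facing" OS moments, i.e. the
    expected order statistics that lie away from each mean, toward the
    class boundary.
    """
    # For two i.i.d. N(mu, sigma^2) samples:
    #   E[max] = mu + sigma / sqrt(pi),  E[min] = mu - sigma / sqrt(pi)
    b1 = mu1 + sigma / math.sqrt(math.pi)  # upper OS moment of class 1
    b2 = mu2 - sigma / math.sqrt(math.pi)  # lower OS moment of class 2
    # Assign x to whichever OS moment it is nearer to.
    return 0 if abs(x - b1) <= abs(x - b2) else 1
```

Because the two OS moments are displaced symmetrically from their means, the induced decision boundary is the midpoint (mu1 + mu2) / 2, which for equal-variance symmetric classes is exactly the Bayes boundary — consistent with the abstract's claim that CMOS attains the Bayes' bound in the symmetric case.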