Selecting feature lines in generalized dissimilarity representations for pattern recognition
Digital Signal Processing
Although, under representational restrictions, the nearest feature rules and dissimilarity-based classifiers are feasible alternatives to the nearest neighbor method, individually they may not be sufficiently powerful when only a very small set of prototypes can be used, e.g. when handling larger prototype sets is computationally expensive. In this paper, we show that combining both strategies, taking advantage of their individual properties, provides an improvement, particularly for correlated data sets. The combined strategy consists of deriving an enriched (generalized) dissimilarity representation by using the nearest feature rules, namely feature lines and feature planes. On top of that enriched representation, Bayesian classifiers can be constructed in order to obtain good generalization.
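The core of the enriched representation can be illustrated with a small sketch. The snippet below (an illustrative Python implementation, not the authors' code; function names are our own) computes the point-to-feature-line distance used by the nearest feature line rule, and builds a generalized dissimilarity representation in which each object is described by its distances to all feature lines spanned by pairs of prototypes:

```python
import numpy as np
from itertools import combinations

def feature_line_distance(q, x1, x2):
    """Distance from query q to the feature line through prototypes x1 and x2."""
    d = x2 - x1
    t = np.dot(q - x1, d) / np.dot(d, d)  # projection parameter along the line
    p = x1 + t * d                        # orthogonal projection of q onto the line
    return np.linalg.norm(q - p)

def generalized_dissimilarity(queries, prototypes):
    """Enriched representation: each query is described by its distances to
    every feature line spanned by a pair of prototypes."""
    pairs = list(combinations(range(len(prototypes)), 2))
    D = np.empty((len(queries), len(pairs)))
    for i, q in enumerate(queries):
        for j, (a, b) in enumerate(pairs):
            D[i, j] = feature_line_distance(q, prototypes[a], prototypes[b])
    return D
```

With n prototypes this yields n(n-1)/2 feature-line dissimilarities per object, so even a very small prototype set produces a rich representation on which a standard Bayesian (e.g. normal-density-based) classifier can then be trained.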