Classification with Nonmetric Distances: Image Retrieval and Class Representation
IEEE Transactions on Pattern Analysis and Machine Intelligence
A generalized kernel approach to dissimilarity-based classification
The Journal of Machine Learning Research
Feature Discovery in Non-Metric Pairwise Data
The Journal of Machine Learning Research
Feature Space Interpretation of SVMs with Indefinite Kernels
IEEE Transactions on Pattern Analysis and Machine Intelligence
The Dissimilarity Representation for Pattern Recognition: Foundations And Applications (Machine Perception and Artificial Intelligence)
Prototype selection for dissimilarity-based classifiers
Pattern Recognition
On Euclidean Corrections for Non-Euclidean Dissimilarities
SSPR & SPR '08 Proceedings of the 2008 Joint IAPR International Workshop on Structural, Syntactic, and Statistical Pattern Recognition
Weighted locally linear embedding for dimension reduction
Pattern Recognition
On Fuzzy vs. Metric Similarity Search in Complex Databases
FQAS '09 Proceedings of the 8th International Conference on Flexible Query Answering Systems
A Combine-Correct-Combine Scheme for Optimizing Dissimilarity-Based Classifiers
CIARP '09 Proceedings of the 14th Iberoamerican Conference on Pattern Recognition: Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications
Non-Euclidean dissimilarities: causes and informativeness
SSPR&SPR '10 Proceedings of the 2010 Joint IAPR International Conference on Structural, Syntactic, and Statistical Pattern Recognition
Regularising the Ricci flow embedding
SSPR&SPR '10 Proceedings of the 2010 Joint IAPR International Conference on Structural, Syntactic, and Statistical Pattern Recognition
Spherical embedding and classification
SSPR&SPR '10 Proceedings of the 2010 Joint IAPR International Conference on Structural, Syntactic, and Statistical Pattern Recognition
Rectifying non-Euclidean similarity data through tangent space reprojection
IbPRIA '11 Proceedings of the 5th Iberian Conference on Pattern Recognition and Image Analysis
Determining the cause of negative dissimilarity eigenvalues
CAIP '11 Proceedings of the 14th International Conference on Computer Analysis of Images and Patterns - Volume Part I
A new anticorrelation-based spectral clustering formulation
ACIVS '11 Proceedings of the 13th International Conference on Advanced Concepts for Intelligent Vision Systems
MCBR-CDS '11 Proceedings of the Second MICCAI International Conference on Medical Content-Based Retrieval for Clinical Decision Support
Statistical learning algorithms often rely on the Euclidean distance. In practice, however, non-Euclidean or non-metric dissimilarity measures arise when contours, spectra, or shapes are compared by edit distances, or as a consequence of robust object matching [1,2]. It remains an open question whether such measures are advantageous for statistical learning or whether they should be constrained to obey the metric axioms. The k-nearest neighbor (k-NN) rule, as the most natural approach, is widely applied directly to general dissimilarity data. Alternatively, such data can be embedded into suitable representation spaces in which statistical classifiers are constructed [3]. In this paper, we investigate the relation between the non-Euclidean character of dissimilarity data and the classification performance of both the direct k-NN rule and classifiers trained in representation spaces. The evaluation uses a parameterized family of edit distances, in which the parameter values control the strength of the non-Euclidean behavior. We find that the discriminative power of the measure increases with its non-Euclidean and non-metric character until a certain optimum is reached. We conclude that statistical classifiers perform well, and that the optimal parameter values characterize a measure that is non-Euclidean and somewhat non-metric.
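A standard way to quantify how non-Euclidean a dissimilarity matrix is — the property the abstract studies — uses the classical-scaling result that a symmetric dissimilarity matrix D admits a Euclidean embedding if and only if its double-centered matrix is positive semi-definite. The sketch below (NumPy; the example matrices are hypothetical and not from the paper) computes the resulting negative eigenfraction:

```python
import numpy as np

def negative_eigenfraction(D):
    """Quantify the non-Euclidean character of a symmetric dissimilarity matrix D.

    D is Euclidean iff B = -1/2 * J D^2 J is positive semi-definite,
    where J = I - (1/n) 11^T is the centering matrix. The negative
    eigenfraction (NEF) is the mass of B's negative eigenvalues relative
    to the whole spectrum; NEF = 0 means a Euclidean embedding exists.
    """
    D = np.asarray(D, dtype=float)
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                  # Gram-like matrix
    eig = np.linalg.eigvalsh(B)
    return np.abs(eig[eig < 0]).sum() / np.abs(eig).sum()

# Euclidean distances between points on a line: NEF is (numerically) zero.
pts = np.array([[0.0], [1.0], [3.0]])
D_euc = np.abs(pts - pts.T)
print(round(negative_eigenfraction(D_euc), 6))   # -> 0.0

# A matrix violating the triangle inequality (non-metric, hence
# non-Euclidean): NEF is clearly positive.
D_nm = np.array([[ 0.0, 1.0, 10.0],
                 [ 1.0, 0.0,  1.0],
                 [10.0, 1.0,  0.0]])
print(round(negative_eigenfraction(D_nm), 2))    # -> 0.24
```

In the same spirit, a family of edit distances with a strength parameter could be swept through this diagnostic to track how the negative eigenfraction grows with the parameter, alongside classification accuracy.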