The minimum volume ellipsoid metric
Proceedings of the 29th DAGM conference on Pattern recognition
We first investigate the combined effect of data complexity, the curse of dimensionality, and the definition of the Euclidean distance on the distance measure between points. Then, building on the concepts underlying manifold learning algorithms and the minimum volume ellipsoid metric, we design an algorithm that learns a local metric on the lower-dimensional manifold on which the data lies. Classification experiments on standard benchmark data sets show very promising results compared with state-of-the-art algorithms, and consistent improvements over the Euclidean distance in the context of query-based learning.
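To make the idea of a learned local metric concrete, the following is a minimal sketch (not the paper's actual algorithm) of the general family it belongs to: at a query point, estimate a Mahalanobis-style metric from the covariance of the query's Euclidean nearest neighbors, so that distances are stretched in directions orthogonal to the local data spread. The function names, the neighborhood size `k`, and the regularizer `eps` are illustrative choices, not taken from the paper.

```python
import numpy as np

def mahalanobis_dist(x, y, M):
    """Distance under a metric matrix M: sqrt((x - y)^T M (x - y))."""
    d = x - y
    return float(np.sqrt(d @ M @ d))

def local_metric(X, query, k=10, eps=1e-3):
    """Illustrative local metric at `query`: the regularized inverse
    covariance of the k Euclidean nearest neighbors of `query` in X.

    Directions with little local variance (roughly, directions off the
    local manifold) get inflated distances; directions along the local
    data spread are compressed.
    """
    d2 = np.sum((X - query) ** 2, axis=1)      # squared Euclidean distances
    nbrs = X[np.argsort(d2)[:k]]               # k nearest neighbors
    cov = np.cov(nbrs, rowvar=False)           # local covariance estimate
    cov += eps * np.eye(X.shape[1])            # regularize for invertibility
    return np.linalg.inv(cov)
```

With `M = np.eye(dim)` the distance reduces to the ordinary Euclidean distance, which is the baseline the abstract compares against; a query-based classifier would instead recompute `local_metric` at each query point before ranking neighbors.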