Weighted Mahalanobis Distance Kernels for Support Vector Machines
IEEE Transactions on Neural Networks
This paper introduces a novel classification approach that improves the performance of support vector machines (SVMs) by learning a distance metric. The metric is a Mahalanobis metric trained beforehand so that examples from different classes are separated by a large margin. The learned metric is then used to define a kernel function for SVM classification. In this context, the metric can be viewed as a linear transformation of the original inputs applied before an SVM classifier that uses Euclidean distances; the transformation increases the separability of the classes in the space where classification is performed. Experiments on several data sets demonstrate significant improvements in classification accuracy.
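The equivalence described in the abstract can be sketched in a few lines: if the learned Mahalanobis matrix factors as M = LᵀL, then transforming inputs with x → Lx and applying a standard Euclidean RBF kernel yields a kernel based on the Mahalanobis distance d_M(x, x')² = (x − x')ᵀM(x − x'). The sketch below is illustrative only: the matrix `L` is hand-picked rather than trained with the paper's large-margin procedure, and the toy data set is an assumption.

```python
import numpy as np
from sklearn.svm import SVC

# Toy two-class data: the label depends only on the first feature.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = (X[:, 0] > 0).astype(int)

# Hypothetical metric factor L, so that M = L^T L. Here it simply
# up-weights the discriminative feature; in the paper, M would be
# learned beforehand to separate classes with a large margin.
L = np.array([[2.0, 0.0],
              [0.0, 0.5]])

# Linear transformation of the inputs before the Euclidean-distance SVM:
# an RBF kernel on Z is a weighted-Mahalanobis-distance kernel on X.
Z = X @ L.T

clf = SVC(kernel="rbf", gamma=1.0).fit(Z, y)
print(clf.score(Z, y))
```

Because the transformation stretches the class-relevant direction, the classes become easier to separate in the transformed space, which is the effect the paper exploits.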