The goal of this article is to develop a framework for large margin classification in metric spaces. We seek a generalization of linear decision functions to metric spaces, together with a corresponding notion of margin, such that the decision function separates the training points with a large margin. It turns out that when Lipschitz functions are used as decision functions, the inverse of the Lipschitz constant can be interpreted as the size of the margin. To obtain a clean mathematical setup, we isometrically embed the given metric space into a Banach space and the space of Lipschitz functions into its dual space. To analyze the resulting algorithm, we prove several representer theorems, which state that there always exist solutions of the Lipschitz classifier that can be expressed in terms of distance functions to training points. We provide generalization bounds for Lipschitz classifiers in terms of the Rademacher complexities of certain Lipschitz function classes. The generality of our approach is illustrated by the fact that several well-known algorithms are special cases of the Lipschitz classifier, among them the support vector machine, the linear programming machine, and the 1-nearest neighbor classifier.
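As a concrete illustration of the last point, here is a minimal sketch (not the paper's algorithm itself) showing how the 1-nearest neighbor rule arises from a Lipschitz decision function built out of distance functions to training points. The decision function f(x) = min_i d(x, x_i^-) - min_j d(x, x_j^+) is a difference of two 1-Lipschitz functions (a pointwise minimum of distance functions is 1-Lipschitz), hence 2-Lipschitz, and its sign reproduces the 1-NN prediction. All names below (`lipschitz_decision`, `classify`, the Euclidean metric) are our own choices for the sketch.

```python
import numpy as np

def lipschitz_decision(x, X_pos, X_neg, metric=None):
    """Decision function expressed via distances to training points:
    f(x) = min over negatives of d(x, x_i) - min over positives of d(x, x_j).
    Each min is 1-Lipschitz, so f is 2-Lipschitz in x."""
    if metric is None:
        # Default to the Euclidean metric for the sketch.
        metric = lambda a, b: float(np.linalg.norm(np.asarray(a) - np.asarray(b)))
    d_neg = min(metric(x, xi) for xi in X_neg)  # distance to nearest negative
    d_pos = min(metric(x, xj) for xj in X_pos)  # distance to nearest positive
    return d_neg - d_pos  # positive value means x is closer to the positive class

def classify(x, X_pos, X_neg, metric=None):
    """sign of f(x); this is exactly the 1-nearest-neighbor rule."""
    return 1 if lipschitz_decision(x, X_pos, X_neg, metric) >= 0 else -1
```

The same template covers the other special cases mentioned above: restricting the function class or changing the regularizer recovers, e.g., the linear programming machine, while the SVM corresponds to a particular embedding into a Hilbert space.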