In order to apply the maximum margin method in arbitrary metric spaces, we suggest embedding the metric space into a Banach or Hilbert space and performing linear classification in that space. We propose several embeddings and recall that an isometric embedding into a Banach space is always possible, while an isometric embedding into a Hilbert space is possible only for certain metric spaces. As a result, we obtain a general maximum margin classification algorithm for arbitrary metric spaces (whose solution is approximated by an algorithm of Graepel et al. (International Conference on Artificial Neural Networks 1999, pp. 304-309)). Interestingly, when applied to a metric that can be embedded into a Hilbert space, the embedding approach yields the support vector machine (SVM) algorithm, which emphasizes that its solution depends on the metric and not on the kernel. Furthermore, we give upper bounds on the capacity of the function classes corresponding to both embeddings in terms of Rademacher averages. Finally, we compare the capacities of these function classes directly.
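The embed-then-classify idea can be sketched on a finite sample: map each point to its vector of distances to the sample points (a finite version of the isometric embedding x ↦ d(x, ·) into a Banach space of functions), then fit a linear classifier on the embedded vectors. The metric, the toy data, and the use of a plain perceptron are all illustrative assumptions: the perceptron finds *a* separating hyperplane, not the maximum-margin one, and it is not the approximation algorithm of Graepel et al. cited above.

```python
import numpy as np

# Toy metric space: points on the real line with metric d(a, b) = |a - b|.
# Any metric space would do; this choice is purely illustrative.
points = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
labels = np.array([-1, -1, -1, 1, 1, 1])

def distance_embed(x, anchors):
    """Map a point to its vector of distances to the anchor points,
    a finite-sample analogue of the embedding x -> d(x, .)."""
    return np.abs(x - anchors)

# Each row is the embedded representation of one training point.
Phi = np.array([distance_embed(x, points) for x in points])

# A plain perceptron stands in for the linear classifier: it converges
# to some separating hyperplane whenever the embedded data is separable.
w = np.zeros(Phi.shape[1])
b = 0.0
for _ in range(500):
    mistakes = 0
    for phi, y in zip(Phi, labels):
        if y * (w @ phi + b) <= 0:  # misclassified (or on the boundary)
            w += y * phi
            b += y
            mistakes += 1
    if mistakes == 0:  # all training points correctly classified
        break

predictions = np.sign(Phi @ w + b)
```

In this one-dimensional example the embedded data is linearly separable (e.g. the weight vector (1, 0, 0, 0, 0, -1) with zero bias realizes the decision function 2x - 5 on [0, 5]), so the loop terminates with zero training mistakes.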