Advances in Large Margin Classifiers
Learning to Classify Text Using Support Vector Machines: Methods, Theory and Algorithms
Training Invariant Support Vector Machines
Machine Learning
A Support Vector Machine with a Hybrid Kernel and Minimal Vapnik-Chervonenkis Dimension
IEEE Transactions on Knowledge and Data Engineering
Column-generation boosting methods for mixture of kernels
Proceedings of the tenth ACM SIGKDD international conference on Knowledge discovery and data mining
Fast String Kernels using Inexact Matching for Protein Sequences
The Journal of Machine Learning Research
Kernel methods for predicting protein--protein interactions
Bioinformatics
A New Multiple Kernel Approach for Visual Concept Learning
MMM '09 Proceedings of the 15th International Multimedia Modeling Conference on Advances in Multimedia Modeling
Kernel based support vector machine via semidefinite programming: Application to medical diagnosis
Computers and Operations Research
Multiple Kernel Learning Algorithms
The Journal of Machine Learning Research
Despite the broad applicability of kernel methods, there is no systematic way to choose an appropriate kernel function or its optimal parameters; as a result, the performance of support vector machines (SVMs) cannot be easily optimized. To address this problem, a general procedure is proposed for constructing nonparametric and efficient kernels. It rests on an empirical and theoretical connection between positive semidefinite matrices and certain metric-space properties. The Gaussian kernel turns out to be a special case of the new framework. Comprehensive experiments on eleven real-world datasets and seven synthetic datasets demonstrate a clear advantage for the proposed kernels. However, several important problems remain open.
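The abstract does not describe the construction itself, but the requirement it builds on, namely that a valid kernel matrix must be symmetric positive semidefinite (Mercer's condition), can be illustrated with a minimal sketch for the Gaussian kernel, which the framework contains as a special case. Function names here are illustrative, not from the paper.

```python
import numpy as np

def gaussian_kernel_matrix(X, sigma=1.0):
    """Gaussian (RBF) kernel: K[i, j] = exp(-||x_i - x_j||^2 / (2 * sigma^2))."""
    sq_norms = np.sum(X**2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2 * X @ X.T
    return np.exp(-sq_dists / (2 * sigma**2))

def is_positive_semidefinite(K, tol=1e-10):
    """Check symmetry and non-negativity of eigenvalues (up to numerical tolerance)."""
    return bool(np.allclose(K, K.T) and np.linalg.eigvalsh(K).min() >= -tol)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
K = gaussian_kernel_matrix(X, sigma=1.5)
print(is_positive_semidefinite(K))  # the Gaussian kernel always yields a PSD matrix
```

The same check applies to any candidate kernel: if the matrix it produces on a sample fails the PSD test, it is not a valid Mercer kernel on that data.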