In recent years, several methods have been proposed for combining multiple kernels through a weighted linear sum. The individual kernels may draw on information from different sources or may correspond to different notions of similarity on the same source. Such methods introduce new regularization parameters, beyond those of the canonical support vector machine formulation, that affect solution quality; in this work, we propose optimizing them with response surface methodology on cross-validation data. On several bioinformatics and digit recognition benchmark data sets, we compare standard multiple kernel learning with our proposed regularized variant in terms of accuracy, support vector count, and number of kernels selected. The regularized variant achieves statistically similar or higher accuracy while using fewer kernel functions and/or support vectors through suitable regularization; it also allows better knowledge extraction, because unnecessary kernels are pruned and the kernels that remain reflect the properties of the problem at hand.
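The weighted linear sum of kernels described above can be sketched as follows. This is a minimal illustration, not the authors' actual formulation: the base kernels (linear and Gaussian), the example weights, and the toy data are all assumptions chosen for demonstration. A conic (non-negative weight) combination of valid kernels is itself a valid kernel, which the sketch checks by verifying positive semi-definiteness of the combined Gram matrix.

```python
import numpy as np

def linear_kernel(X, Y):
    # K(x, y) = <x, y>
    return X @ Y.T

def rbf_kernel(X, Y, gamma=1.0):
    # K(x, y) = exp(-gamma * ||x - y||^2)
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def combined_kernel(X, Y, etas, kernels):
    # Weighted linear sum of base kernels; the weights (etas) are
    # assumed non-negative so the result remains a valid kernel.
    return sum(eta * k(X, Y) for eta, k in zip(etas, kernels))

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))

etas = [0.3, 0.7]  # hypothetical kernel weights
K = combined_kernel(X, X, etas, [linear_kernel, rbf_kernel])

# The combined Gram matrix is symmetric and positive semi-definite
# (up to numerical tolerance), so it can be plugged into any
# kernel machine, e.g. an SVM with a precomputed kernel.
print(K.shape)
print(np.linalg.eigvalsh(K).min() >= -1e-8)
```

In multiple kernel learning the weights are learned jointly with the classifier rather than fixed in advance; this sketch only shows the combination step that those methods optimize over.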