Consider the problem of learning a kernel for use in SVM classification. We bound the estimation error of a large-margin classifier when the kernel, relative to which this margin is defined, is chosen from a family of kernels based on the training sample. For a kernel family with pseudodimension $d_\phi$, we present a bound of $\sqrt{\tilde{\mathcal{O}}(d_\phi + 1/\gamma^2)/n}$ on the estimation error for SVMs with margin $\gamma$. This is the first bound in which the relation between the margin term and the family-of-kernels term is additive rather than multiplicative. The pseudodimension of families of linear combinations of base kernels is the number of base kernels. Unlike in previous (multiplicative) bounds, there is no non-negativity requirement on the coefficients of the linear combinations. We also give simple bounds on the pseudodimension of families of Gaussian kernels.
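To make the kernel-family setting concrete, here is a minimal NumPy sketch (all names and parameter values are illustrative, not from the paper): it builds a few Gaussian base kernels on a sample and forms a linear combination whose coefficients are allowed to be negative, as the additive bound permits. Since non-negativity is not required, the sketch also checks whether the chosen combination happens to remain a valid (positive semidefinite) kernel.

```python
import numpy as np

def gaussian_kernel(X, gamma):
    # RBF kernel matrix: exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))  # 20 sample points in R^3

# A family of k base kernels; for linear combinations of base kernels,
# the pseudodimension is (at most) the number of base kernels k.
widths = [0.1, 1.0, 10.0]
base = [gaussian_kernel(X, g) for g in widths]

# Coefficients chosen from the training sample in practice; here fixed.
# Note the negative coefficient: no non-negativity requirement applies.
mu = np.array([1.5, -0.3, 0.2])
K = sum(m * B for m, B in zip(mu, base))

# With negative coefficients the combination is not automatically a
# valid kernel, so we inspect its smallest eigenvalue.
eigmin = np.linalg.eigvalsh(K).min()
print("combined kernel matrix PSD on this sample:", eigmin >= -1e-8)
```

This only verifies positive semidefiniteness on the given sample; a learned combination with negative coefficients must be validated as a kernel before use in an SVM.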