The success of kernel-based learning methods depends on the choice of kernel. Recently, kernel learning methods have been proposed that use data to select the most appropriate kernel, usually by combining a set of base kernels. We introduce a new algorithm for kernel learning that combines a continuous set of base kernels, without the common step of discretizing the space of base kernels, and we demonstrate that it achieves state-of-the-art performance across a variety of real-world datasets. We also explicitly demonstrate the importance of combining the right dictionary of kernels, which is problematic for methods that combine a finite set of base kernels chosen a priori. Our method is not the first to work with continuously parameterized kernels; however, by adopting a two-stage kernel learning approach, it requires substantially less computation than previous such approaches, and is therefore more amenable to multi-dimensional parameterizations of base kernels, as we demonstrate.
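To make the contrast concrete, the conventional baseline the abstract alludes to — a finite dictionary of base kernels fixed a priori, combined in two stages (first score each kernel against the labels, then form a weighted sum) — can be sketched as follows. This is an illustrative sketch only, not the paper's algorithm: the Gaussian base kernels, the centered-alignment weighting, and the bandwidth grid are all assumptions introduced here for the example.

```python
import numpy as np

def gaussian_kernel(X, gamma):
    # RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def centered_alignment(K, Ky):
    # Centered kernel alignment between K and the label kernel Ky = y y^T
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    Kc, Kyc = H @ K @ H, H @ Ky @ H
    return np.sum(Kc * Kyc) / (np.linalg.norm(Kc) * np.linalg.norm(Kyc))

# Toy binary-classification data (hypothetical, for illustration only)
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
y = np.sign(X[:, 0] + 0.1 * rng.normal(size=40))
Ky = np.outer(y, y)

# Stage 1: score each kernel in a finite dictionary chosen a priori.
# Note this grid must be guessed in advance -- the discretization step
# that a continuously parameterized method avoids.
gammas = [0.1, 1.0, 10.0]
weights = np.array([centered_alignment(gaussian_kernel(X, g), Ky)
                    for g in gammas])
weights = np.clip(weights, 0.0, None)
weights /= weights.sum()

# Stage 2: combine the base kernels with the learned weights;
# K would then be handed to a standard kernel method such as an SVM.
K = sum(w * gaussian_kernel(X, g) for w, g in zip(weights, gammas))
```

If none of the hand-picked bandwidths suits the data, every kernel in the dictionary scores poorly and the combination cannot recover — which is the failure mode that motivates searching over a continuous family of base kernels instead.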