Kernel methods have been applied successfully in many applications. The kernel matrix plays an important role in kernel-based learning methods, but the "ideal" kernel matrix is usually unknown in practice and must be estimated. In this paper, we propose to learn the "ideal" kernel matrix (called the optimal neighborhood kernel matrix) directly from a pre-specified kernel matrix for improved classification performance. We assume that the pre-specified kernel matrix generated for the specific application is a noisy observation of the ideal one. The resulting optimal neighborhood kernel matrix is shown to be the sum of the pre-specified kernel matrix and a rank-one matrix. We formulate the problem of learning the optimal neighborhood kernel as a constrained quartic problem and propose to solve it with two methods: the level method and constrained gradient descent. Empirical results on several benchmark data sets demonstrate the efficiency and effectiveness of the proposed algorithms.
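The abstract does not give the exact objective, but a minimal sketch can illustrate the two structural ideas it describes: a neighborhood kernel of the form K + v v^T, and a constrained-gradient-descent solver. The squared-Frobenius loss against the label outer product y y^T and the norm-ball constraint on the rank-one factor v below are assumptions chosen purely for illustration; they are not the paper's actual formulation.

    import numpy as np

    def neighborhood_kernel(K, v):
        # Optimal neighborhood kernel: the pre-specified kernel K
        # plus a rank-one correction v v^T.
        return K + np.outer(v, v)

    def learn_rank_one_term(K, y, radius=1.0, lr=1e-3, n_iter=500, seed=0):
        # Projected gradient descent on f(v) = ||(K + v v^T) - y y^T||_F^2
        # subject to ||v||_2 <= radius. Both the loss and the constraint
        # are illustrative assumptions, not the paper's objective.
        n = K.shape[0]
        rng = np.random.default_rng(seed)
        v = 0.01 * rng.standard_normal(n)  # v = 0 is stationary, so start nearby
        target = np.outer(y, y)
        for _ in range(n_iter):
            residual = neighborhood_kernel(K, v) - target  # symmetric matrix R
            grad = 4.0 * residual @ v                      # grad of ||R||_F^2 w.r.t. v is 4 R v
            v = v - lr * grad
            norm = np.linalg.norm(v)
            if norm > radius:                              # project onto the norm ball
                v *= radius / norm
        return v

One convenient property of this parameterization: since v v^T is always positive semidefinite, K + v v^T remains a valid (PSD) kernel whenever K is, so the projection step only needs to control the norm of v rather than enforce positive semidefiniteness of the full matrix.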