For classification with multiple labels, a common approach is to learn a separate classifier for each label. With a kernel-based classifier, there are two natural ways to set up the kernels: select a specific kernel for each label, or use the same kernel for all labels. In this work, we present a unified framework for multi-label multiple kernel learning in which these two approaches arise as two extreme cases. Moreover, our framework allows kernels to be partially shared among multiple labels, enabling flexible degrees of label commonality. We systematically study how sharing kernels among labels affects performance, based on extensive experiments on benchmark data including images and microarray data, and report findings concerning both efficacy and efficiency.
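The idea of interpolating between a fully shared kernel and fully label-specific kernels can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's actual formulation: the kernel weights `mu_shared` and `mu_label` are fixed by hand rather than learned, the sharing degree `alpha` is an assumed interpolation parameter, and a simple kernel ridge fit stands in for the SVM-style learners used in multi-label MKL.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 5))
# Two binary labels in {-1, +1}, each defined by a different feature subset.
Y = np.stack([(X[:, 0] > 0), (X[:, 1] + X[:, 2] > 0)], axis=1).astype(float) * 2 - 1

def rbf(A, gamma):
    """RBF (Gaussian) kernel matrix on the rows of A."""
    sq = ((A[:, None, :] - A[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# A small dictionary of base kernels: one linear, two RBFs of different widths.
base_kernels = [X @ X.T, rbf(X, 0.5), rbf(X, 2.0)]

# alpha = 1.0 recovers one kernel for all labels (fully shared);
# alpha = 0.0 recovers a specific kernel per label (no sharing).
alpha = 0.5
mu_shared = np.array([0.5, 0.3, 0.2])                 # shared kernel weights (assumed, not learned)
mu_label = [np.array([0.2, 0.6, 0.2]),                # per-label kernel weights (assumed)
            np.array([0.6, 0.2, 0.2])]

scores = []
for l in range(Y.shape[1]):
    # Effective kernel for label l: blend of shared and label-specific weights.
    mu = alpha * mu_shared + (1 - alpha) * mu_label[l]
    K = sum(m * Km for m, Km in zip(mu, base_kernels))
    # Kernel ridge fit as a stand-in for the per-label kernel classifier.
    coef = np.linalg.solve(K + 1e-2 * np.eye(len(K)), Y[:, l])
    scores.append(float((np.sign(K @ coef) == Y[:, l]).mean()))
print(scores)  # per-label training accuracies
```

In a real multi-label MKL formulation, the weights `mu_shared` and `mu_label` and the classifier parameters would be optimized jointly; the point of the sketch is only how a single sharing parameter spans the two extreme setups described above.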