The empirical success of kernel-based learning algorithms depends heavily on the kernel function used. Instead of relying on a single fixed kernel, multiple kernel learning (MKL) algorithms learn a combination of different kernel functions in order to obtain a similarity measure that better matches the underlying problem. We study multitask learning (MTL) problems and formulate a novel MTL algorithm that trains coupled but nonidentical MKL models across the tasks. The proposed algorithm is especially useful for tasks whose input and/or output spaces have different characteristics, and it is computationally very efficient. Empirical results on three data sets validate the generalization performance and the efficiency of our approach.
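As a minimal illustration of the combined-kernel idea (a generic sketch, not the authors' algorithm), the snippet below forms a nonnegative weighted combination of two base kernels, the standard parameterization in MKL. The choice of base kernels, the weights, and the RBF width are assumptions made purely for illustration.

```python
import numpy as np

def linear_kernel(X1, X2):
    # Linear kernel: inner products between all pairs of samples.
    return X1 @ X2.T

def rbf_kernel(X1, X2, gamma=1.0):
    # Gaussian (RBF) kernel computed from squared Euclidean distances.
    sq_dists = (np.sum(X1 ** 2, axis=1)[:, None]
                + np.sum(X2 ** 2, axis=1)[None, :]
                - 2.0 * X1 @ X2.T)
    return np.exp(-gamma * sq_dists)

def combined_kernel(X1, X2, weights=(0.5, 0.5), gamma=1.0):
    # Conic (nonnegative weighted) combination of base kernels;
    # in MKL these weights would be learned rather than fixed.
    k_lin = linear_kernel(X1, X2)
    k_rbf = rbf_kernel(X1, X2, gamma)
    return weights[0] * k_lin + weights[1] * k_rbf

# Example: build a combined Gram matrix for a toy data set.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
K = combined_kernel(X, X, weights=(0.3, 0.7))
print(K.shape)  # (5, 5)
```

In an MTL setting along the lines sketched in the abstract, each task would maintain its own kernel weights while a coupling term in the objective keeps the weights of related tasks close to one another.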