Multitask Multiclass Support Vector Machines
ICDMW '11 Proceedings of the 2011 IEEE 11th International Conference on Data Mining Workshops
Multitask learning, i.e., learning multiple related tasks simultaneously, has been shown to outperform learning each task independently. Most approaches to multitask multiclass problems decompose them into multiple multitask binary problems, and thus cannot effectively capture the inherent correlations between classes. Although elegant, traditional multitask support vector machines are restricted by the requirement that all learning tasks share the same set of classes. In this paper, we present an approach to multitask multiclass support vector machines based on the minimization of regularization functionals. We cast multitask multiclass problems as a constrained optimization problem with a quadratic objective function, so our approach can learn multitask multiclass problems directly and effectively. The approach handles two scenarios: label-compatible and label-incompatible multitask learning. The linear multitask learning method generalizes easily to the non-linear case using kernels. A number of experiments, including comparisons with other multitask learning methods, indicate that our approach to multitask multiclass problems is very encouraging.