Manifold regularized multi-task learning

  • Authors:
  • Peipei Yang, Xu-Yao Zhang, Kaizhu Huang, Cheng-Lin Liu

  • Affiliations:
  • National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing, China (all authors)

  • Venue:
  • ICONIP'12: Proceedings of the 19th International Conference on Neural Information Processing - Volume Part III
  • Year:
  • 2012

Abstract

Multi-task learning (MTL) has drawn much attention in machine learning. By training multiple tasks simultaneously, information can be better shared across tasks, which leads to significant performance improvements in many problems. However, most existing methods assume that all tasks are related, or that their relationship follows a simple, pre-specified structure. In this paper, we propose a novel manifold regularized framework for multi-task learning. Instead of assuming a simple relationship among tasks, we propose to learn the task decision functions and a manifold structure from data simultaneously. As the manifold can be arbitrarily complex, we show that our proposed framework contains many recent MTL models, e.g., RegMTL and cCMTL, as special cases. The framework can be solved by alternately learning all tasks and the manifold structure. In particular, learning all tasks with the manifold regularization can be solved as a single-task learning problem, while the manifold structure can be obtained by successive Bregman projections onto a convex feasible set. On both synthetic and real datasets, we show that our method outperforms competitive methods.
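The alternating scheme described in the abstract can be illustrated with a minimal sketch. This is not the authors' exact algorithm: it assumes linear task models, encodes the task manifold as a graph Laplacian over the task weight vectors (a common choice in graph-regularized MTL), and re-estimates the graph with a heat-kernel affinity instead of the paper's Bregman projections. All function names and parameters here are hypothetical.

```python
import numpy as np

def fit_tasks(Xs, ys, L, lam=1.0, gamma=1.0):
    """Step 1: learn all task weights jointly with the task graph fixed.

    Objective (ridge loss plus a Laplacian manifold regularizer):
        sum_t ||X_t w_t - y_t||^2 + lam * ||W||_F^2 + gamma * tr(W^T L W)
    With L fixed, this is a single linear system in the stacked weights,
    i.e. it reduces to one (larger) single-task-style problem.
    """
    T, d = len(Xs), Xs[0].shape[1]
    A = np.zeros((T * d, T * d))
    b = np.zeros(T * d)
    for t in range(T):
        # Per-task data term and ridge penalty on the diagonal block.
        A[t*d:(t+1)*d, t*d:(t+1)*d] = Xs[t].T @ Xs[t] + lam * np.eye(d)
        # Laplacian coupling: gradient of tr(W^T L W) adds L[t, s] * I blocks.
        for s in range(T):
            A[t*d:(t+1)*d, s*d:(s+1)*d] += gamma * L[t, s] * np.eye(d)
        b[t*d:(t+1)*d] = Xs[t].T @ ys[t]
    return np.linalg.solve(A, b).reshape(T, d)

def update_graph(W, sigma=1.0):
    """Step 2: re-estimate the task manifold from the current weights.

    A heat-kernel affinity between task weight vectors stands in for the
    paper's Bregman-projection update of the manifold structure.
    """
    D2 = ((W[:, None, :] - W[None, :, :]) ** 2).sum(-1)
    S = np.exp(-D2 / (2.0 * sigma ** 2))
    np.fill_diagonal(S, 0.0)
    return np.diag(S.sum(axis=1)) - S  # graph Laplacian

def manifold_mtl(Xs, ys, iters=10):
    """Alternate between learning tasks and learning the task manifold."""
    T = len(Xs)
    L = np.zeros((T, T))          # start with no coupling between tasks
    for _ in range(iters):
        W = fit_tasks(Xs, ys, L)  # tasks given manifold
        L = update_graph(W)       # manifold given tasks
    return W, L
```

Because the Laplacian is positive semidefinite and `lam > 0`, the stacked system in `fit_tasks` is positive definite, so each inner step has a unique closed-form solution; the outer loop then refines the task graph from the learned weights.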