An Information-Theoretic Approach for Multi-task Learning

  • Authors:
  • Pei Yang; Qi Tan; Hao Xu; Yehua Ding

  • Affiliations:
  • School of Computer Science, South China University of Technology, Guangzhou 510640 (Pei Yang, Hao Xu, Yehua Ding); School of Computer Science, South China University of Technology, Guangzhou 510640 and School of Computer Science, South China Normal University, Guangzhou 510631 (Qi Tan)

  • Venue:
  • ADMA '09 Proceedings of the 5th International Conference on Advanced Data Mining and Applications
  • Year:
  • 2009

Abstract

Multi-task learning utilizes labeled data from other "similar" tasks and can achieve efficient knowledge sharing between tasks. In this paper, a novel information-theoretic multi-task learning model, IBMTL, is proposed. The key idea of IBMTL is to minimize the loss of mutual information during classification while constraining the Kullback-Leibler divergence between multiple tasks to some maximal level. The basic trade-off is between maximizing the relevant information and minimizing the "dissimilarity" between multiple tasks. The IBMTL algorithm is compared with TrAdaBoost, which extends AdaBoost for transfer learning. Experiments were conducted on two transfer-learning data sets, an email spam-filtering data set and a sentiment classification data set. The results demonstrate that IBMTL outperforms TrAdaBoost.
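
Read as an information-bottleneck-style objective, the trade-off described in the abstract can be written out as a constrained optimization. The LaTeX sketch below is only an illustration of that reading, not the paper's own formulation: the compressed representation T, the multiplier beta, the tolerance epsilon, and the task distributions p_s and p_t are assumed notation introduced here for clarity.

  % Hedged sketch of an information-bottleneck-style multi-task objective.
  % X: input; Y: class label; T: compressed representation of X.
  % p_s, p_t: distributions induced by tasks s and t.
  % beta and epsilon are illustrative symbols, not taken from the paper.
  \min_{p(t \mid x)} \; I(X;T) - \beta \, I(T;Y)
  \quad \text{subject to} \quad
  D_{\mathrm{KL}}\!\left(p_s \,\|\, p_t\right) \le \epsilon
  \quad \forall\, s \ne t

Under this reading, minimizing I(X;T) compresses the input, the -beta I(T;Y) term preserves the information relevant to classification (the "loss of mutual information" the abstract minimizes), and the KL constraint caps how far any two tasks' distributions may drift apart, matching the "maximal level" mentioned above.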