OMS-TL: a framework of online multiple source transfer learning

  • Authors:
  • Liang Ge; Jing Gao; Aidong Zhang

  • Affiliations:
  • The State University of New York at Buffalo, Buffalo, NY, USA (all authors)

  • Venue:
  • Proceedings of the 22nd ACM International Conference on Information & Knowledge Management
  • Year:
  • 2013

Abstract

Transfer learning has benefited many real-world applications where labeled data are abundant in source domains but scarce in the target domain. Because there are usually multiple relevant domains from which knowledge can be transferred, Multiple Source Transfer Learning (MSTL) has recently attracted much attention. Most existing MSTL methods work in an offline fashion, in that they must store all the target-domain data before learning. However, in time-critical applications where data arrive sequentially and in large volume, a fast and scalable online method that can transfer knowledge from multiple source domains is needed. To this end, in this paper we propose a new framework of Online Multiple Source Transfer Learning (OMS-TL). The framework is based on a convex optimization problem in which the knowledge transferred from multiple source domains is guided by information from the target domain. The proposed method is fast and scalable, and it enjoys the theoretical guarantees of standard online algorithms. Extensive experiments are conducted on three real-life data sets. The results show that the performance of OMS-TL is close to that of its offline counterpart, which in turn performs comparably to existing baseline methods. Furthermore, the proposed method offers excellent scalability and fast response time.
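
To illustrate the general idea of online multi-source combination described in the abstract, the sketch below keeps one pre-trained classifier per source domain and a weight per source on the target domain; as labeled target examples arrive one at a time, the weights are updated with a standard exponentiated-gradient rule. This is only a hedged illustration under assumptions, not the authors' actual convex formulation; the class name, squared-loss choice, and learning rate are hypothetical.

```python
# Minimal sketch (assumptions, not the paper's exact method): combine
# pre-trained source-domain predictors with online-updated weights.
import numpy as np


class OnlineMultiSourceEnsemble:
    def __init__(self, source_predictors, learning_rate=0.5):
        # source_predictors: callables x -> P(y = 1), each trained on one source domain.
        self.source_predictors = source_predictors
        self.learning_rate = learning_rate
        # Start with uniform trust in every source domain.
        self.weights = np.full(len(source_predictors), 1.0 / len(source_predictors))

    def predict(self, x):
        # Weighted combination of the source-domain predictions.
        preds = np.array([f(x) for f in self.source_predictors])
        return float(self.weights @ preds)

    def update(self, x, y):
        # Squared loss of each source on the newly revealed target label.
        losses = np.array([(f(x) - y) ** 2 for f in self.source_predictors])
        # Exponentiated-gradient step: sources that predict the target well gain weight.
        self.weights *= np.exp(-self.learning_rate * losses)
        self.weights /= self.weights.sum()


# Usage sketch: stream target examples, predict, then update once the label arrives.
if __name__ == "__main__":
    sources = [lambda x: 0.9 if x[0] > 0 else 0.1,  # hypothetical informative source
               lambda x: 0.5]                       # hypothetical uninformative source
    model = OnlineMultiSourceEnsemble(sources)
    for x, y in [(np.array([1.0]), 1), (np.array([-1.0]), 0), (np.array([2.0]), 1)]:
        print(model.predict(x), end=" -> ")
        model.update(x, y)
        print(model.weights)
```

Because each target example is processed once and then discarded, a scheme of this kind needs only constant memory per source domain, which is the property the abstract highlights for time-critical, high-volume streams.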