Multi-task learning with one-class SVM

  • Authors:
  • Xiyan He, Gilles Mourot, Didier Maquin, José Ragot, Pierre Beauseroy, André Smolarz, Edith Grall-Maës

  • Venue:
  • Neurocomputing
  • Year:
  • 2014

Abstract

Multi-task learning improves generalization performance by training multiple related tasks simultaneously, and the way task relatedness is modeled is usually the key to formulating a multi-task learning method. In this paper, we assume that when tasks are related, their models are close to one another; that is, the model parameters of each task lie close to a certain mean function. Following this task relatedness assumption, we present two multi-task learning formulations based on one-class support vector machines (one-class SVMs). With the help of a new kernel design, both multi-task formulations can be solved with the optimization program of a single one-class SVM. Experiments on both a low-dimensional nonlinear toy dataset and high-dimensional textured images show that our approaches yield very encouraging results.
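The paper's details are not reproduced here, but the general idea of reducing a multi-task one-class problem to a single one-class SVM via kernel design can be sketched as follows. This is a hypothetical illustration, not the authors' exact formulation: the kernel below combines a shared component (weighted by an assumed trade-off parameter `mu`, encoding the "close to a mean function" assumption) with a task-specific component added only when two samples belong to the same task, and the combined Gram matrix is handed to a standard one-class SVM with a precomputed kernel.

```python
import numpy as np
from sklearn.svm import OneClassSVM

def multitask_kernel(X, tasks, X2=None, tasks2=None, mu=1.0, gamma=0.5):
    """Illustrative multi-task RBF kernel: a shared component with
    weight 1/mu plus a task-specific component that is added only
    when the two samples come from the same task."""
    if X2 is None:
        X2, tasks2 = X, tasks
    # Base RBF kernel between all pairs of samples
    d2 = ((X[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    base = np.exp(-gamma * d2)
    # Indicator matrix: 1 where both samples belong to the same task
    same_task = (tasks[:, None] == tasks2[None, :]).astype(float)
    return (1.0 / mu + same_task) * base

rng = np.random.default_rng(0)
# Two related toy tasks: clusters around slightly shifted means
X_a = rng.normal([0.0, 0.0], 0.3, size=(40, 2))
X_b = rng.normal([0.3, 0.2], 0.3, size=(40, 2))
X = np.vstack([X_a, X_b])
tasks = np.array([0] * 40 + [1] * 40)

# Train a single one-class SVM on the joint multi-task Gram matrix
K = multitask_kernel(X, tasks)
clf = OneClassSVM(kernel="precomputed", nu=0.1).fit(K)

# Score new points for task 0 against the jointly trained model:
# a point near the shared mean should be an inlier (+1),
# a far-away point an outlier (-1)
X_new = np.array([[0.05, 0.05], [3.0, 3.0]])
K_new = multitask_kernel(X_new, np.zeros(2, dtype=int), X, tasks)
print(clf.predict(K_new))
```

Both the kernel form and the parameter `mu` are assumptions for illustration; the key point matching the abstract is that, once the task structure is folded into the kernel, the joint problem is solved by a single one-class SVM optimization.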