Efficient training of graph-regularized multitask SVMs

  • Authors:
  • Christian Widmer, Marius Kloft, Nico Görnitz, Gunnar Rätsch

  • Affiliations:
  • Christian Widmer and Gunnar Rätsch: Memorial Sloan-Kettering Cancer Center, New York, USA, and FML, Max-Planck Society, Tübingen, Germany; Marius Kloft and Nico Görnitz: Machine Learning Laboratory, TU Berlin, Germany

  • Venue:
  • ECML PKDD'12: Proceedings of the 2012 European Conference on Machine Learning and Knowledge Discovery in Databases, Part I
  • Year:
  • 2012

Abstract

We present an optimization framework for graph-regularized multi-task SVMs based on the primal formulation of the problem. Previous approaches employ a so-called multi-task kernel (MTK) and are therefore inapplicable when the number of training examples n is large, since the n × n kernel matrix becomes prohibitively expensive to compute and store. In contrast, our primal solver achieves speedups of up to three orders of magnitude over LibSVM and SVMLight on several standard benchmarks as well as on challenging data sets from the application domain of computational biology. Combining our optimization methodology with the COFFIN large-scale learning framework [3], we are able to train a multi-task SVM using over 1,000,000 training points stemming from 4 different tasks. An efficient C++ implementation of our algorithm is publicly available as part of the SHOGUN machine learning toolbox [4].
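To illustrate the kind of objective the abstract refers to, the sketch below trains graph-regularized multi-task linear SVMs in the primal with plain subgradient descent on toy data. This is not the paper's algorithm (which builds a far faster dual coordinate solver) or SHOGUN's API; the function name, hyperparameters, and the unit-weight task graph are illustrative assumptions. The regularizer lam · Σ_{s<t} A[s,t] · ||w_s − w_t||² pulls the weight vectors of graph-adjacent tasks toward each other.

```python
import numpy as np

def train_mt_svm(tasks, A, C=1.0, lam=0.1, lr=0.01, epochs=300, seed=0):
    """Primal subgradient sketch of a graph-regularized multi-task linear SVM.

    tasks: list of (X, y) pairs per task, labels y in {-1, +1}.
    A: symmetric task-adjacency matrix; A[s, t] > 0 couples tasks s and t.
    (Illustrative only -- the paper uses a much faster dedicated solver.)
    """
    rng = np.random.default_rng(seed)
    T, d = len(tasks), tasks[0][0].shape[1]
    W = rng.normal(scale=0.01, size=(T, d))
    L = np.diag(A.sum(axis=1)) - A  # graph Laplacian of the task graph
    for _ in range(epochs):
        G = W.copy()  # gradient of the per-task 0.5*||w_t||^2 terms
        for t, (X, y) in enumerate(tasks):
            viol = y * (X @ W[t]) < 1.0  # margin violators (hinge subgradient)
            G[t] -= C * (y[viol, None] * X[viol]).sum(axis=0)
        G += 2.0 * lam * (L @ W)  # grad of lam*sum_{s<t} A[s,t]*||w_s - w_t||^2
        W -= lr * G
    return W

# Toy usage: two tasks with identical, linearly separable data.
X = np.array([[1.0, 0.0], [2.0, 0.5], [-1.0, 0.2], [-2.0, -0.3]])
y = np.array([1.0, 1.0, -1.0, -1.0])
A = np.array([[0.0, 1.0], [1.0, 0.0]])  # two coupled tasks
W = train_mt_svm([(X, y), (X, y)], A)
```

Because the two tasks share data and are connected in the task graph, the Laplacian term keeps W[0] and W[1] nearly identical; with disjoint but related per-task data, the same coupling transfers information between tasks.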