Coordinate Descent Method for Large-scale L2-loss Linear Support Vector Machines

  • Authors:
  • Kai-Wei Chang; Cho-Jui Hsieh; Chih-Jen Lin

  • Venue:
  • The Journal of Machine Learning Research
  • Year:
  • 2008

Abstract

Linear support vector machines (SVM) are useful for classifying large-scale sparse data. Problems with sparse features are common in applications such as document classification and natural language processing. In this paper, we propose a novel coordinate descent algorithm for training linear SVM with the L2-loss function. At each step, the proposed method minimizes a one-variable sub-problem while fixing the other variables. The sub-problem is solved by Newton steps with a line search technique. The procedure converges globally at a linear rate. As each sub-problem involves only the values of the corresponding feature, the proposed approach is suitable when accessing a feature is more convenient than accessing an instance. Experiments show that our method is more efficient and stable than state-of-the-art methods such as Pegasos and TRON.
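
The coordinate descent scheme summarized above can be illustrated with a short sketch. The snippet below is a simplified, dense-matrix illustration under stated assumptions, not the authors' implementation: it minimizes the primal L2-loss objective f(w) = (1/2)||w||^2 + C Σ_i max(0, 1 − y_i w^T x_i)^2 by cycling over coordinates, taking one Newton step per one-variable sub-problem, damped by a standard Armijo-style backtracking line search. The parameter names (C, sigma, beta, n_epochs) and the data layout are illustrative choices; the actual method exploits sparsity and accesses one feature column at a time.

```python
import numpy as np

def l2svm_objective(w, X, y, C):
    """Primal L2-loss SVM objective: 0.5*||w||^2 + C * sum(max(0, 1 - y*Xw)^2)."""
    margins = np.maximum(0.0, 1.0 - y * (X @ w))
    return 0.5 * (w @ w) + C * np.sum(margins ** 2)

def coordinate_descent_l2svm(X, y, C=1.0, n_epochs=20, sigma=0.01, beta=0.5):
    """Cyclic coordinate descent sketch for L2-loss linear SVM.

    X: (n_samples, n_features) array, y: labels in {-1, +1}.
    One damped Newton step per one-variable sub-problem.
    """
    n, d = X.shape
    w = np.zeros(d)
    Xw = X @ w  # cached predictions, updated incrementally after each coordinate move
    for _ in range(n_epochs):
        for j in range(d):
            xj = X[:, j]
            b = 1.0 - y * Xw                       # 1 - y_i w^T x_i
            active = b > 0                         # instances with positive loss
            # First derivative and generalized second derivative of the sub-problem at z = 0
            g = w[j] - 2.0 * C * np.sum(y[active] * xj[active] * b[active])
            h = 1.0 + 2.0 * C * np.sum(xj[active] ** 2)
            if abs(g) < 1e-12:
                continue                           # coordinate already near-optimal
            d_newton = -g / h
            # Armijo-style backtracking: shrink the step until sufficient decrease holds
            f_old = l2svm_objective(w, X, y, C)    # O(nd) here; a sketch, not an efficient update
            step = 1.0
            while True:
                w_new = w.copy()
                w_new[j] += step * d_newton
                f_new = l2svm_objective(w_new, X, y, C)
                if f_new - f_old <= sigma * step * g * d_newton or step < 1e-12:
                    break
                step *= beta
            w[j] += step * d_newton
            Xw += step * d_newton * xj             # keep the cached X @ w consistent
    return w

# Example usage on synthetic data (illustrative only):
# rng = np.random.default_rng(0)
# X = rng.standard_normal((200, 50))
# y = np.sign(X[:, 0] + 0.1 * rng.standard_normal(200))
# w = coordinate_descent_l2svm(X, y, C=1.0)
```

Caching X @ w is what makes each coordinate update cheap in practice: only the values of the single feature column x_j are touched, which is the property the abstract highlights for feature-wise data access.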