Second-Order SMO Improves SVM Online and Active Learning

  • Authors:
  • Tobias Glasmachers, Christian Igel

  • Affiliations:
  • Institut für Neuroinformatik, Ruhr-Universität Bochum, 44780 Bochum, Germany (Tobias.Glasmachers@neuroinformatik.ruhr-uni-bochum.de; c.igel@ieee.org)

  • Venue:
  • Neural Computation
  • Year:
  • 2008

Abstract

Iterative learning algorithms that approximate the solution of support vector machines (SVMs) have two potential advantages. First, they allow online and active learning. Second, for large data sets, computing the exact SVM solution may be too time-consuming, and an efficient approximation can be preferable. The powerful LASVM algorithm iteratively approaches the exact SVM solution using sequential minimal optimization (SMO) and allows efficient online and active learning. Here, this algorithm is considerably improved in speed and accuracy by replacing the working set selection in the SMO steps with a second-order strategy that greedily maximizes the progress in each single step.
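To make the selection strategy concrete, below is a minimal sketch of greedy second-order (maximal-gain) working set selection for the standard SVM dual problem, in the spirit of the abstract. It is not the authors' implementation: the function name `select_pair`, the precomputed kernel matrix `K`, and the tolerance `tau` are illustrative assumptions.

```python
import numpy as np

def select_pair(K, y, alpha, g, C, tau=1e-12):
    """Pick a working pair (i, j) by the maximal-gain heuristic.

    Assumes the dual problem min 0.5*a'Qa - e'a with 0 <= a_i <= C,
    y'a = 0, where Q_ij = y_i*y_j*K_ij, labels y in {-1, +1}, and
    gradient g = Q @ alpha - 1.
    """
    # Index sets of variables that may move "up" or "down" without
    # leaving the box [0, C] (the signs account for the labels y).
    i_up = ((y > 0) & (alpha < C)) | ((y < 0) & (alpha > 0))
    i_dn = ((y > 0) & (alpha > 0)) | ((y < 0) & (alpha < C))

    up = np.where(i_up)[0]
    if up.size == 0:
        return None
    # First index: steepest feasible ascent direction (first order).
    i = int(up[np.argmax(-y[up] * g[up])])

    # Second index: among tau-violating partners, maximize the gain
    # predicted by the quadratic model of an SMO step on (i, j):
    #   gain = b^2 / (2a),  b = -y_i g_i + y_j g_j,
    #   a = K_ii + K_jj - 2 K_ij.
    dn = np.where(i_dn)[0]
    b = -y[i] * g[i] + y[dn] * g[dn]
    viol = b > tau
    if not np.any(viol):
        return None  # no violating pair left: KKT-optimal up to tau
    cand, b = dn[viol], b[viol]
    a = np.maximum(K[i, i] + K[cand, cand] - 2.0 * K[i, cand], tau)
    j = int(cand[np.argmax(b * b / (2.0 * a))])
    return (i, j)
```

In a purely first-order scheme, both indices would be chosen from the gradient alone; here the second index is chosen by the predicted one-step decrease of the dual objective, which is the mechanism behind the improvements in speed and accuracy described above.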