First and Second Order SMO Algorithms for LS-SVM Classifiers

  • Authors:
  • Jorge López;Johan A. Suykens

  • Affiliations:
  • Departamento de Ingeniería Informática and Instituto de Ingeniería del Conocimiento, Universidad Autónoma de Madrid, 28049 Madrid, Spain; Department of Electrical Engineering, ESAT-SCD/SISTA, Katholieke Universiteit Leuven, 3001 Leuven, Belgium

  • Venue:
  • Neural Processing Letters
  • Year:
  • 2011

Abstract

Least squares support vector machine (LS-SVM) classifiers have traditionally been trained with conjugate gradient algorithms. In this work, completing the study by Keerthi et al., we explore the applicability of the SMO algorithm for solving the LS-SVM problem by comparing First Order and Second Order working set selections, concentrating on the RBF kernel, which is the most usual choice in practice. It turns out that, over the whole range of possible hyperparameter values, Second Order working set selection is overall more convenient than First Order. In any case, whichever selection scheme is used, the number of kernel operations performed by SMO appears to scale quadratically with the number of patterns. Moreover, asymptotic convergence to the optimum is proved, and the rate of convergence is shown to be linear for both selections.
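The abstract does not spell out the algorithmic details. As a rough illustration only, the sketch below implements SMO with both working set selections on one common LS-SVM dual formulation (minimize ½αᵀQα − 1ᵀα with Q_ij = y_i y_j (K_ij + δ_ij/γ) and a single equality constraint yᵀα = 0, no box constraints); assuming this is the setup the paper follows is an inference, and the function names, the RBF bandwidth `sigma`, and the tolerance `tol` are illustrative choices, not from the paper.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Full RBF Gram matrix; fine for a small illustrative problem.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def smo_lssvm(X, y, gamma=1.0, sigma=1.0, tol=1e-6, max_iter=100000,
              second_order=True):
    """SMO sketch for the LS-SVM dual:
    min 0.5*a^T Q a - 1^T a  s.t.  y^T a = 0,
    with Q_ij = y_i y_j (K_ij + delta_ij/gamma) and unbounded a."""
    n = len(y)
    Kt = rbf_kernel(X, sigma) + np.eye(n) / gamma    # K + I/gamma
    Q = (y[:, None] * y[None, :]) * Kt
    a = np.zeros(n)
    g = -np.ones(n)                                  # gradient Q a - 1 at a = 0
    for _ in range(max_iter):
        f = -y * g                                   # violation scores
        i = int(np.argmax(f))
        if second_order:
            # Second Order: among indices j with f_j < f_i, pick the one
            # whose pairwise step gives the largest decrease of the dual.
            b = f[i] - f
            eta = Kt[i, i] + np.diag(Kt) - 2.0 * Kt[i, :]
            eta = np.maximum(eta, 1e-12)
            gain = np.where(b > 0, b**2 / (2.0 * eta), -np.inf)
            j = int(np.argmax(gain))
        else:
            # First Order: maximal violating pair.
            j = int(np.argmin(f))
        if f[i] - f[j] < tol:                        # KKT satisfied up to tol
            break
        eta_ij = max(Kt[i, i] + Kt[j, j] - 2.0 * Kt[i, j], 1e-12)
        t = (f[i] - f[j]) / eta_ij                   # unconstrained step on the pair
        a[i] += y[i] * t
        a[j] -= y[j] * t
        g += t * (y[i] * Q[i, :] - y[j] * Q[j, :])   # rank-two gradient update
    bias = float(np.mean(-y * g))                    # equality-constraint multiplier
    return a, bias                                   # (bias sign convention may vary)
```

Since the ridge term I/γ makes the dual strictly convex, both selections converge to the same unique optimum on a toy problem; the comparison reported in the abstract concerns how many kernel evaluations each scheme needs across hyperparameter settings, not the solution itself.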