SSVM: A Smooth Support Vector Machine for Classification

  • Authors:
  • Yuh-Jye Lee; O. L. Mangasarian

  • Affiliations:
  • Computer Sciences Department, University of Wisconsin, 1210 West Dayton Street, Madison, Wisconsin 53706, USA. yuh-jye@cs.wisc.edu
  • Computer Sciences Department, University of Wisconsin, 1210 West Dayton Street, Madison, Wisconsin 53706, USA. olvi@cs.wisc.edu

  • Venue:
  • Computational Optimization and Applications
  • Year:
  • 2001


Abstract

Smoothing methods, extensively used for solving important mathematical programming problems and applications, are applied here to generate and solve an unconstrained smooth reformulation of the support vector machine for pattern classification using a completely arbitrary kernel. We term such a reformulation a smooth support vector machine (SSVM). A fast Newton–Armijo algorithm for solving the SSVM converges globally and quadratically. Numerical results and comparisons are given to demonstrate the effectiveness and speed of the algorithm. On six publicly available datasets, the tenfold cross-validation correctness of SSVM was the highest compared with four other methods, and SSVM was also the fastest. On larger problems, SSVM was comparable to or faster than SVMlight (T. Joachims, in Advances in Kernel Methods—Support Vector Learning, MIT Press: Cambridge, MA, 1999), SOR (O.L. Mangasarian and David R. Musicant, IEEE Transactions on Neural Networks, vol. 10, pp. 1032–1037, 1999), and SMO (J. Platt, in Advances in Kernel Methods—Support Vector Learning, MIT Press: Cambridge, MA, 1999). SSVM can also generate a highly nonlinear separating surface such as a checkerboard.
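
The sketch below illustrates the smoothing idea the abstract describes, under the assumption that the plus function (x)_+ in the SVM objective is replaced by the smooth approximation p(x, alpha) = x + (1/alpha) log(1 + exp(-alpha x)), yielding an unconstrained, twice-differentiable problem. The dataset, parameter names (`nu`, `alpha`), and the Armijo-backtracking gradient iteration used here are illustrative stand-ins for the paper's Newton–Armijo algorithm, not the authors' implementation.

```python
import numpy as np

def smooth_plus(x, alpha=5.0):
    """Smooth approximation of max(x, 0); approaches (x)_+ as alpha grows."""
    # log(1 + exp(-alpha*x)) computed stably via logaddexp
    return x + np.logaddexp(0.0, -alpha * x) / alpha

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def objective_and_grad(w, gamma, A, d, nu=1.0, alpha=5.0):
    """Assumed unconstrained smooth objective for a linear separating plane x'w = gamma:
       (nu/2)*||p(e - D(Aw - e*gamma), alpha)||^2 + (1/2)*(w'w + gamma^2), D = diag(d)."""
    r = 1.0 - d * (A @ w - gamma)             # residuals e - D(Aw - e*gamma)
    p = smooth_plus(r, alpha)
    s = sigmoid(alpha * r)                    # derivative of smooth_plus w.r.t. r
    f = 0.5 * nu * (p @ p) + 0.5 * (w @ w + gamma ** 2)
    gw = -nu * A.T @ (d * p * s) + w          # gradient w.r.t. w
    gg = nu * np.sum(d * p * s) + gamma       # gradient w.r.t. gamma
    return f, gw, gg

def train_ssvm(A, d, nu=1.0, alpha=5.0, iters=200):
    """Minimize the smooth objective with Armijo-backtracking gradient steps
       (a simple stand-in for the paper's Newton-Armijo iteration)."""
    m, n = A.shape
    w, gamma = np.zeros(n), 0.0
    for _ in range(iters):
        f, gw, gg = objective_and_grad(w, gamma, A, d, nu, alpha)
        step = 1.0
        while True:  # Armijo backtracking along the negative gradient
            w_new, g_new = w - step * gw, gamma - step * gg
            f_new, _, _ = objective_and_grad(w_new, g_new, A, d, nu, alpha)
            if f_new <= f - 1e-4 * step * (gw @ gw + gg * gg) or step < 1e-10:
                break
            step *= 0.5
        w, gamma = w_new, g_new
    return w, gamma

# Tiny synthetic illustration: two linearly separable clouds
rng = np.random.default_rng(0)
A = np.vstack([rng.normal(2, 1, (50, 2)), rng.normal(-2, 1, (50, 2))])
d = np.hstack([np.ones(50), -np.ones(50)])
w, gamma = train_ssvm(A, d)
print("training accuracy:", np.mean(np.sign(A @ w - gamma) == d))
```

Because the smoothed problem is unconstrained and strongly convex, a Newton step with an Armijo line search (as the paper uses) converges globally and quadratically; the plain gradient iteration above trades that speed for brevity. A nonlinear surface such as the checkerboard would be obtained by replacing the linear term Aw with a kernel expansion.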