Short Communication: A geometric method for model selection in support vector machine

  • Authors:
  • Xinjun Peng; Yifei Wang

  • Affiliations:
  • Department of Computational Mathematics, Shanghai Normal University, Shanghai 200234, PR China; Department of Mathematics, Shanghai University, Shanghai 200444, PR China

  • Venue:
  • Expert Systems with Applications: An International Journal
  • Year:
  • 2009

Abstract

Support vector machine (SVM) has become one of the most popular methods in machine learning in recent years. Parameter selection in SVM is an important step in achieving a high-performance learning machine. Several methods have been proposed that minimize an estimate of the generalization error, based on the leave-one-out (LOO) bound, the empirical error, etc. These methods must solve many quadratic programming problems and compute the inverse of the Gram matrix, which makes them time-consuming on large-scale problems. This paper introduces a fast incremental method for optimizing the kernel parameters in SVM that combines a geometric SVM algorithm with an approximation of the gradient of the empirical error. The method updates the kernel parameters and the working set online during incremental learning, which reduces both the CPU time and the storage space required. Numerical tests on several benchmarks confirm the effectiveness of our method.
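
The core idea in the abstract, selecting kernel parameters by following an approximate gradient of an empirical error estimate, can be illustrated with a minimal sketch. The code below is not the authors' geometric/incremental algorithm: it simply retrains a standard RBF-kernel SVM (scikit-learn's SVC, an assumption) at each step and uses a central finite difference on a sigmoid-smoothed validation error as a crude gradient approximation. The synthetic dataset, step size, and iteration count are likewise illustrative assumptions.

    # Minimal sketch (assumption: not the paper's geometric/incremental method).
    # Tune the RBF kernel width gamma by descending a finite-difference
    # approximation of the gradient of a sigmoid-smoothed validation error,
    # retraining a standard SVM at every evaluation.
    import numpy as np
    from scipy.special import expit
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    def smoothed_error(log_gamma, X_tr, y_tr, X_va, y_va, C=1.0):
        """Train an RBF-SVM and return a sigmoid-smoothed misclassification estimate."""
        clf = SVC(kernel="rbf", C=C, gamma=np.exp(log_gamma)).fit(X_tr, y_tr)
        margins = (2 * y_va - 1) * clf.decision_function(X_va)  # labels mapped to {-1, +1}
        return float(np.mean(expit(-margins)))  # smooth proxy for the 0/1 error

    # Synthetic benchmark data (illustrative assumption; any labelled set would do).
    X, y = make_classification(n_samples=400, n_features=10, random_state=0)
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)

    log_gamma, step, eps = np.log(0.1), 1.0, 1e-2  # search in log-space for stability
    for _ in range(20):
        # Central finite difference: a crude approximation of the error gradient
        # with respect to the kernel parameter.
        grad = (smoothed_error(log_gamma + eps, X_tr, y_tr, X_va, y_va)
                - smoothed_error(log_gamma - eps, X_tr, y_tr, X_va, y_va)) / (2 * eps)
        log_gamma -= step * grad

    print("selected gamma:", np.exp(log_gamma))
    print("smoothed validation error:", smoothed_error(log_gamma, X_tr, y_tr, X_va, y_va))

Unlike this sketch, which resolves a full SVM training problem for every error evaluation, the paper's approach avoids repeatedly solving quadratic programs by updating a geometric SVM solver and its working set incrementally as the kernel parameters change.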