Semismooth Newton support vector machine

  • Authors:
  • Shui-sheng Zhou; Hong-wei Liu; Li-hua Zhou; Feng Ye

  • Affiliations:
  • Department of Mathematics, School of Science, Xidian University, Xi'an, Shaanxi 710071, China (S.-S. Zhou, H.-W. Liu, F. Ye); School of Computer, Xidian University, Xi'an, Shaanxi 710071, China (L.-H. Zhou)

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2007

Abstract

Support vector machines can be posed as quadratic programming problems in a variety of ways. This paper investigates the 2-norm soft-margin SVM with an additional quadratic penalty on the bias term, which leads to a positive definite quadratic program in feature space with only nonnegativity constraints. An unconstrained problem is proposed as the Lagrangian dual of this quadratic program for the linear classification problem. The resulting problem minimizes a differentiable convex piecewise quadratic function of lower dimension in input space, and a semismooth Newton algorithm is introduced to solve it quickly, yielding the Semismooth Newton Support Vector Machine (SNSVM). After the kernel matrix is factorized by the Cholesky or incomplete Cholesky factorization, the nonlinear kernel classification problem can also be solved by SNSVM with no apparent increase in the complexity of the algorithm. Numerical experiments demonstrate that the algorithm is comparable with similar algorithms such as the Lagrangian Support Vector Machine (LSVM) and the Semismooth Support Vector Machine (SSVM).
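As a rough illustration of the idea summarized above (not the authors' exact formulation, which the abstract does not spell out), the sketch below applies a generalized (semismooth) Newton iteration to one common unconstrained, differentiable, piecewise quadratic objective of this type: the squared hinge loss with a quadratic penalty on both the weights and the bias. The function name, the parameter C, and the concrete objective are assumptions made for illustration only.

    import numpy as np

    def snsvm_fit(X, y, C=1.0, max_iter=50, tol=1e-8):
        # Sketch of a generalized (semismooth) Newton solver for
        #   min_{w,b}  0.5*(||w||^2 + b^2) + 0.5*C * sum_i max(0, 1 - y_i*(w.x_i + b))^2
        # The gradient of this convex piecewise quadratic objective is piecewise
        # linear, so each Newton step is built from the current set of margin violators.
        n, d = X.shape
        A = np.hstack([X, np.ones((n, 1))]) * y[:, None]  # row i: y_i * [x_i, 1]
        u = np.zeros(d + 1)                                # u stacks [w; b]
        for _ in range(max_iter):
            margins = 1.0 - A @ u                          # 1 - y_i*(w.x_i + b)
            active = margins > 0                           # examples violating the margin
            grad = u - C * (A[active].T @ margins[active])
            if np.linalg.norm(grad) < tol:
                break
            # Generalized Hessian: I + C * A_act^T A_act, always positive definite
            H = np.eye(d + 1) + C * (A[active].T @ A[active])
            u = u - np.linalg.solve(H, grad)
        return u[:-1], u[-1]                               # weight vector w and bias b

For the nonlinear case, the abstract's device of factorizing the kernel matrix as K ≈ G Gᵀ (Cholesky or incomplete Cholesky) would correspond, in this sketch, to passing G in place of X, since the solver only ever sees the data through inner products of its rows.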