ε-SSVR: A Smooth Support Vector Machine for ε-Insensitive Regression

  • Authors:
  • Yuh-Jye Lee, Wen-Feng Hsieh, Chien-Ming Huang

  • Venue:
  • IEEE Transactions on Knowledge and Data Engineering
  • Year:
  • 2005


Abstract

A new smoothing strategy for solving ε-support vector regression (ε-SVR), which tolerates a small error in fitting a given data set linearly or nonlinearly, is proposed in this paper. Conventionally, ε-SVR is formulated as a constrained minimization problem, namely, a convex quadratic programming problem. We apply the smoothing techniques that have been used for solving the support vector machine for classification to replace the ε-insensitive loss function by an accurate smooth approximation. This allows us to solve ε-SVR directly as an unconstrained minimization problem. We call this reformulated problem ε-smooth support vector regression (ε-SSVR). We also prescribe a Newton-Armijo algorithm, which has been shown to converge globally and quadratically, to solve our ε-SSVR. To handle nonlinear regression with a massive data set, we also introduce the reduced kernel technique to avoid the computational difficulty of dealing with a huge and fully dense kernel matrix. Numerical results and comparisons are given to demonstrate the effectiveness and speed of the algorithm.
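The smoothing idea described in the abstract can be sketched as follows. In the SSVM line of work, the plus function (x)₊ = max(x, 0) is approximated by the smooth function p(x, α) = x + (1/α)·log(1 + exp(−αx)), and the squared ε-insensitive loss max(|r| − ε, 0)² is then replaced by a smooth surrogate built from p. The code below is a minimal illustration of this smoothing, not the paper's implementation; the function names and the exact surrogate form p(r − ε, α)² + p(−r − ε, α)² are assumptions based on the abstract's description.

```python
import numpy as np

def smooth_plus(x, alpha=5.0):
    """Smooth approximation of max(x, 0); the accuracy improves as alpha grows.

    np.logaddexp(0, -alpha*x) computes log(1 + exp(-alpha*x)) in a
    numerically stable way, so this is p(x, alpha) = x + log(1+e^{-ax})/a.
    """
    return x + np.logaddexp(0.0, -alpha * x) / alpha

def eps_insensitive_sq(r, eps=0.1):
    """Exact squared eps-insensitive loss: max(|r| - eps, 0)**2."""
    return np.maximum(np.abs(r) - eps, 0.0) ** 2

def smooth_eps_insensitive_sq(r, eps=0.1, alpha=5.0):
    """Smooth surrogate (assumed form): p(r - eps)^2 + p(-r - eps)^2.

    For |r| > eps at most one of the two terms is significantly nonzero,
    and the surrogate approaches the exact loss as alpha -> infinity,
    which is what makes the unconstrained Newton-type solve possible.
    """
    return smooth_plus(r - eps, alpha) ** 2 + smooth_plus(-r - eps, alpha) ** 2

# Compare the surrogate against the exact loss on a grid of residuals.
r = np.linspace(-1.0, 1.0, 201)
exact = eps_insensitive_sq(r)
smooth = smooth_eps_insensitive_sq(r, alpha=50.0)
print("max gap at alpha=50:", np.max(np.abs(exact - smooth)))
```

Because the surrogate is twice differentiable everywhere, a Newton method with an Armijo line search can be applied directly to the resulting unconstrained problem, which is the route the paper takes.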