1-Norm least squares twin support vector machines

  • Authors:
  • Shangbing Gao; Qiaolin Ye; Ning Ye

  • Affiliations:
  • The School of Computer Engineering, Huaiyin Institute of Technology, Huai'an, PR China; School of Information Technology, Nanjing Forestry University, Nanjing, PR China; School of Information Technology, Nanjing Forestry University, Nanjing, PR China

  • Venue:
  • Neurocomputing
  • Year:
  • 2011

Abstract

During the last few years, nonparallel-plane classifiers, such as the Multisurface Proximal Support Vector Machine via Generalized Eigenvalues (GEPSVM) and the Least Squares Twin SVM (LSTSVM), have attracted much attention. However, no modification of these methods has yet been proposed that automatically selects input features, which motivates the search for new classifiers. In this paper, we develop a new nonparallel-plane classifier designed to automatically select the relevant features. We first introduce a Tikhonov regularization (TR) term, commonly used to regularize least squares, into the LSTSVM learning framework, and then convert the resulting formulation into a linear programming (LP) problem. By minimizing an exterior penalty (EP) function of the dual of this LP with a fast generalized Newton algorithm, our method yields very sparse solutions, so the resulting classifier depends on only a small number of input features. In other words, the approach suppresses irrelevant input features, which makes the classifier cheaper to store and faster to evaluate in the classification phase. Finally, experiments on both toy and real-world problems demonstrate the effectiveness of our method.
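The abstract's sparsity mechanism rests on a standard trick: a 1-norm objective can be rewritten as a linear program by splitting each weight into a difference of nonnegative parts. The sketch below is not the authors' NLSTSVM solver (which works on the dual LP via an exterior penalty and a generalized Newton method); it is a minimal, generic illustration of how an LP-formulated 1-norm penalty drives irrelevant weights exactly to zero. The toy data, the penalty weight `lam`, and the helper `l1_fit` are illustrative assumptions, and SciPy's `linprog` is assumed to be available.

```python
# Generic sketch of 1-norm fitting as an LP (not the paper's exact solver).
# Split w = u - v with u, v >= 0, so |w_i| = u_i + v_i at the optimum;
# slack variables t bound the absolute residuals |Xw - y|.
import numpy as np
from scipy.optimize import linprog

def l1_fit(X, y, lam):
    """Minimize sum|Xw - y| + lam * ||w||_1 by solving an LP.

    LP variables z = [u, v, t], all nonnegative, with w = u - v."""
    m, n = X.shape
    # Objective: lam on u and v (the 1-norm penalty), 1 on each residual slack t.
    c = np.concatenate([lam * np.ones(2 * n), np.ones(m)])
    I = np.eye(m)
    # Encode |Xw - y| <= t as two one-sided inequalities:
    #   X(u - v) - t <= y   and   -X(u - v) - t <= -y
    A_ub = np.vstack([
        np.hstack([X, -X, -I]),
        np.hstack([-X, X, -I]),
    ])
    b_ub = np.concatenate([y, -y])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
    z = res.x
    return z[:n] - z[n:2 * n]  # recover w = u - v

# Toy problem: y depends only on the first feature; the second is noise.
X = np.array([[1.0, 0.1], [2.0, -0.2], [3.0, 0.05], [4.0, -0.1]])
y = 2.0 * X[:, 0]
w = l1_fit(X, y, lam=0.1)
# The 1-norm penalty suppresses the irrelevant feature: w is sparse,
# with the second component driven exactly to zero.
```

Because LP solutions lie at vertices of the feasible polytope, the suppressed weights are exactly zero rather than merely small, which is why the resulting classifier can drop input features entirely at classification time.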