A general soft method for learning SVM classifiers with L1-norm penalty

  • Authors:
  • Qing Tao; Gao-Wei Wu; Jue Wang

  • Affiliations:
  • New Star Research Institute of Applied Tech, Hefei 230031, PR China and Laboratory of Complex Systems and Intelligence Science, Institute of Automation, Chinese Academy of Sciences, Beijing 100080 ...; Division of Intelligent Software Systems, Institute of Computing Technology, Chinese Academy of Sciences, 100080, PR China; Laboratory of Complex Systems and Intelligence Science, Institute of Automation, Chinese Academy of Sciences, Beijing 100080, PR China

  • Venue:
  • Pattern Recognition
  • Year:
  • 2008

Abstract

Based on the geometric interpretation of support vector machines (SVMs), this paper presents a general technique that allows almost all existing L2-norm-penalty-based geometric algorithms, including Gilbert's algorithm, the Schlesinger-Kozinec (SK) algorithm and the Mitchell-Dem'yanov-Malozemov (MDM) algorithm, to be softened so as to learn the corresponding L1-SVM classifiers. Intrinsically, the resulting soft algorithms find ε-optimal nearest points between two soft convex hulls. Theoretical analysis indicates that the proposed soft algorithms are essentially generalizations of the corresponding existing hard algorithms, and consequently they share the same convergence properties and almost identical computational cost. As a specific example, solving ν-SVMs with the proposed soft MDM algorithm is investigated, and the corresponding solution procedure is specified and analyzed. To validate the general soft technique, several real classification experiments are conducted with the proposed L1-norm-based MDM algorithms; the numerical results demonstrate that their performance is competitive with that of the corresponding L2-norm-based algorithms, such as the SK and MDM algorithms.
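The geometric view underlying the abstract reduces hard-margin SVM training to a nearest-point problem: the nearest pair of points between the two classes' convex hulls (equivalently, the nearest point of their Minkowski difference to the origin) determines the maximum-margin hyperplane. As a minimal illustration of this family of algorithms (not the paper's soft L1 variant), here is a sketch of Gilbert's algorithm for an ε-optimal nearest point of a single convex hull to the origin; the function name and stopping rule are this sketch's own choices:

```python
import numpy as np

def gilbert_nearest_point(points, eps=1e-6, max_iter=1000):
    """Gilbert's algorithm (illustrative sketch): epsilon-optimal
    nearest point of conv(points) to the origin.

    points: (n, d) array of hull vertices; returns the approximate
    nearest point w.
    """
    points = np.asarray(points, dtype=float)
    w = points[0].copy()                  # start at an arbitrary vertex
    for _ in range(max_iter):
        # Vertex with the smallest inner product with the current iterate.
        p = points[np.argmin(points @ w)]
        # Duality gap <w, w - p> bounds the suboptimality; stop when small.
        if w @ (w - p) <= eps:
            break
        # Exact line search on the segment [w, p] toward the origin.
        d = p - w
        t = min(1.0, max(0.0, -(w @ d) / (d @ d)))
        w = w + t * d
    return w
```

In the two-hull SVM setting, the same iteration is run on the difference set of the two classes; the MDM algorithm refines it by also moving weight away from the worst active vertex, which is the scheme the paper softens.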