Sparse learning for support vector classification

  • Authors:
  • Kaizhu Huang; Danian Zheng; Jun Sun; Yoshinobu Hotta; Katsuhito Fujimoto; Satoshi Naoi

  • Affiliations:
  • National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing, China; Fujitsu R&D Center Co. Ltd., Beijing, China; Fujitsu R&D Center Co. Ltd., Beijing, China; Fujitsu Laboratories Ltd., Kawasaki, Japan; Fujitsu Laboratories Ltd., Kawasaki, Japan; Fujitsu Laboratories Ltd., Kawasaki, Japan

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2010

Abstract

This paper presents a sparse learning algorithm for Support Vector Classification (SVC), called Sparse Support Vector Classification (SSVC), which yields sparse solutions by automatically setting irrelevant parameters exactly to zero. SSVC adopts an L0-norm regularization term and is trained with an iteratively reweighted learning algorithm. We show that the proposed approach admits a hierarchical-Bayes interpretation and that the model is closely connected to several other sparse models. More specifically, one variant of the proposed method is equivalent to the zero-norm classifier of Weston et al. (2003); it also provides an extended, more flexible framework parallel to the Sparse Probit Classifier of Figueiredo (2003). Theoretical justifications and experimental evaluations on two synthetic datasets and seven benchmark datasets show that SSVC achieves performance competitive with SVC while requiring significantly fewer support vectors.
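
The iteratively reweighted L0-norm scheme summarized above can be sketched concretely. The snippet below is a minimal illustration, not the authors' SSVC algorithm: it assumes an RBF kernel, substitutes a squared-error surrogate for the hinge loss so each reweighted step has a closed-form ridge solution, and the names `rbf_kernel` and `sparse_svc_sketch` are hypothetical.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of X and Z."""
    sq = (X ** 2).sum(1)[:, None] + (Z ** 2).sum(1)[None, :] - 2.0 * X @ Z.T
    return np.exp(-gamma * sq)

def sparse_svc_sketch(X, y, lam=1.0, gamma=1.0, eps=1e-4, n_iter=30):
    """Iteratively reweighted sparsification of the kernel expansion
    f(x) = sum_i beta_i * k(x, x_i)  (bias term omitted for brevity).

    The per-coefficient penalty weight 1 / (beta_i^2 + eps) makes
    lam * sum_i beta_i^2 / (beta_i^2 + eps) a smooth surrogate for the
    L0 norm: each term tends to 1 when beta_i != 0 and to 0 otherwise.
    NOTE: a squared-error loss replaces the paper's hinge loss so that
    each reweighted step reduces to a closed-form linear solve.
    """
    K = rbf_kernel(X, X, gamma)                 # n x n Gram matrix
    n = y.shape[0]
    # Initialize with plain kernel ridge regression on the +/-1 labels.
    beta = np.linalg.solve(K + lam * np.eye(n), y.astype(float))
    for _ in range(n_iter):
        D = np.diag(1.0 / (beta ** 2 + eps))    # reweighted L0 surrogate
        beta = np.linalg.solve(K.T @ K + lam * D, K.T @ y)
    support = np.abs(beta) > 1e-3               # surviving expansion terms
    return beta, support

# Toy usage: most expansion coefficients collapse toward zero, so far
# fewer training points act as "support vectors" than in a dense fit.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 2))
y = np.sign(X[:, 0] + X[:, 1])
beta, support = sparse_svc_sketch(X, y)
print(f"{support.sum()} of {y.size} expansion coefficients remain nonzero")
```

Under these assumptions the reweighting drives small coefficients toward zero across iterations, which mirrors the abstract's claim that competitive accuracy can be retained with significantly fewer support vectors.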