On minimum class locality preserving variance support vector machine

  • Authors:
  • Xiaoming Wang; Fu-lai Chung; Shitong Wang

  • Affiliations:
  • School of Information Technology, Jiangnan University, Wuxi, Jiangsu, China; Department of Computing, Hong Kong Polytechnic University, Hong Kong, China; School of Information Technology, Jiangnan University, Wuxi, Jiangsu, China and Department of Computing, Hong Kong Polytechnic University, Hong Kong, China

  • Venue:
  • Pattern Recognition
  • Year:
  • 2010


Abstract

In this paper, an algorithm called the minimum class locality preserving variance support vector machine (MCLPV_SVM) is presented by introducing the basic idea of locality preserving projections (LPP); it can be seen as a modification of the support vector machine (SVM) and/or the minimum class variance support vector machine (MCVSVM). In contrast to SVM and MCVSVM, MCLPV_SVM takes the intrinsic manifold structure of the data space into full consideration while inheriting the characteristics of SVM and MCVSVM. We discuss the linear case, the small sample size case, and the nonlinear case of MCLPV_SVM. Similar to MCVSVM, the MCLPV_SVM optimization problem in the small sample size case is solved by dimensionality reduction through principal component analysis (PCA), and the problem in the nonlinear case is transformed into an equivalent linear MCLPV_SVM problem via kernel PCA (KPCA). Experimental results on real datasets indicate the effectiveness of MCLPV_SVM in comparison with SVM and MCVSVM.
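The core LPP ingredient the abstract refers to can be illustrated with a short sketch. The snippet below builds a locality-preserving scatter matrix X^T L X, where L = D - W is the graph Laplacian of a k-nearest-neighbour heat-kernel affinity graph; this is the standard LPP construction, not the paper's exact code, and the function name, `k`, and `t` parameters are illustrative assumptions. A matrix of this form is the kind of within-class variance term that MCVSVM-style objectives constrain.

```python
import numpy as np

def locality_preserving_scatter(X, k=3, t=1.0):
    """Locality-preserving scatter S = X^T L X (illustrative sketch).

    L = D - W is the graph Laplacian of a k-NN affinity graph with
    heat-kernel weights, following the LPP idea.
    X has shape (n_samples, n_features).
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances between samples.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    np.fill_diagonal(d2, np.inf)  # exclude self-neighbours

    # Heat-kernel weights on the k nearest neighbours of each sample.
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[:k]
        W[i, nbrs] = np.exp(-d2[i, nbrs] / t)
    W = np.maximum(W, W.T)  # symmetrise the neighbourhood graph

    L = np.diag(W.sum(axis=1)) - W  # graph Laplacian L = D - W
    return X.T @ L @ X

# Toy usage: scatter matrix of 5 points in 2-D.
X = np.random.RandomState(0).randn(5, 2)
S = locality_preserving_scatter(X, k=2)
print(S.shape)  # (2, 2)
```

Because L is positive semi-definite, S is a symmetric positive semi-definite matrix; in the small sample size case S is singular, which is why the paper resorts to PCA (and KPCA in the nonlinear case) before solving the resulting problem.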