Subspace based linear programming support vector machines

  • Authors:
  • Syogo Takeuchi, Takuya Kitamura, Shigeo Abe, and Kazuhiro Fukui

  • Affiliations:
  • Graduate School of Engineering, Kobe University, Kobe, Japan (Takeuchi, Kitamura, Abe); Graduate School of Systems and Information Engineering, University of Tsukuba, Tsukuba, Japan (Fukui)

  • Venue:
  • IJCNN'09: Proceedings of the 2009 International Joint Conference on Neural Networks
  • Year:
  • 2009

Abstract

In subspace methods, the subspace associated with a class is represented by a small number of vectors called dictionaries; a similarity measure is defined using these dictionaries, and an input is classified into the class with the highest similarity. Usually, each dictionary is given an equal weight. However, if the subspaces of different classes overlap, the similarity measures in the overlapping regions give little useful information for classification. In this paper, we propose optimizing the dictionary weights using the idea of support vector machines (SVMs). Namely, we first map the input space into the empirical feature space, perform kernel principal component analysis (KPCA) for each class, and define a similarity measure. Then, considering that the similarity measure corresponds to a hyperplane, we formulate the optimization problem as maximizing the margin between the class associated with the dictionaries and the remaining classes. The optimization problem results in an all-at-once formulation of linear SVMs. We demonstrate the effectiveness of the proposed method by comparing it with conventional methods on two-class problems.
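
As a rough illustration (not the authors' code), the following Python sketch approximates the pipeline the abstract describes: per-class KPCA yields the dictionaries, squared projections onto them form a similarity feature vector, and a linear SVM then learns the dictionary weights. All function names, the toy data, and the parameter choices are assumptions, and scikit-learn's LinearSVC stands in for the paper's all-at-once linear programming SVM formulation.

```python
# Hypothetical sketch of the abstract's pipeline: per-class KPCA gives
# "dictionaries"; squared projections onto them form similarity features
# whose weights are learned by a linear SVM (a stand-in for the paper's
# linear programming SVM).
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.svm import LinearSVC

def fit_dictionaries(X, y, n_components=5, gamma=1.0):
    """Fit one KPCA model per class; its components act as the dictionaries."""
    models = {}
    for c in np.unique(y):
        kpca = KernelPCA(n_components=n_components, kernel="rbf", gamma=gamma)
        kpca.fit(X[y == c])
        models[c] = kpca
    return models

def similarity_features(X, models):
    """Stack squared KPCA projections for every class into one feature vector."""
    feats = [models[c].transform(X) ** 2 for c in sorted(models)]
    return np.hstack(feats)

# Toy two-class data (invented here, for illustration only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

models = fit_dictionaries(X, y)
F = similarity_features(X, models)

# A soft-margin linear SVM learns weights over the dictionary projections;
# the paper instead maximizes the margin via a linear programming SVM.
clf = LinearSVC(C=1.0).fit(F, y)
print("training accuracy:", clf.score(F, y))
```

With equal weights, this reduces to the conventional subspace method; letting the SVM reweight the projections is what suppresses dictionary directions shared by overlapping class subspaces.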