2009 Special Issue: Subspace-based support vector machines for pattern classification

  • Authors:
  • Takuya Kitamura, Syogo Takeuchi, Shigeo Abe, Kazuhiro Fukui

  • Affiliations:
  • Takuya Kitamura, Syogo Takeuchi, Shigeo Abe: Graduate School of Engineering, Kobe University, Kobe, Japan
  • Kazuhiro Fukui: Graduate School of Systems and Information Engineering, University of Tsukuba, Tsukuba, Japan

  • Venue:
  • Neural Networks
  • Year:
  • 2009

Abstract

In this paper, we discuss subspace-based support vector machines (SS-SVMs), in which an input vector is classified into the class with the maximum similarity. Namely, for each class we define a weighted similarity measure using vectors, called dictionaries, that represent the class, and we optimize the weights so that the margin between classes is maximized. Because the similarity measure is defined for each class, for a data sample the similarity measure of the class to which the sample belongs needs to be the largest among all the similarity measures. Introducing slack variables, we impose these constraints either as equality constraints or as inequality constraints. As a result we obtain subspace-based least squares SVMs (SSLS-SVMs) and subspace-based linear programming SVMs (SSLP-SVMs), respectively. To speed up training of SSLS-SVMs, which in their all-at-once formulation are similar to LS-SVMs, we also propose SSLS-SVMs in a one-against-all formulation, which optimizes each similarity measure separately. Using two-class problems, we clarify the difference between SSLS-SVMs and SSLP-SVMs and evaluate the effectiveness of the proposed methods against the conventional methods with equal weights and with weights equal to the eigenvalues.
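To make the classification rule concrete, the sketch below shows a conventional subspace classifier of the kind the paper builds on: each class is represented by dictionary vectors (here, the leading eigenvectors of the class autocorrelation matrix, a common subspace-method construction assumed for illustration), and an input is assigned to the class with the maximum weighted similarity. The function names are hypothetical, not the paper's code. The two weightings shown, equal weights and eigenvalue weights, are the conventional baselines the abstract compares against; an SS-SVM would instead learn the weights by margin maximization.

    import numpy as np

    def class_dictionary(X, r):
        # Dictionary vectors for one class: the r leading eigenvectors of
        # the class autocorrelation matrix (an illustrative assumption).
        C = X.T @ X / len(X)
        eigvals, eigvecs = np.linalg.eigh(C)       # ascending eigenvalues
        order = np.argsort(eigvals)[::-1][:r]      # keep the r largest
        return eigvecs[:, order], eigvals[order]

    def weighted_similarity(x, V, w):
        # Weighted similarity of x to a class subspace: sum over j of
        # w_j * cos^2(angle between x and dictionary vector v_j).
        x = x / np.linalg.norm(x)
        return float(w @ (V.T @ x) ** 2)

    def classify(x, dictionaries, weights):
        # Assign x to the class whose similarity measure is the largest.
        scores = [weighted_similarity(x, V, w)
                  for V, w in zip(dictionaries, weights)]
        return int(np.argmax(scores))

    # Conventional baselines from the abstract, on synthetic two-class data:
    # equal weights, or weights equal to the dictionary eigenvalues.
    rng = np.random.default_rng(0)
    X0 = rng.normal(size=(100, 5)) @ np.diag([3, 1, 1, 1, 1])
    X1 = rng.normal(size=(100, 5)) @ np.diag([1, 3, 1, 1, 1])
    dicts, lams = zip(*(class_dictionary(X, r=2) for X in (X0, X1)))
    equal_w = [np.ones(2), np.ones(2)]             # equal weights
    eigen_w = list(lams)                           # eigenvalue weights
    print(classify(X0[0], dicts, equal_w), classify(X0[0], dicts, eigen_w))

In the proposed SS-SVMs, the fixed weights above would be replaced by weights obtained from the margin-maximization problem: with equality constraints on the slack variables this reduces, as for LS-SVMs, to solving a set of linear equations (SSLS-SVMs), while with inequality constraints it becomes a linear program (SSLP-SVMs).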