Subspace based least squares support vector machines for pattern classification

  • Authors:
  • Takuya Kitamura;Shigeo Abe;Kazuhiro Fukui

  • Affiliations:
  • Graduate School of Engineering, Kobe University, Kobe, Japan;Graduate School of Engineering, Kobe University, Kobe, Japan;Graduate School of Systems and Information Engineering, University of Tsukuba, Tsukuba, Japan

  • Venue:
  • IJCNN'09 Proceedings of the 2009 international joint conference on Neural Networks
  • Year:
  • 2009

Abstract

In this paper, we discuss subspace based least squares support vector machines (SSLS-SVMs), in which an input vector is classified into the class with the maximum similarity. Namely, we define the similarity measure for each class by the weighted sum of vectors called dictionaries, and optimize the weights so that the margin between classes is maximized. Because the similarity measure is defined for each class, the similarity measure associated with a data sample needs to be the largest among all the similarity measures. Introducing slack variables, we express these constraints as equality constraints. The proposed SSLS-SVMs are then similar to LS-SVMs in the all-at-once formulation. Because the all-at-once formulation is inefficient, we also propose SSLS-SVMs in the one-against-all formulation. We demonstrate the effectiveness of the proposed methods by comparison with the conventional method on two-class problems.
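The abstract builds on the LS-SVM framework and its one-against-all formulation. As background, the following is a minimal sketch of a standard kernel LS-SVM trained by solving its KKT linear system, wrapped in a one-against-all scheme that assigns each test point to the class with the largest classifier output. This illustrates only the conventional LS-SVM machinery the paper extends, not the authors' subspace/dictionary-based similarity measure; all function names, the RBF kernel choice, and the hyperparameter values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between row-vector sets A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvm_train(X, y, C=10.0, gamma=1.0):
    # LS-SVM replaces the SVM's inequality constraints with equality
    # constraints, so training reduces to one linear system (Suykens form):
    #   [ 0    1^T       ] [b]     [0]
    #   [ 1    K + I/C   ] [alpha] [y]
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    rhs = np.concatenate(([0.0], y.astype(float)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # alpha, bias

def lssvm_decision(X_train, alpha, b, X_test, gamma=1.0):
    # decision value f(x) = sum_i alpha_i k(x_i, x) + b
    return rbf_kernel(X_test, X_train, gamma) @ alpha + b

def ova_fit_predict(X, labels, X_test, C=10.0, gamma=1.0):
    # one-against-all: train one LS-SVM per class (+1 for the class,
    # -1 for the rest) and pick the class with the largest output
    classes = np.unique(labels)
    scores = []
    for c in classes:
        y = np.where(labels == c, 1.0, -1.0)
        alpha, b = lssvm_train(X, y, C, gamma)
        scores.append(lssvm_decision(X, alpha, b, X_test, gamma))
    return classes[np.argmax(np.stack(scores), axis=0)]
```

The key contrast with the paper's method: here each one-against-all classifier outputs a kernel expansion score, whereas SSLS-SVMs replace that score with a per-class similarity measure built from weighted dictionary vectors.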