Least squares one-class support vector machine

  • Authors:
  • Young-Sik Choi

  • Affiliations:
  • Department of Computer Engineering at Korea Aerospace University, Goyang City, Gyeonggi Province 412-791, Republic of Korea

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2009

Abstract

In this paper, we reformulate the standard one-class SVM (support vector machine) and derive a least squares version of the method, which we call the LS (least squares) one-class SVM. The LS one-class SVM extracts a hyperplane as an optimal description of the training objects in a regularized least squares sense. One can use the distance to this hyperplane as a proximity measure and thereby determine which objects resemble the training objects better than others. This differs from the standard one-class SVM, which only detects whether an object resembles the training objects. We demonstrate the performance of the LS one-class SVM on relevance ranking with positive examples, and compare it with traditional methods including the standard one-class SVM. The experimental results indicate the efficacy of the LS one-class SVM.
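
The abstract only sketches the idea, so the following is a minimal illustrative sketch of one plausible least squares one-class SVM, assuming the usual LS-SVM-style derivation (equality constraints with squared slacks leading to a single linear system); the paper's exact formulation may differ. The RBF kernel, its width `gamma`, the regularization constant `C`, the function names, and the toy data are assumptions introduced here for illustration, not taken from the paper.

```python
# Illustrative sketch of a least squares one-class SVM (assumed LS-SVM-style
# derivation; not necessarily the exact system used in the paper).
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_ls_ocsvm(X, C=10.0, gamma=1.0):
    """Solve the KKT linear system for the dual weights alpha and offset rho.

    Under the assumed formulation, stationarity gives
        (K + I/C) alpha - rho * 1 = 0,   1^T alpha = 1,
    i.e. a single (n+1) x (n+1) linear system.
    """
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = K + np.eye(n) / C
    A[:n, n] = -1.0            # column multiplying -rho
    A[n, :n] = 1.0             # row enforcing sum(alpha) = 1
    b = np.zeros(n + 1)
    b[n] = 1.0
    sol = np.linalg.solve(A, b)
    return sol[:n], sol[n]     # alpha, rho

def proximity(X_train, alpha, rho, X_test, gamma=1.0):
    """Signed distance-like score to the fitted hyperplane.

    Larger scores mean an object lies further on the training-data side,
    so sorting by this score yields a relevance ranking.
    """
    K_test = rbf_kernel(X_test, X_train, gamma)
    return K_test @ alpha - rho

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_pos = rng.normal(0.0, 1.0, size=(50, 2))    # positive (training) examples
    X_query = rng.normal(0.0, 3.0, size=(10, 2))  # objects to rank
    alpha, rho = fit_ls_ocsvm(X_pos, C=10.0, gamma=0.5)
    scores = proximity(X_pos, alpha, rho, X_query, gamma=0.5)
    ranking = np.argsort(-scores)                 # most training-like first
    print(ranking, scores[ranking])
```

In this sketch the distance to the hyperplane is used only up to the constant factor 1/||w||, which does not affect the induced ranking of query objects.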