Efficient performance estimate for one-class support vector machine

  • Authors:
  • Quang-Anh Tran; Xing Li; Haixin Duan

  • Affiliations:
  • Network Research Center, Tsinghua University, Beijing 100084, China (all authors)

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2005

Abstract

This letter proposes and analyzes a method, the ξαρ-estimate, for estimating the generalization performance of the one-class support vector machine (SVM) for novelty detection. The method is an extended version of the ξα-estimate, which estimates the generalization performance of the standard SVM for classification, and is derived by analyzing the connection between the one-class SVM and the standard SVM. Because it requires no computation-intensive re-sampling and can be computed immediately from the decision function of the one-class SVM, the method is computationally much more efficient than the leave-one-out method. Estimating the error rate with our method is also more precise than using the fraction of support vectors or the parameter ν of the one-class SVM. We further propose that the fraction of support vectors characterizes the precision of the one-class SVM. A theoretical analysis and experiments on an artificial dataset and the widely known MNIST handwritten digit recognition dataset show that our method can effectively estimate the generalization performance of the one-class SVM for novelty detection.
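
A minimal sketch of the efficiency argument, assuming scikit-learn's OneClassSVM with an RBF kernel and illustrative hyperparameters (nu, gamma) on synthetic Gaussian data: it extracts the per-model quantities the estimate is named after (the slacks ξ_i, the dual coefficients α_i, and the offset ρ of the decision function), along with the fraction of support vectors and ν used as baseline estimates, and contrasts their one-fit cost with a brute-force leave-one-out rejection rate that needs n re-trainings. The exact ξαρ combination is given in the paper and is not reproduced here.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))                 # artificial "normal" training data

nu, gamma = 0.1, 0.5                          # illustrative hyperparameters
clf = OneClassSVM(kernel="rbf", gamma=gamma, nu=nu).fit(X)

# Quantities available from the single trained model, with no re-training.
n = len(X)
frac_sv = len(clf.support_) / n               # fraction of support vectors
rho = float(clf.offset_[0])                   # offset rho of the decision function
f = clf.decision_function(X)                  # f(x_i) = w.phi(x_i) - rho
xi = np.maximum(0.0, -f)                      # slacks xi_i = max(0, rho - w.phi(x_i))
alpha = np.zeros(n)
alpha[clf.support_] = clf.dual_coef_.ravel()  # dual coefficients alpha_i

# Costly reference: leave-one-out rejection rate (n separate re-trainings).
loo_err = 0
for i in range(n):
    mask = np.ones(n, dtype=bool)
    mask[i] = False
    held_out = OneClassSVM(kernel="rbf", gamma=gamma, nu=nu).fit(X[mask])
    loo_err += int(held_out.predict(X[i:i + 1])[0] == -1)
loo_err /= n

print(f"fraction of SVs     : {frac_sv:.3f}")   # baseline estimate
print(f"nu                  : {nu:.3f}")        # baseline estimate
print(f"offset rho          : {rho:.3f}")
print(f"mean training slack : {xi.mean():.3f}")
print(f"leave-one-out error : {loo_err:.3f}")   # expensive reference
```

Even at this toy scale the contrast is the point: the slacks, dual coefficients, and offset come for free from the one fitted model, while the leave-one-out reference re-solves the one-class SVM n times.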