Efficient reduction of support vectors in kernel-based methods

  • Authors:
  • Takumi Kobayashi; Nobuyuki Otsu

  • Affiliations:
  • National Institute of Advanced Industrial Science and Technology, Tsukuba, Japan; National Institute of Advanced Industrial Science and Technology, Tsukuba, Japan

  • Venue:
  • ICIP'09: Proceedings of the 16th IEEE International Conference on Image Processing
  • Year:
  • 2009

Abstract

Kernel-based methods, e.g., the support vector machine (SVM), achieve high classification performance. However, classification becomes time-consuming as the number of vectors supporting the classifier grows. In this paper, we propose a method that reduces the computational cost of classification with kernel-based methods while retaining their high performance. Using linear algebra on the kernel Gram matrix of the support vectors (SVs), at low computational cost, the method efficiently prunes redundant SVs that are unnecessary for constructing the classifier. The pruning is guided by evaluating the performance of the SVM classifier formed from the reduced set of SVs. Classification experiments with SVM on various datasets demonstrate the feasibility of the evaluation criterion and the effectiveness of the proposed method.
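
The abstract only outlines the approach, so the following is a minimal sketch of the general idea of Gram-matrix-based SV reduction, not the authors' exact algorithm: it selects a (nearly) linearly independent subset of SVs via a pivoted incomplete Cholesky factorization of the kernel Gram matrix and refits the dual coefficients of the kept SVs by a least-squares projection. The paper's own criterion additionally evaluates the performance of the SVM classifier formed by the reduced SVs, which this sketch does not implement. The function name reduce_support_vectors, the RBF kernel, and the parameters gamma and tol are illustrative assumptions.

    # Illustrative sketch only (not the authors' algorithm): prune SVs that are nearly
    # linearly dependent in the kernel feature space, then refit the coefficients of the
    # kept SVs so the reduced decision function approximates the original one.
    import numpy as np
    from sklearn.metrics.pairwise import rbf_kernel

    def reduce_support_vectors(sv, alpha, gamma=0.5, tol=1e-3):
        """sv: (n, d) support vectors; alpha: (n,) signed dual coefficients (y_i * alpha_i)."""
        K = rbf_kernel(sv, sv, gamma=gamma)    # kernel Gram matrix of the SVs (assumed RBF)
        n = K.shape[0]
        diag = K.diagonal().copy()             # residual diagonal used for pivot selection
        R = np.zeros((n, n))                   # rows of the incomplete Cholesky factor
        keep = []
        for i in range(n):
            j = int(np.argmax(diag))           # pivot: SV with largest residual in feature space
            if diag[j] < tol:                  # remaining SVs are (almost) linearly dependent
                break
            keep.append(j)
            R[i] = (K[j] - R[:i, j] @ R[:i]) / np.sqrt(diag[j])
            diag = np.maximum(diag - R[i] ** 2, 0.0)
        keep = np.asarray(keep)
        # Refit: beta = K_zz^{-1} K_zx alpha projects the original decision function
        # onto the span of the kept SVs (small ridge term added for numerical stability).
        K_zz = K[np.ix_(keep, keep)]
        K_zx = K[keep, :]
        beta = np.linalg.solve(K_zz + 1e-8 * np.eye(len(keep)), K_zx @ alpha)
        return sv[keep], beta                  # reduced SVs and their new coefficients

With an RBF kernel the Gram diagonal is 1, so tol directly bounds the squared feature-space residual of any discarded SV; the reduced classifier then uses only the kept SVs, which is where the computational saving at test time comes from.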