Fast Support Vector Machine Classification using linear SVMs

  • Authors:
  • Karina Zapien Arreola; Janis Fehr; Hans Burkhardt

  • Affiliations:
  • INSA de Rouen LITIS, France; University of Freiburg, Germany; University of Freiburg, Germany

  • Venue:
  • ICPR '06 Proceedings of the 18th International Conference on Pattern Recognition - Volume 03
  • Year:
  • 2006

Abstract

We propose a classification method based on a decision tree whose nodes consist of linear Support Vector Machines (SVMs). Each node defines a decision hyperplane that classifies part of the feature space. For large classification problems (with many Support Vectors (SVs)), this has the advantage that the classification time does not depend on the number of SVs: the classification of a new sample is computed as a dot product with the vector orthogonal to each hyperplane. The number of nodes in the tree has been shown to be much smaller than the number of SVs in a non-linear SVM, so a significant speedup in classification time can be achieved. For non-linearly separable problems, the trivial solution (the zero vector) of a linear SVM is analyzed and a new formulation of the optimization problem is given to avoid it.
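The following is a minimal sketch of the classification step the abstract describes, assuming a binary tree in which each node stores a hyperplane normal vector w and bias b; the node structure, field names, and traversal rule (left for negative margin, right for non-negative) are illustrative assumptions, not details taken from the paper. It shows why the cost per sample is one dot product per visited node, i.e. proportional to tree depth rather than to the number of SVs.

```python
import numpy as np

class LinearSVMNode:
    """One tree node: a linear decision hyperplane w.x + b = 0.

    Leaves carry a class label instead of children. Hypothetical
    structure for illustration only.
    """
    def __init__(self, w=None, b=0.0, left=None, right=None, label=None):
        self.w = None if w is None else np.asarray(w, dtype=float)
        self.b = b          # bias term of the hyperplane
        self.left = left    # subtree for w.x + b < 0
        self.right = right  # subtree for w.x + b >= 0
        self.label = label  # class label if this node is a leaf

def classify(node, x):
    """Classify sample x by walking the tree.

    Each step costs a single dot product with the node's normal
    vector, so the total cost depends on tree depth only.
    """
    x = np.asarray(x, dtype=float)
    while node.label is None:
        node = node.right if np.dot(node.w, x) + node.b >= 0 else node.left
    return node.label

# Toy usage: a depth-2 tree separating three regions of the plane.
tree = LinearSVMNode(
    w=[1.0, 0.0], b=0.0,
    left=LinearSVMNode(label=-1),
    right=LinearSVMNode(w=[0.0, 1.0], b=-1.0,
                        left=LinearSVMNode(label=-1),
                        right=LinearSVMNode(label=+1)),
)
print(classify(tree, [2.0, 3.0]))   # +1
print(classify(tree, [-1.0, 0.5]))  # -1
```

In contrast, a kernel SVM must evaluate one kernel function per support vector for every test sample, which is what the tree of linear nodes avoids.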