Multi-class Support Vector Machine Simplification

  • Authors:
  • Ducdung Nguyen, Kazunori Matsumoto, Kazuo Hashimoto, Yasuhiro Takishima, Daichi Takatori, Masahiro Terabe

  • Affiliations:
  • KDDI R&D Laboratories Inc., Fujimino-city, Japan 356-8502 (Nguyen, Matsumoto, Takishima); Graduate School of Information Sciences, Tohoku University, Sendai-city, Japan 980-8579 (Hashimoto, Takatori, Terabe)

  • Venue:
  • PRICAI '08: Proceedings of the 10th Pacific Rim International Conference on Artificial Intelligence: Trends in Artificial Intelligence
  • Year:
  • 2008

Abstract

In support vector learning, the computational complexity of the testing phase scales linearly with the number of support vectors (SVs) included in the solution, the support vector machine (SVM). Among the different approaches, reduced set methods speed up the testing phase by replacing the original SVM with a simplified one consisting of a smaller number of SVs, called reduced vectors (RVs). In this paper we introduce an extension of the bottom-up method for binary-class SVMs to multi-class SVMs. The extension includes: a calculation for optimally combining two multi-weighted SVs, a selection heuristic for choosing a good pair of SVs to replace with a newly created vector, and an algorithm for reducing the number of SVs included in an SVM classifier. We show that our method possesses key advantages over others in terms of applicability, efficiency, and stability. In constructing RVs, it requires finding only a single maximum point of a one-variable function. Experimental results on public datasets show that the simplified SVMs can run up to 100 times faster than the original SVMs with almost no loss in predictive accuracy.
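The abstract's central computational claim is that merging two multi-weighted SVs reduces to maximizing a one-variable function. Below is a minimal Python sketch of that idea, assuming a Gaussian kernel and assuming the merged vector is parameterized on the line segment between the two SVs; the names (`merge_pair`) and the exact objective are illustrative stand-ins, not the paper's published formulas.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def merge_pair(x1, x2, a1, a2, gamma):
    """Merge two multi-weighted SVs (x1, a1) and (x2, a2) into one
    reduced vector z with weight vector b.

    a1, a2 are per-class weight vectors (a multi-class SVM attaches one
    coefficient per binary sub-problem to each SV). The merged vector is
    restricted to z(u) = u*x1 + (1-u)*x2, so the search for the best z
    is a one-variable maximization over u.
    """
    d2 = float(np.sum((x1 - x2) ** 2))  # squared distance ||x1 - x2||^2

    def neg_objective(u):
        # Gaussian kernel values between z(u) and the two original SVs;
        # K(z, z) = 1, so only these two values enter the objective.
        k1 = np.exp(-gamma * (1.0 - u) ** 2 * d2)  # K(z(u), x1)
        k2 = np.exp(-gamma * u ** 2 * d2)          # K(z(u), x2)
        # Squared norm of the projection of a1*Phi(x1) + a2*Phi(x2)
        # onto Phi(z(u)), summed over all class weights (an assumed
        # proxy for the paper's one-variable objective).
        return -float(np.sum((a1 * k1 + a2 * k2) ** 2))

    u = minimize_scalar(neg_objective, bounds=(0.0, 1.0), method="bounded").x
    z = u * x1 + (1.0 - u) * x2

    # Weight vector of the merged SV: per-class projection coefficients.
    k1 = np.exp(-gamma * (1.0 - u) ** 2 * d2)
    k2 = np.exp(-gamma * u ** 2 * d2)
    b = a1 * k1 + a2 * k2
    return z, b

# Toy usage: merge two nearby 2-D SVs carrying 3-class weight vectors.
x1, x2 = np.array([0.0, 0.0]), np.array([0.4, 0.2])
a1, a2 = np.array([1.0, -0.5, 0.2]), np.array([0.8, -0.3, 0.1])
z, b = merge_pair(x1, x2, a1, a2, gamma=0.5)
print(z, b)
```

A full simplification loop, per the abstract, would repeatedly apply a selection heuristic to pick a good pair of SVs, merge them as above, and stop once the SVM has been reduced to the desired number of RVs.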