An Efficient Support Vector Machine Learning Method with Second-Order Cone Programming for Large-Scale Problems

  • Authors:
  • Rameswar Debnath; Masakazu Muramatsu; Haruhisa Takahashi

  • Affiliations:
  • Department of Information and Communication Engineering, The University of Electro-Communications, Tokyo, Japan 182-8585; Department of Computer Science, The University of Electro-Communications, Tokyo, Japan 182-8585; Department of Information and Communication Engineering, The University of Electro-Communications, Tokyo, Japan 182-8585

  • Venue:
  • Applied Intelligence
  • Year:
  • 2005

Abstract

In this paper we propose a new fast learning algorithm for the support vector machine (SVM). The proposed method is based on second-order cone programming: we reformulate the SVM's quadratic programming problem as a second-order cone programming problem. The method requires a decomposition of the kernel matrix of the SVM optimization problem, and the decomposed matrix is used in the new optimization problem. Since the kernel matrix is positive semidefinite, the dimension of the decomposed matrix can be reduced by decomposition (factorization) methods, and the performance of the proposed method depends on this dimension. Experimental results show that the proposed method is much faster than the quadratic programming solver LOQO when the dimension of the decomposed matrix is small enough compared to that of the kernel matrix. The proposed method is also faster than the method of Fine and Scheinberg (2001) for both low-rank and full-rank kernel matrices. Working set selection is an important issue in the SVM decomposition (chunking) method; we also modify Hsu and Lin's working set selection approach to handle large working sets. The modified approach leads to faster convergence.
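
The abstract's key ingredients can be pictured concretely: because the kernel matrix K is positive semidefinite, it admits a (possibly low-rank) factorization K ≈ LLᵀ, and the quadratic term of the SVM dual then becomes the squared norm of an r-dimensional vector, which a second-order (rotated) cone constraint can bound. The sketch below is not the paper's algorithm; it only illustrates the factorization step with a greedy pivoted (incomplete) Cholesky in Python/NumPy. The data, kernel width, rank cap, and function name are all illustrative assumptions.

```python
import numpy as np

def pivoted_cholesky(K, max_rank, tol=1e-10):
    """Greedy pivoted (incomplete) Cholesky of a PSD matrix K.

    Returns an (n x r) factor L with K ~= L @ L.T and r <= max_rank.
    Illustrative stand-in for the low-rank factorization step described
    in the abstract, not the authors' implementation.
    """
    n = K.shape[0]
    d = np.diag(K).astype(float).copy()      # residual diagonal of K - L L^T
    L = np.zeros((n, max_rank))
    for j in range(max_rank):
        i = int(np.argmax(d))                # pivot: largest residual diagonal
        if d[i] <= tol:                      # residual negligible: stop early
            return L[:, :j]
        L[:, j] = (K[:, i] - L @ L[i, :]) / np.sqrt(d[i])
        d -= L[:, j] ** 2
    return L

# Toy demonstration (data, kernel width, and rank cap are arbitrary).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
y = np.where(rng.standard_normal(200) > 0, 1.0, -1.0)
sq_dist = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-0.1 * sq_dist)                   # RBF kernel matrix (PSD)

L = pivoted_cholesky(K, max_rank=60)
print("factor rank:", L.shape[1],
      "| approximation error:", np.linalg.norm(K - L @ L.T))

# With Q = diag(y) K diag(y) ~= (diag(y) L)(diag(y) L)^T, the quadratic term
# a^T Q a of the SVM dual equals ||(diag(y) L)^T a||^2, the squared norm of
# an r-dimensional vector.  Bounding that norm is exactly what a rotated
# second-order cone constraint can do, which enables the SOCP reformulation.
a = rng.random(200)
lhs = a @ ((y[:, None] * K) * y[None, :]) @ a
rhs = np.linalg.norm((y[:, None] * L).T @ a) ** 2
print("a^T Q a:", lhs, "  ||(Dy L)^T a||^2:", rhs)  # agree up to factorization error
```

The smaller the rank r at which the factorization is accurate, the smaller the cone variable in the reformulated problem, which is consistent with the abstract's observation that the method's speed depends on the dimension of the decomposed matrix.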