Fast pruning superfluous support vectors in SVMs

  • Authors:
  • Xun Liang, Yuefeng Ma, Yangbo He, Li Yu, Rong-Chang Chen, Tao Liu, Xiaoping Yang, Tung-Shou Chen

  • Affiliations:
  • Xun Liang: School of Information, Renmin University of China, Beijing 100872, China; Institute of Computer Science and Technology, Peking University, Beijing 100871, China
  • Yuefeng Ma: School of Information, Renmin University of China, Beijing 100872, China
  • Yangbo He: Institute of Computer Science and Technology, Peking University, Beijing 100871, China
  • Li Yu: School of Information, Renmin University of China, Beijing 100872, China
  • Rong-Chang Chen: Department of Management, National Taichung University of Science and Technology
  • Tao Liu: School of Information, Renmin University of China, Beijing 100872, China
  • Xiaoping Yang: School of Information, Renmin University of China, Beijing 100872, China
  • Tung-Shou Chen: Department of Management, National Taichung University of Science and Technology

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2013

Abstract

Trained support vector machines (SVMs) often contain many support vectors (SVs), which slows the evaluation of their decision functions. To simplify the decision functions and make SVMs more succinct, prior work has sought to remove SVs from trained SVMs. By carefully designing some pruning coefficients and solving for the rest, this paper presents a simple method for rapidly removing superfluous SVs. The method lets users remove those SVs in a single iteration, significantly improving on the speed of existing pruning methods, which remove SVs one by one. The existence and uniqueness of the fast pruning coefficients are proved, and the connection between the primal and dual optimizations is illustrated geometrically. The fast pruning method also applies to other kernel-based machines without modification. The computational complexity is analyzed. Illustrative examples are given first, and experiments on larger data sets demonstrate the effectiveness of the fast simplification method.
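To convey the flavor of one-shot SV pruning, the sketch below shows a simple reduced-set-style variant, not the paper's exact algorithm: the SVs to be discarded are dropped all at once, and the coefficients of the remaining SVs are re-solved in a single least-squares step so that the reduced decision function matches the original one on the SV set. The function names (`rbf_kernel`, `prune_svs`) and the choice of an RBF kernel are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise squared distances, then the Gaussian (RBF) kernel matrix.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * d2)

def prune_svs(sv, coef, keep_mask, gamma=1.0):
    """One-shot pruning sketch (an assumption, not the paper's method):
    drop every SV outside keep_mask and re-solve the remaining
    coefficients by least squares so the reduced decision function
    approximates the original one at the original SV locations."""
    K_full = rbf_kernel(sv, sv, gamma)     # (n, n) kernel matrix on all SVs
    f_orig = K_full @ coef                 # original decision values at the SVs
    K_keep = K_full[:, keep_mask]          # columns for the SVs we keep
    new_coef, *_ = np.linalg.lstsq(K_keep, f_orig, rcond=None)
    return sv[keep_mask], new_coef
```

A typical use would keep the SVs with the largest |coefficient| and discard the rest in one call, rather than iterating one removal at a time; the single `lstsq` solve is what makes the pruning a single-iteration operation in this sketch.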