Adaptive pruning algorithm for least squares support vector machine classifier

  • Authors:
  • Xiaowei Yang, Jie Lu, Guangquan Zhang

  • Affiliations:
  • South China Univ. of Technol., Sch. of Math. Sci., 510641, Guangzhou, China; Univ. of Technol., Sydney, Fac. of Info. Technol., NSW, Australia; Jilin Univ., Key Lab. of Symbolic Comp. and Kn ...
  • University of Technology, Sydney, Faculty of Information Technology, PO Box 123, Broadway, NSW 2007, Australia
  • University of Technology, Sydney, Faculty of Information Technology, PO Box 123, Broadway, NSW 2007, Australia

  • Venue:
  • Soft Computing - A Fusion of Foundations, Methodologies and Applications
  • Year:
  • 2010

Abstract

As a variant of the support vector machine (SVM), the least squares SVM (LS-SVM) uses equality constraints in place of inequality constraints and works with a least squares cost function. A well-known drawback in LS-SVM applications is that sparseness is lost. In this paper, we develop an adaptive pruning algorithm based on a bottom-to-top strategy that addresses this drawback. In the proposed algorithm, incremental and decremental learning procedures are applied alternately, so that a small support vector set covering most of the information in the training set is formed adaptively; the final classifier is constructed from this set. In general, the number of elements in the support vector set is much smaller than the size of the training set, so a sparse solution is obtained. To test the efficiency of the proposed algorithm, we apply it to eight UCI datasets and one benchmark dataset. The experimental results show that the algorithm adaptively obtains sparse solutions while losing only a little generalization performance, on classification problems both with and without noise, and that its training is much faster than the sequential minimal optimization (SMO) algorithm on large-scale noise-free classification problems.
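
The abstract only outlines the alternating incremental/decremental procedure, so below is a minimal self-contained sketch of that idea in Python. It is not the authors' exact algorithm: the KKT linear system is the standard LS-SVM one, but the RBF kernel choice, the initial candidate-set size `n0`, the growth step `grow`, the pruning threshold `eps`, and the stopping rule are all illustrative assumptions.

```python
# Sketch of a bottom-to-top pruning loop for an LS-SVM classifier (assumed
# RBF kernel). Hyperparameters and the stopping rule are illustrative, not
# the paper's exact choices.
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian RBF kernel matrix between the rows of A and the rows of B."""
    d2 = (np.sum(A**2, axis=1)[:, None] - 2.0 * A @ B.T
          + np.sum(B**2, axis=1)[None, :])
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Solve the standard LS-SVM KKT system
        [ 0   y^T             ] [ b     ]   [ 0 ]
        [ y   Omega + I/gamma ] [ alpha ] = [ 1 ],
    where Omega_ij = y_i * y_j * K(x_i, x_j)."""
    n = len(y)
    Omega = (y[:, None] * y[None, :]) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], np.ones(n))))
    return sol[1:], sol[0]                     # alpha, bias b

def lssvm_decision(Xsv, ysv, alpha, b, X, sigma=1.0):
    """Decision values f(x) = sum_i alpha_i y_i K(x_i, x) + b."""
    return rbf_kernel(X, Xsv, sigma) @ (alpha * ysv) + b

def adaptive_prune_fit(X, y, n0=10, grow=5, eps=1e-3, max_iter=50,
                       gamma=10.0, sigma=1.0):
    """Alternate decremental pruning and incremental growth of the SV set."""
    rng = np.random.default_rng(0)
    idx = list(rng.choice(len(y), size=n0, replace=False))
    for _ in range(max_iter):
        alpha, b = lssvm_fit(X[idx], y[idx], gamma, sigma)
        # Decremental step: drop support vectors with negligible |alpha|.
        keep = np.abs(alpha) > eps * np.abs(alpha).max()
        idx = [i for i, k in zip(idx, keep) if k]
        alpha, b = lssvm_fit(X[idx], y[idx], gamma, sigma)
        # Incremental step: add the worst-classified points not yet included.
        margin = y * lssvm_decision(X[idx], y[idx], alpha, b, X, sigma)
        outside = [i for i in np.argsort(margin) if i not in idx]
        if not outside or margin[outside[0]] >= 1.0:
            break                              # remaining points are covered
        idx.extend(outside[:grow])
    else:
        alpha, b = lssvm_fit(X[idx], y[idx], gamma, sigma)
    return np.array(idx), alpha, b
```

Refitting the full KKT system at every step keeps the sketch short; a practical implementation would update the solution as points enter and leave, which is precisely the role of the incremental and decremental learning procedures in the paper.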