Fast and efficient strategies for model selection of Gaussian support vector machine

  • Authors:
  • Zongben Xu; Mingwei Dai; Deyu Meng

  • Affiliations:
  • Institute for Information and System Sciences, Faculty of Science, Xi'an Jiaotong University, Xi'an, China (all authors)

  • Venue:
  • IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics

  • Year:
  • 2009

Abstract

This paper proposes two strategies for selecting the kernel parameter (σ) and the penalty coefficient (C) of Gaussian support vector machines (SVMs). By viewing model parameter selection as a recognition problem in visual systems, a direct setting formula for the kernel parameter is derived by finding a visual scale at which the global and local structures of the given data set are preserved in the feature space while the difference between the two structures is maximized. In addition, a heuristic algorithm is proposed for selecting the penalty coefficient by identifying the classification extent of each training datum during the sequential minimal optimization (SMO) procedure, a well-established and widely used algorithm for SVM training. The suggested strategies are evaluated in a series of experiments on 13 benchmark problems and three real-world data sets, against the traditional fivefold cross-validation (5-CV) method and the recently developed radius-margin bound (RM) method. The evaluation shows that, in terms of efficiency and generalization capability, the new strategies outperform the existing methods, with uniform and stable performance.
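The abstract does not reproduce the parameter formulas themselves, but the following minimal Python sketch illustrates where a directly computed (σ, C) pair would plug into an RBF-kernel SVM, alongside the 5-CV grid search used as the baseline comparison. The median-pairwise-distance estimate for σ and the fixed C below are illustrative placeholders, not the paper's visual-scale formula or its SMO-based heuristic.

```python
# Hedged sketch: using a directly set (sigma, C) pair with an RBF-kernel SVM,
# contrasted with a 5-fold cross-validation grid search (the 5-CV baseline).
# The sigma heuristic here (median pairwise distance) and the fixed C are
# placeholders, NOT the paper's formulas.

import numpy as np
from scipy.spatial.distance import pdist
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=10, random_state=0)

# --- Direct parameter setting (illustrative stand-in for the paper's rule) ---
sigma = np.median(pdist(X))          # data-driven scale estimate (assumption)
C_direct = 1.0                       # placeholder; the paper derives C via SMO
gamma = 1.0 / (2.0 * sigma ** 2)     # sklearn RBF kernel: exp(-gamma * ||x - x'||^2)
direct_svm = SVC(kernel="rbf", gamma=gamma, C=C_direct)
direct_acc = cross_val_score(direct_svm, X, y, cv=5).mean()

# --- 5-fold cross-validation grid search baseline (5-CV in the abstract) ---
param_grid = {"gamma": np.logspace(-4, 1, 6), "C": np.logspace(-2, 3, 6)}
cv_search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
cv_search.fit(X, y)

print(f"direct-setting 5-fold accuracy: {direct_acc:.3f}")
print(f"5-CV grid search best accuracy: {cv_search.best_score_:.3f}")
```

The gap in wall-clock cost between the single direct fit and the 36-point grid search illustrates the kind of efficiency advantage the paper's strategies aim for; note that scikit-learn parameterizes the Gaussian kernel via γ = 1/(2σ²) when translating from the σ convention used in the abstract.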