Sparse support vector regressors based on forward basis selection

  • Authors:
  • Shigenori Muraoka; Shigeo Abe

  • Affiliations:
  • Graduate School of Engineering, Kobe University, Kobe, Japan (both authors)

  • Venue:
  • IJCNN'09: Proceedings of the 2009 International Joint Conference on Neural Networks
  • Year:
  • 2009

Abstract

Support vector regressors (SVRs) usually give sparse solutions, but as a regression problem becomes more difficult, the number of support vectors increases and sparsity is lost. To address this problem, in this paper we propose sparse support vector regressors (S-SVRs) trained in a reduced empirical feature space. First, by forward selection we choose the training samples that minimize the regression error estimated by kernel least squares. Then, in the reduced empirical feature space spanned by the selected, mapped training samples, we train the SVR in the dual form. Since the mapped support vectors obtained by training the S-SVR are expressed as linear combinations of the selected, mapped training samples, the support vectors, in the sense that they form the solution, are the selected training samples. By computer simulation, we compare the performance of the proposed method with that of the regular SVR and that of a sparse SVR based on Cholesky factorization.
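The forward-selection step described in the abstract can be sketched as a greedy loop: at each iteration, try adding each remaining training sample as a basis candidate, fit a (regularized) kernel least-squares model on the candidate basis columns, and keep the sample that most reduces the squared regression error. The sketch below is an illustrative reconstruction under these assumptions, not the paper's exact algorithm; the RBF kernel, the `gamma` and `ridge` hyperparameters, and the stopping rule (a fixed basis size) are all choices made here for the example.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Pairwise RBF kernel matrix between rows of X and rows of Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def forward_basis_selection(X, y, n_basis=5, gamma=1.0, ridge=1e-8):
    # Greedily pick training samples whose kernel columns minimize the
    # kernel least-squares fit error (an illustrative sketch of the
    # forward-selection step; hyperparameters are assumptions).
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)          # full kernel matrix
    selected, remaining = [], list(range(n))
    for _ in range(n_basis):
        best_err, best_j = np.inf, None
        for j in remaining:
            cols = selected + [j]
            A = K[:, cols]               # candidate basis columns
            # Ridge-regularized least squares:
            #   min_w ||A w - y||^2 + ridge * ||w||^2
            w = np.linalg.solve(A.T @ A + ridge * np.eye(len(cols)), A.T @ y)
            err = np.linalg.norm(A @ w - y) ** 2
            if err < best_err:
                best_err, best_j = err, j
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

# Toy usage: select a small basis for a noisy sine regression problem.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(40)
basis = forward_basis_selection(X, y, n_basis=6, gamma=0.5)
print("selected basis indices:", basis)
```

After selection, the paper's method would train the SVR in the dual form restricted to the subspace spanned by these selected, mapped samples, so the final solution is expressed only through them.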