Convergence improvement of active set training for support vector regressors

  • Authors:
  • Shigeo Abe; Ryousuke Yabuwaki

  • Affiliations:
  • Graduate School of Engineering, Kobe University, Kobe, Japan (both authors)

  • Venue:
  • ICANN'10: Proceedings of the 20th International Conference on Artificial Neural Networks, Part II
  • Year:
  • 2010

Abstract

In our previous work, we discussed a method for training a support vector regressor (SVR) by active set training based on Newton's method. In this paper, we discuss improving convergence by modifying that training method. To stabilize convergence for a large epsilon tube, we calculate the bias term according to the signs of the previous variables rather than the updated variables. To speed up the inverse-matrix calculation by Cholesky factorization during the iteration steps, we keep the factorized matrix computed at the first iteration step; at subsequent steps, we restart the Cholesky factorization at the point where the variable in the working set is replaced. Computer experiments show that the proposed method stabilizes convergence for a large epsilon tube and that the incremental Cholesky factorization speeds up training.
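The incremental Cholesky factorization exploits the fact that the first k rows of the lower-triangular factor L depend only on the leading k-by-k block of the matrix: when only the k-th working-set variable is replaced, only row and column k of the kernel matrix change, so rows 0..k-1 of L can be reused and the factorization restarted from row k. The sketch below illustrates this idea under assumed names (`cholesky_from_row` is a hypothetical helper, not the paper's implementation; the paper's exact update scheme may differ):

```python
import numpy as np

def cholesky_from_row(A, L, k):
    """Recompute the lower-triangular Cholesky factor of A starting
    from row k, reusing rows 0..k-1 of the previous factor L
    (hypothetical helper illustrating the restarted factorization)."""
    n = A.shape[0]
    L = L.copy()
    L[k:, :] = 0.0
    for i in range(k, n):           # rows above k are untouched
        for j in range(i + 1):
            s = A[i, j] - L[i, :j] @ L[j, :j]
            L[i, j] = np.sqrt(s) if i == j else s / L[j, j]
    return L

# Full factorization once, at the first iteration step.
rng = np.random.default_rng(0)
X = rng.standard_normal((6, 4))
A = X @ X.T + np.eye(6)             # SPD Gram-style matrix (stand-in kernel)
L = np.linalg.cholesky(A)

# Replace the k-th working-set sample: only row/column k of A changes,
# so the factorization is restarted at row k instead of from scratch.
k = 3
X[k] = rng.standard_normal(4)
A = X @ X.T + np.eye(6)
L_new = cholesky_from_row(A, L, k)
assert np.allclose(L_new @ L_new.T, A)
```

Restarting at row k costs roughly the work of factorizing the trailing (n - k)-by-(n - k) block plus the reused inner products, which is why keeping the factorized matrix across iterations pays off when only late working-set entries change.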