A novel sequential minimal optimization algorithm for support vector regression

  • Authors:
  • Jun Guo; Norikazu Takahashi; Tetsuo Nishi

  • Affiliations:
  • Department of Computer Science and Communication Engineering, Kyushu Univ., Fukuoka, Japan; Department of Computer Science and Communication Engineering, Kyushu Univ., Fukuoka, Japan; Faculty of Science and Engineering, Waseda Univ., Tokyo, Japan

  • Venue:
  • ICONIP'06 Proceedings of the 13th International Conference on Neural Information Processing - Volume Part I
  • Year:
  • 2006

Abstract

A novel sequential minimal optimization (SMO) algorithm for support vector regression is proposed. The algorithm builds on Flake and Lawrence's SMO, in which convex optimization problems with l variables are solved instead of the standard quadratic programming problems with 2l variables, where l is the number of training samples; however, the working set selection strategy is quite different. Experimental results show that the proposed algorithm is much faster than Flake and Lawrence's SMO and comparable in speed to the fastest conventional SMO.
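
For context, a minimal sketch of the reformulation alluded to in the abstract (not taken from the paper itself): the standard ε-insensitive SVR dual is a QP in 2l variables (α_i, α_i*), and substituting β_i = α_i − α_i* (using the fact that α_i·α_i* = 0 at optimality) yields an equivalent convex problem in only l variables, which is the form Flake and Lawrence's SMO operates on. The symbols below (C, ε, K) are the usual SVR regularization parameter, tube width, and kernel, assumed here for illustration.

```latex
% Standard eps-SVR dual: a QP in 2l variables (alpha_i, alpha_i^*)
\max_{\alpha,\alpha^{*}\in[0,C]^{l}}
  -\tfrac{1}{2}\sum_{i,j}(\alpha_i-\alpha_i^{*})(\alpha_j-\alpha_j^{*})K(x_i,x_j)
  -\varepsilon\sum_i(\alpha_i+\alpha_i^{*})
  +\sum_i y_i(\alpha_i-\alpha_i^{*})
\quad\text{s.t.}\ \sum_i(\alpha_i-\alpha_i^{*})=0.

% Substituting beta_i = alpha_i - alpha_i^* (since alpha_i * alpha_i^* = 0 at the
% optimum, alpha_i + alpha_i^* = |beta_i|) gives a convex problem in l variables,
% no longer a standard QP because of the |beta_i| term:
\max_{\beta\in[-C,C]^{l}}
  -\tfrac{1}{2}\sum_{i,j}\beta_i\beta_j K(x_i,x_j)
  -\varepsilon\sum_i|\beta_i|
  +\sum_i y_i\beta_i
\quad\text{s.t.}\ \sum_i\beta_i=0.
```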