Letters: Direct simplification for kernel regression machines

  • Authors:
  • Wenwu He;Zhizhong Wang

  • Affiliations:
  • Department of Mathematics and Physics, Fujian University of Technology, Fuzhou, Fujian 350108, China;School of Mathematical Science and Computing Technology, Central South University, Changsha 410075, China

  • Venue:
  • Neurocomputing
  • Year:
  • 2008

Abstract

Kernel machines have been widely used in learning, but standard algorithms are often time-consuming. To this end, we propose a new method, direct simplification (DS), for imposing sparsity on kernel regression machines. Unlike existing sparse methods, DS performs approximation and optimization in a unified framework by incrementally finding a set of basis functions that directly minimizes the primal risk function. The main advantage of our method lies in its ability to form very good approximations of kernel regression machines with clear control over the computational complexity and the training time. Experiments on two real time series and two benchmarks confirm the feasibility of our method and show that DS obtains better performance with fewer bases than a two-step sparse method.
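The abstract describes DS as incrementally selecting basis functions that directly minimize the primal (regularized) risk. The sketch below illustrates that general idea with a greedy forward selection for Gaussian-kernel ridge regression; it is an assumption-laden illustration of the selection principle, not the authors' exact algorithm, and the function names, the kernel choice, and all parameter values are hypothetical.

```python
import numpy as np

def rbf(X, centers, gamma=1.0):
    # Gaussian kernel matrix between samples X (n x d) and centers (m x d)
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def greedy_basis_selection(X, y, n_basis=8, lam=1e-3, gamma=1.0):
    """Sketch of direct-simplification-style selection: at each step,
    add the candidate center whose inclusion yields the smallest
    regularized primal risk  ||Phi a - y||^2 + lam * a' K a."""
    n = len(X)
    selected, remaining = [], list(range(n))
    best_alpha = None
    for _ in range(n_basis):
        best_i, best_risk = None, np.inf
        for i in remaining:
            idx = selected + [i]
            Phi = rbf(X, X[idx], gamma)        # n x m design matrix
            K = rbf(X[idx], X[idx], gamma)     # m x m regularizer
            A = Phi.T @ Phi + lam * K + 1e-10 * np.eye(len(idx))
            alpha = np.linalg.solve(A, Phi.T @ y)
            r = y - Phi @ alpha
            risk = r @ r + lam * alpha @ K @ alpha
            if risk < best_risk:
                best_i, best_risk, best_alpha = i, risk, alpha
        selected.append(best_i)
        remaining.remove(best_i)
    return selected, best_alpha

# toy 1-D regression problem
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(60)
idx, alpha = greedy_basis_selection(X, y, n_basis=8, lam=1e-3, gamma=0.5)
mse = np.mean((rbf(X, X[idx], 0.5) @ alpha - y) ** 2)
```

Because the number of selected bases is fixed in advance, the size of the simplified machine (and hence its evaluation cost) is controlled directly, which mirrors the complexity control the abstract claims for DS.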