Sparse Gaussian processes using backward elimination

  • Authors:
  • Liefeng Bo; Ling Wang; Licheng Jiao

  • Affiliations:
  • Institute of Intelligent Information Processing and National Key Laboratory for Radar Signal Processing, Xidian University, Xi’an, China (all authors)

  • Venue:
  • ISNN'06: Proceedings of the Third International Conference on Advances in Neural Networks - Volume Part I
  • Year:
  • 2006

Abstract

Gaussian Processes (GPs) achieve state-of-the-art performance in regression. However, all the basis functions are required for prediction, so test speed is slower than that of other learning algorithms such as support vector machines (SVMs), the relevance vector machine (RVM), adaptive sparseness (AS), etc. To overcome this limitation, we present a backward elimination algorithm, called GPs-BE, that recursively removes basis functions from the GP until a stopping criterion is satisfied. By integrating rank-one updates, GPs-BE can be implemented at a reasonable cost. Extensive empirical comparisons confirm the feasibility and validity of the proposed algorithm.
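The abstract's idea of backward elimination can be sketched as follows. This is an illustrative reconstruction, not the authors' GPs-BE implementation: it uses a standard RBF kernel, measures training mean-squared error as a stand-in for the paper's (unstated) selection criterion, and re-solves the GP system from scratch at each step rather than using the rank-one updates the paper relies on for efficiency. All function names and parameters (`gamma`, `noise`, `tol`) are assumptions for this sketch.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    # Squared-exponential (RBF) kernel matrix between two point sets.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def gp_backward_elimination(X, y, noise=0.1, tol=1e-3):
    """Greedy backward elimination of GP basis functions (sketch).

    Starts from the full basis and repeatedly drops the basis point whose
    removal increases training error the least, stopping once the best
    removal would increase the error by more than `tol`.
    Note: the paper uses rank-one updates to avoid re-solving the linear
    system at every step; for clarity this sketch just re-solves.
    """
    active = list(range(len(X)))

    def train_error(idx):
        # Fit GP weights on the active basis and score on all training data.
        K = rbf_kernel(X[idx], X[idx]) + noise * np.eye(len(idx))
        alpha = np.linalg.solve(K, y[idx])
        pred = rbf_kernel(X, X[idx]) @ alpha
        return np.mean((pred - y) ** 2)

    err = train_error(active)
    while len(active) > 1:
        # Evaluate the error after removing each remaining basis point.
        candidates = [(train_error(active[:i] + active[i + 1:]), i)
                      for i in range(len(active))]
        new_err, i_best = min(candidates)
        if new_err - err > tol:  # stopping criterion (assumed form)
            break
        err = new_err
        active.pop(i_best)
    return active, err
```

A typical use would pass in training data and read back the surviving basis indices, e.g. `active, err = gp_backward_elimination(X, y)`; predictions then only need kernel evaluations against the retained points, which is where the test-time speedup over a full GP comes from.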