Fast exact leave-one-out cross-validation of sparse least-squares support vector machines

  • Authors:
  • Gavin C. Cawley; Nicola L. C. Talbot

  • Affiliations:
  • School of Computing Sciences, University of East Anglia, Norwich NR4 7TJ, UK

  • Venue:
  • Neural Networks
  • Year:
  • 2004

Abstract

Leave-one-out cross-validation has been shown to give an almost unbiased estimator of the generalisation properties of statistical models, and therefore provides a sensible criterion for model selection and comparison. In this paper we show that exact leave-one-out cross-validation of sparse Least-Squares Support Vector Machines (LS-SVMs) can be implemented with a computational complexity of only O(ℓn²) floating point operations, rather than the O(ℓ²n²) operations of a naïve implementation, where ℓ is the number of training patterns and n is the number of basis vectors. As a result, leave-one-out cross-validation becomes a practical proposition for model selection in large scale applications. For clarity the exposition concentrates on sparse least-squares support vector machines in the context of non-linear regression, but the method is equally applicable in a pattern recognition setting.
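The key idea behind exact LOO-CV at this cost, in any penalised least-squares model, is that leaving one pattern out corresponds to a rank-one downdate of the normal-equations matrix, so the LOO residual follows from quantities computed in a single fit: e_i^(loo) = e_i / (1 − h_ii), where h_ii is the i-th leverage. The sketch below illustrates this identity for plain ridge regression (not the paper's kernelised LS-SVM; variable names and the ridge setting are illustrative assumptions), checking the O(ℓn²) closed form against the naïve O(ℓ²n²) refit loop:

```python
import numpy as np

# Illustrative sketch of the hat-matrix LOO identity for ridge regression.
# This is the general principle behind fast exact LOO-CV, not the paper's
# exact sparse LS-SVM algorithm (no kernel expansion or sparsity here).

rng = np.random.default_rng(0)
l, n = 200, 5                          # l training patterns, n basis vectors
X = rng.standard_normal((l, n))
y = X @ rng.standard_normal(n) + 0.1 * rng.standard_normal(l)
lam = 1e-3                             # ridge (regularisation) parameter

# Single fit: beta = (X^T X + lam*I)^{-1} X^T y  -- O(l n^2) overall
A = X.T @ X + lam * np.eye(n)
beta = np.linalg.solve(A, X.T @ y)
residuals = y - X @ beta

# Leverages h_ii = x_i^T A^{-1} x_i, all l of them in O(l n^2)
h = np.einsum('ij,ij->i', X @ np.linalg.inv(A), X)

# Exact LOO residuals without refitting (Sherman-Morrison downdate)
loo_residuals = residuals / (1.0 - h)

# Naive check: refit with pattern i removed, l times -- O(l^2 n^2)
naive = np.empty(l)
for i in range(l):
    mask = np.arange(l) != i
    Ai = X[mask].T @ X[mask] + lam * np.eye(n)
    bi = np.linalg.solve(Ai, X[mask].T @ y[mask])
    naive[i] = y[i] - X[i] @ bi

assert np.allclose(loo_residuals, naive)
```

The mean of `loo_residuals**2` then serves directly as the model-selection criterion, e.g. when tuning `lam`, at essentially the cost of one fit per candidate value.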