Leave-one-out bounds for kernel methods

  • Authors: Tong Zhang
  • Affiliation: IBM T. J. Watson Research Center, Yorktown Heights, NY
  • Venue: Neural Computation
  • Year: 2003

Abstract

In this article, we study leave-one-out style cross-validation bounds for kernel methods. The essential element in our analysis is a bound on the parameter estimation stability of regularized kernel formulations. Using this result, we derive bounds on expected leave-one-out cross-validation errors, which lead to expected generalization bounds for various kernel algorithms. In addition, we obtain variance bounds for leave-one-out errors. We apply our analysis to some classification and regression problems and compare the resulting bounds with previous results.
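To make the leave-one-out setting concrete, the sketch below illustrates it for one standard regularized kernel formulation, kernel ridge regression, where the leave-one-out residuals admit a well-known closed form via the hat matrix H = K (K + λI)⁻¹ and can be checked against brute-force refitting. This is an illustrative example only, not the paper's analysis; the RBF kernel, `gamma`, and `lam` values are assumptions for the demo.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian RBF kernel matrix between rows of X and rows of Z
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def loo_residuals_closed_form(K, y, lam):
    """Exact leave-one-out residuals for kernel ridge regression.

    With hat matrix H = K (K + lam I)^{-1} and in-sample fit f = H y,
    the identity e_i = (y_i - f_i) / (1 - H_ii) gives each held-out
    residual without refitting n separate models.
    """
    n = len(y)
    H = K @ np.linalg.inv(K + lam * np.eye(n))
    f = H @ y
    return (y - f) / (1.0 - np.diag(H))

def loo_residuals_brute_force(X, y, lam, gamma=1.0):
    # Reference implementation: refit with each example held out
    n = len(y)
    out = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        K_tr = rbf_kernel(X[mask], X[mask], gamma)
        alpha = np.linalg.solve(K_tr + lam * np.eye(n - 1), y[mask])
        pred = rbf_kernel(X[i:i + 1], X[mask], gamma) @ alpha
        out[i] = y[i] - pred[0]
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 3))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=20)
    lam = 0.5
    e_fast = loo_residuals_closed_form(rbf_kernel(X, X), y, lam)
    e_slow = loo_residuals_brute_force(X, y, lam)
    print(np.allclose(e_fast, e_slow))
```

The closed-form shortcut is specific to regularized least-squares losses; for other kernel algorithms one falls back on stability arguments of the kind the article develops.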