A Leave-One-out Cross Validation Bound for Kernel Methods with Applications in Learning

  • Authors: Tong Zhang
  • Affiliations: -
  • Venue: COLT '01/EuroCOLT '01 Proceedings of the 14th Annual Conference on Computational Learning Theory and 5th European Conference on Computational Learning Theory
  • Year: 2001

Abstract

In this paper, we prove a general leave-one-out style cross-validation bound for kernel methods. We apply this bound to some classification and regression problems and compare the results with previously known bounds. One aspect of our analysis is that the derived expected generalization bounds reflect both the approximation (bias) and learning (variance) properties of the underlying kernel methods. We are thus able to demonstrate the universality of certain learning formulations.
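
The paper's contribution is a theoretical bound on the leave-one-out quantity rather than an algorithm. As a concrete illustration of what leave-one-out cross-validation measures for a kernel method, the following is a minimal Python sketch (not taken from the paper) that computes the exact LOO error of kernel ridge regression via the standard hat-matrix identity; the RBF kernel, function names, and parameter values are illustrative assumptions, not Zhang's.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X1 and X2.
    sq_dists = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def kernel_ridge_loo_mse(X, y, lam=0.1, gamma=1.0):
    # Exact leave-one-out squared error for kernel ridge regression,
    # using the smoother ("hat") matrix H = K (K + n*lam*I)^{-1}.
    # The LOO residual at point i is (y_i - (Hy)_i) / (1 - H_ii),
    # so no model is ever retrained.
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    H = K @ np.linalg.inv(K + n * lam * np.eye(n))
    loo_residuals = (y - H @ y) / (1.0 - np.diag(H))
    return np.mean(loo_residuals ** 2)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(50, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
    # Compare regularization strengths by their LOO error estimates.
    for lam in (1.0, 0.1, 0.01):
        print(f"lam={lam}: LOO MSE = {kernel_ridge_loo_mse(X, y, lam=lam, gamma=0.5):.4f}")
```

The closed-form identity used here holds for any linear smoother of the form y-hat = Hy, which is why kernel ridge regression admits an exact LOO error without n retrainings; the paper's bounds concern the expectation of such leave-one-out quantities rather than their computation.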