Generalization Performance of Classifiers in Terms of Observed Covering Numbers

  • Authors:
  • John Shawe-Taylor; Nello Cristianini

  • Venue:
  • EuroCOLT '99 Proceedings of the 4th European Conference on Computational Learning Theory
  • Year:
  • 1999


Abstract

It is known that the covering numbers of a function class on a double sample (of length 2m) can be used to bound the generalization performance of a classifier via a margin-based analysis. In this paper we show that an analogous argument can be made in terms of the observed covering numbers on a single m-sample (the actual observed data points). The significance of this is that for certain interesting classes of functions, such as support vector machines, new techniques allow one to obtain good estimates of such covering numbers in terms of the rate of decay of the eigenvalues of a Gram matrix. These covering numbers can be much smaller than a priori bounds indicate in situations where the particular data received is "easy". The work can be considered an extension of previous results that provided generalization performance bounds in terms of the VC-dimension of the class of hypotheses restricted to the sample, with the considerable advantage that the covering numbers can be readily computed and are often small.
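
As a rough illustration of the quantity the abstract appeals to, the sketch below (Python with NumPy, a choice made here for illustration; the kernel, sample, and parameters are hypothetical and not taken from the paper) computes the Gram matrix of an RBF kernel on an observed m-sample and its eigenvalue spectrum. The paper's bounds relate the observed covering numbers to how quickly these eigenvalues decay; the sketch computes only the spectrum, not the covering-number bound itself.

```python
import numpy as np

def rbf_gram_matrix(X, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2) for an RBF kernel."""
    sq_norms = np.sum(X**2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * np.clip(sq_dists, 0.0, None))

def eigenvalue_decay(K):
    """Eigenvalues of the symmetric PSD Gram matrix, sorted in decreasing order."""
    eigvals = np.linalg.eigvalsh(K)[::-1]
    return np.clip(eigvals, 0.0, None)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))      # hypothetical m-sample: 200 points in R^5
    K = rbf_gram_matrix(X, gamma=0.5)
    lam = eigenvalue_decay(K)
    # Fast decay of these eigenvalues is what makes the observed covering
    # numbers (and hence the resulting bound) small on "easy" data.
    print(lam[:10])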