Generalization error analysis for polynomial kernel methods: algebraic geometrical approach

  • Author: Kazushi Ikeda
  • Affiliation: Graduate School of Informatics, Kyoto University, Kyoto, Japan
  • Venue: ICANN/ICONIP'03: Proceedings of the 2003 Joint International Conference on Artificial Neural Networks and Neural Information Processing
  • Year: 2003

Abstract

The generalization properties of learning classifiers with a polynomial kernel function are examined here. We first show that the generalization error of the learning machine depends on the properties of the separating curve, that is, the intersection of the input surface and the true separating hyperplane in the feature space. When the input space is one-dimensional, the problem decomposes into as many one-dimensional problems as there are intersection points. Otherwise, the generalization error is determined by the class of the separating curve. Next, we consider how the class of the separating curve depends on the true separating function. The class is maximal when the true separating polynomial is irreducible and smaller otherwise. In either case, the class depends only on the true function and not on the dimension of the feature space. These results imply that the generalization error does not increase even when the dimension of the feature space becomes large, and hence that so-called overmodeling does not occur in kernel learning.
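
To make the geometric picture concrete, the following is a minimal sketch of the standard polynomial-kernel correspondence the abstract relies on. The specific kernel form and the symbols d, phi, and w are conventional choices for illustration, not notation taken from the paper.

```latex
% Standard degree-d (inhomogeneous) polynomial kernel on inputs x, x' in R^m:
\[
  K(x, x') = (1 + x^{\top} x')^{d} = \langle \phi(x), \phi(x') \rangle ,
\]
% where phi(x) collects all monomials in x of total degree at most d
% (with suitable binomial coefficients).
% A separating hyperplane w^T z = 0 in the feature space then corresponds,
% in the input space, to the zero set of the polynomial
\[
  f(x) = w^{\top} \phi(x) = 0 ,
\]
% i.e. the separating curve: the intersection of the input surface phi(R^m)
% with the hyperplane {z : w^T z = 0}. Its degree is at most d, no matter
% how large the feature-space dimension is.
```

As a further, purely illustrative check (a synthetic setup, not the paper's experiment), one can train polynomial-kernel SVMs of increasing degree on a problem whose true separator is a fixed degree-2 polynomial and compare their test errors; all data and parameter choices below are assumptions made for the sketch.

```python
# Minimal numerical sketch: polynomial-kernel SVMs of increasing degree on a
# synthetic 2-D problem whose true separator is a degree-2 polynomial.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def true_label(X):
    # True separating polynomial: the ellipse x1^2 + 2*x2^2 = 1 (degree 2).
    return np.sign(X[:, 0] ** 2 + 2.0 * X[:, 1] ** 2 - 1.0)

X_train = rng.uniform(-2, 2, size=(200, 2))
y_train = true_label(X_train)
X_test = rng.uniform(-2, 2, size=(5000, 2))
y_test = true_label(X_test)

for degree in (2, 3, 5, 8):
    # Inhomogeneous polynomial kernel (coef0=1) of the given degree.
    clf = SVC(kernel="poly", degree=degree, coef0=1.0, C=10.0)
    clf.fit(X_train, y_train)
    err = np.mean(clf.predict(X_test) != y_test)
    print(f"degree {degree}: test error {err:.3f}")
```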