On the Vgamma Dimension for Regression in Reproducing Kernel Hilbert Spaces

  • Authors:
  • Theodoros Evgeniou; Massimiliano Pontil

  • Venue:
  • ALT '99 Proceedings of the 10th International Conference on Algorithmic Learning Theory
  • Year:
  • 1999

Abstract

This paper presents a computation of the Vγ dimension for regression in bounded subspaces of Reproducing Kernel Hilbert Spaces (RKHS) for the Support Vector Machine (SVM) regression ε-insensitive loss function Lε, and for general Lp loss functions. The Vγ dimension is shown to be finite, which in turn proves uniform convergence in probability for regression machines in RKHS subspaces that use the Lε or general Lp loss functions; a novel proof of this result is given. The paper also computes an upper bound on the Vγ dimension under certain conditions, which leads to an approach for estimating the empirical Vγ dimension from a set of training data.
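For concreteness, the two families of loss functions the abstract refers to can be sketched as follows. This is a minimal illustration under my own naming, not code from the paper: the ε-insensitive loss is Lε(y, f(x)) = max(0, |y − f(x)| − ε), and the general Lp loss is |y − f(x)|^p.

```python
def eps_insensitive_loss(y, y_pred, eps):
    """SVM regression epsilon-insensitive loss: zero inside the
    eps-tube around the prediction, linear outside it."""
    return max(0.0, abs(y - y_pred) - eps)


def lp_loss(y, y_pred, p):
    """General L^p loss |y - f(x)|^p (p = 1 gives absolute loss,
    p = 2 gives squared loss)."""
    return abs(y - y_pred) ** p
```

Residuals smaller than ε incur no penalty under the ε-insensitive loss, which is what makes the resulting Vγ analysis differ from that of the standard Lp losses.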