Learning with generalization capability by kernel methods of bounded complexity

  • Authors:
  • Věra Kůrková; Marcello Sanguineti

  • Affiliations:
  • Institute of Computer Science, Academy of Sciences of the Czech Republic, Pod Vodárenskou věží 2, 182 07 Prague 8, Czech Republic; Department of Communications, Computer, and System Sciences (DIST), University of Genoa, Via Opera Pia 13, 16145 Genova, Italy

  • Venue:
  • Journal of Complexity
  • Year:
  • 2005

Abstract

Learning from data with generalization capability is studied in the framework of minimization of regularized empirical error functionals over nested families of hypothesis sets of increasing model complexity. For Tikhonov regularization with kernel stabilizers, minimization is investigated over restricted hypothesis sets that, for a fixed integer n, contain only linear combinations of n-tuples of kernel functions. Upper bounds are derived on the rate of convergence of suboptimal solutions from such sets to the optimal solution achievable without restrictions on model complexity. The bounds have the form 1/n multiplied by a term that depends on the size of the sample of empirical data, the vector of output data, the Gram matrix of the kernel with respect to the input data, and the regularization parameter.
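
To make the setting concrete, the sketch below contrasts the unrestricted minimizer of a Tikhonov-regularized empirical error functional (which, by the representer theorem, is a linear combination of kernel functions centered at all m input points) with a suboptimal solution restricted to n < m kernel functions. The Gaussian kernel, the regularization parameter gamma, and the choice of centers are illustrative assumptions, not taken from the paper, which treats general kernels; this is only a minimal sketch of the kind of restricted minimization the abstract describes.

```python
import numpy as np

def gaussian_kernel(x, y, width=1.0):
    # Hypothetical kernel choice; the paper's results hold for general kernels.
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * width ** 2))

def gram_matrix(X, kernel=gaussian_kernel):
    # Gram matrix of the kernel with respect to the input data X (m x d).
    m = len(X)
    return np.array([[kernel(X[i], X[j]) for j in range(m)] for i in range(m)])

def full_minimizer(K, y, gamma):
    # Unrestricted minimizer of (1/m)||f(x_i) - y_i||^2 + gamma ||f||_K^2
    # over the whole RKHS: coefficients c solve (K + gamma * m * I) c = y.
    m = len(y)
    return np.linalg.solve(K + gamma * m * np.eye(m), y)

def restricted_minimizer(K, y, gamma, centers):
    # Suboptimal solution using only the n kernel functions indexed by `centers`:
    # minimize (1/m)||K_n c - y||^2 + gamma * c^T K_nn c over c in R^n.
    m = len(y)
    Kn = K[:, centers]                  # m x n evaluation matrix
    Knn = K[np.ix_(centers, centers)]   # n x n Gram matrix of the chosen centers
    A = Kn.T @ Kn / m + gamma * Knn
    b = Kn.T @ y / m
    return np.linalg.solve(A, b)

# Illustrative usage: how far the restricted solution is from the optimal one.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=50)
K = gram_matrix(X)
gamma = 0.01                            # illustrative regularization parameter
c_full = full_minimizer(K, y, gamma)
c_restricted = restricted_minimizer(K, y, gamma, centers=np.arange(10))
```

As n grows toward m, the restricted hypothesis set approaches the full one and the gap between the two solutions shrinks; the paper's bounds quantify this gap at rate 1/n in terms of the sample size, the output vector, the Gram matrix, and the regularization parameter.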