Learning rates for regularized classifiers using multivariate polynomial kernels

  • Authors:
  • Hongzhi Tong; Di-Rong Chen; Lizhong Peng

  • Affiliations:
  • LMAM, School of Mathematical Sciences, Peking University, Beijing 100871, PR China and Department of Mathematics, University of International Business and Economics, Beijing 100029, PR China; Department of Mathematics and LMIB, Beijing University of Aeronautics and Astronautics, Beijing 100083, PR China; LMAM, School of Mathematical Sciences, Peking University, Beijing 100871, PR China

  • Venue:
  • Journal of Complexity
  • Year:
  • 2008

Abstract

Regularized classifiers (a leading example is the support vector machine) are kernel-based classification methods generated from Tikhonov regularization schemes, and polynomial kernels are the original and arguably still the most important kernels used in them. In this paper, we provide an error analysis for regularized classifiers using multivariate polynomial kernels. We introduce Bernstein-Durrmeyer polynomials, whose reproducing kernel Hilbert space norms and approximation properties in the L^1 space play a key role in the analysis of the regularization error. We also present a standard estimate of the sample error, and derive explicit learning rates for these algorithms.
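
For orientation, here is a hedged sketch (in standard learning-theory notation, not taken verbatim from the paper) of the Tikhonov regularization scheme the abstract refers to. Given a sample $\mathbf{z} = \{(x_i, y_i)\}_{i=1}^{m}$ with labels $y_i \in \{-1, 1\}$, one solves
$$
f_{\mathbf{z},\lambda} \;=\; \arg\min_{f \in \mathcal{H}_{K_d}} \; \frac{1}{m}\sum_{i=1}^{m}\phi\bigl(y_i f(x_i)\bigr) \;+\; \lambda \|f\|_{K_d}^{2},
\qquad K_d(x,t) = (1 + x \cdot t)^{d},
$$
where $\phi$ is a convex loss (the hinge loss $\phi(t)=\max\{0,\,1-t\}$ recovers the support vector machine), $\mathcal{H}_{K_d}$ is the reproducing kernel Hilbert space of the degree-$d$ polynomial kernel $K_d$, and the resulting classifier is $\operatorname{sgn}(f_{\mathbf{z},\lambda})$. A learning rate bounds the excess misclassification error of $\operatorname{sgn}(f_{\mathbf{z},\lambda})$ in terms of $m$, with $\lambda$ and $d$ chosen to balance the regularization (approximation) error against the sample error. As an illustration of the polynomial approximation tools involved, the univariate Bernstein-Durrmeyer operator is
$$
D_n(f)(x) \;=\; (n+1)\sum_{k=0}^{n} p_{n,k}(x)\int_0^1 p_{n,k}(t)\,f(t)\,dt,
\qquad p_{n,k}(x)=\binom{n}{k}x^{k}(1-x)^{n-k};
$$
the paper works with a multivariate analogue of such operators in its analysis of the regularization error.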