Learning Rates for Regularized Classifiers Using Trigonometric Polynomial Kernels

  • Authors:
  • Feilong Cao; Dan Wu; Joonwhoan Lee

  • Affiliations:
  • Department of Information and Mathematics Sciences, China Jiliang University, Hangzhou 310018, People's Republic of China (Cao, Wu); Division of Computer Science and Engineering, Chonbuk National University, Jeonju 561-756, South Korea (Lee)

  • Venue:
  • Neural Processing Letters
  • Year:
  • 2012

Abstract

Regularized classifiers are kernel-based classification methods generated from Tikhonov regularization schemes, and trigonometric polynomial kernels are among the most important kernels, playing a key role in signal processing. The main aim of this paper is to provide convergence rates for classification algorithms generated by regularization schemes with trigonometric polynomial kernels. As a special case, an error analysis for the support vector machine (SVM) soft margin classifier is presented. The norms of the Fejér operator in the reproducing kernel Hilbert space, and the approximation properties of the operator in the L^1 space of periodic functions, play key roles in the analysis of the regularization error. Some new bounds on the learning rates of the regularization algorithms, based on covering-number estimates for normalized loss functions, are established. Combined with the analysis of the sample error, explicit learning rates for the SVM are derived.
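The setting described in the abstract can be illustrated with a small numerical sketch. The paper is purely theoretical, so the code below is not the authors' method: it assumes a Fejér-type trigonometric polynomial kernel and substitutes the least-squares loss for the hinge loss (the SVM soft margin classifier analyzed in the paper), since that choice reduces the Tikhonov regularization scheme to a linear system. All function names and parameter values are illustrative.

```python
import numpy as np

def fejer_kernel(x, y, n=10):
    """Fejér-type trigonometric polynomial kernel K(x, y) = F_n(x - y),
    where F_n(t) = (1/n) * (sin(n*t/2) / sin(t/2))**2 is the Fejér kernel."""
    t = np.asarray(x) - np.asarray(y)
    s = np.sin(t / 2.0)
    num = np.sin(n * t / 2.0) ** 2
    safe = np.where(np.abs(s) < 1e-12, 1.0, s)   # avoid 0/0 at t = 2*pi*k
    # At the singular points the limit of F_n is n.
    return np.where(np.abs(s) < 1e-12, float(n), num / (n * safe ** 2))

def regularized_classifier(X, y, lam=1e-3, n=10):
    """Tikhonov regularization scheme with least-squares loss:
    minimize (1/m) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2 over the RKHS,
    which reduces to solving (K + lam * m * I) c = y for the coefficients c."""
    m = len(X)
    K = fejer_kernel(X[:, None], X[None, :], n)
    c = np.linalg.solve(K + lam * m * np.eye(m), y)
    # The classifier is the sign of the regularized RKHS function.
    return lambda x: np.sign(fejer_kernel(np.asarray(x)[..., None], X, n) @ c)

# Toy periodic data on [0, 2*pi): labels given by the sign of sin(x).
X = np.linspace(0.05, 2 * np.pi - 0.05, 40)
y = np.sign(np.sin(X))
f = regularized_classifier(X, y)
accuracy = np.mean(f(X) == y)
```

Replacing the least-squares loss with the hinge loss (solved, e.g., by a QP solver) would recover the soft margin SVM whose learning rates the paper bounds; the covering-number and sample-error arguments themselves have no computational counterpart here.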