Multi-kernel regularized classifiers

  • Authors:
  • Qiang Wu; Yiming Ying; Ding-Xuan Zhou

  • Affiliations:
  • Department of Mathematics, City University of Hong Kong, Kowloon, Hong Kong, China (all three authors)

  • Venue:
  • Journal of Complexity
  • Year:
  • 2007

Abstract

A family of classification algorithms generated from Tikhonov regularization schemes is considered. The algorithms involve multi-kernel spaces and general convex loss functions. Our main purpose is to provide satisfactory estimates for the excess misclassification error of these multi-kernel regularized classifiers when the loss functions attain the value zero. The error analysis consists of two parts: regularization error and sample error. Allowing multi-kernels in the algorithm improves the regularization error and the approximation error, which is one advantage of the multi-kernel setting. For a general loss function, we show how to bound the regularization error by the approximation error in some weighted L^q spaces. The sample error is handled by means of a projection operator. The projection, in connection with the decay of the regularization error, enables us to improve the convergence rates known in the literature, even for one-kernel schemes with special loss functions: the least squares loss and the hinge loss of support vector machine soft margin classifiers. Existence of a minimizer for the optimization problem in the multi-kernel regularization scheme is verified when the kernel functions are continuous with respect to the index set. Concrete examples, including Gaussian kernels with flexible variances and probability distributions satisfying certain noise conditions, illustrate the general theory.
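
For orientation, a scheme of this kind minimizes a regularized empirical φ-risk jointly over the function and the kernel index; schematically,

f_z = arg min_{σ∈Σ} min_{f∈H_{K_σ}} { (1/m) Σ_{i=1}^m φ(y_i f(x_i)) + λ ||f||²_{K_σ} }.

The following sketch instantiates this under simplifying assumptions: Gaussian kernels indexed by a finite grid of variances, the least squares loss (one of the two special losses named above), and a fixed λ. It is an illustration of the scheme's structure, not the paper's implementation; the grid, the data, and the helper names (gaussian_gram, fit_multi_kernel) are hypothetical.

```python
# Minimal sketch, assuming Gaussian kernels K_sigma(x, t) = exp(-|x - t|^2 / (2 sigma^2))
# indexed by a finite variance grid, least squares loss, fixed regularization lambda.
import numpy as np

def gaussian_gram(X, sigma):
    """Gram matrix of the Gaussian kernel with variance parameter sigma."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def fit_multi_kernel(X, y, sigmas, lam):
    """For each kernel index sigma, solve the one-kernel problem
        min_f (1/m) sum_i (f(x_i) - y_i)^2 + lam * ||f||_{K_sigma}^2,
    whose minimizer f = sum_i alpha_i K_sigma(x_i, .) satisfies
        (K + m * lam * I) alpha = y   (representer theorem, least squares case);
    note that for y in {-1, +1}, (f(x) - y)^2 = (1 - y f(x))^2, the least
    squares classification loss.  Return the sigma attaining the smallest
    regularized empirical objective, i.e. minimize over the index set too."""
    m = len(y)
    best = None
    for sigma in sigmas:
        K = gaussian_gram(X, sigma)
        alpha = np.linalg.solve(K + m * lam * np.eye(m), y)
        f_on_sample = K @ alpha
        objective = np.mean((f_on_sample - y) ** 2) + lam * alpha @ K @ alpha
        if best is None or objective < best[0]:
            best = (objective, sigma, alpha)
    return best  # (objective value, chosen variance, expansion coefficients)

# Toy usage: binary labels in {-1, +1}; a new point is classified by sign(f).
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.where(X[:, 0] + 0.3 * rng.normal(size=40) > 0, 1.0, -1.0)
obj, sigma, alpha = fit_multi_kernel(X, y, sigmas=[0.25, 0.5, 1.0, 2.0], lam=0.1)
```

Taking the minimum of the regularized objective over σ is what "allowing multi-kernels" buys: the resulting regularization error can only be smaller than that of any single fixed kernel in the grid.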