Consistency of support vector machines and other regularized kernel classifiers

  • Authors: I. Steinwart
  • Affiliations: Los Alamos National Laboratory, NM
  • Venue: IEEE Transactions on Information Theory
  • Year: 2005

Abstract

It is shown that various classifiers that are based on minimization of a regularized risk are universally consistent, i.e., they can asymptotically learn in every classification task. The role of the loss functions used in these algorithms is considered in detail. As an application of our general framework, several types of support vector machines (SVMs) as well as regularization networks are treated. Our methods combine techniques from stochastics, approximation theory, and functional analysis.
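
For orientation, here is a minimal sketch of the regularized risk minimization the abstract refers to, in notation common to this line of work; the symbols H, L, and lambda are assumptions of this sketch, not quoted from the paper. Given a training set T = ((x_1, y_1), ..., (x_n, y_n)) and a reproducing kernel Hilbert space H, the classifier is taken to be a minimizer of the empirical regularized risk:

    % Sketch of a regularized risk minimizer (assumed standard notation,
    % not the paper's exact statement):
    \[
      f_{T,\lambda} \;=\; \operatorname*{arg\,min}_{f \in H}\;
        \lambda \lVert f \rVert_H^{2}
        \;+\; \frac{1}{n} \sum_{i=1}^{n} L\bigl(y_i, f(x_i)\bigr)
    \]
    % Choosing the hinge loss L(y, t) = max(0, 1 - y t) recovers the
    % soft-margin SVM; the squared loss L(y, t) = (y - t)^2 recovers a
    % regularization network.

Universal consistency then means that, for a suitable sequence of regularization parameters lambda_n tending to zero, the classification risk of f_{T,lambda_n} converges to the Bayes risk for every data-generating distribution.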