On the Noise Model of Support Vector Machines Regression

  • Authors:
  • Massimiliano Pontil; Sayan Mukherjee; Federico Girosi

  • Affiliations:
  • -;-;-

  • Venue:
  • ALT '00 Proceedings of the 11th International Conference on Algorithmic Learning Theory
  • Year:
  • 2000

Abstract

Support Vector Machines Regression (SVMR) is a learning technique in which the goodness of fit is measured not by the usual quadratic loss function (the mean square error), but by a different loss function, the ε-insensitive loss function (ILF), which is similar to loss functions used in the field of robust statistics. The quadratic loss function is well justified under the assumption of Gaussian additive noise. The noise model underlying the choice of the ILF, however, is less clear. In this paper the use of the ILF is justified under the assumption that the noise is additive and Gaussian, where the variance and mean of the Gaussian are themselves random variables. The probability distributions of the variance and mean are stated explicitly. While this work is presented in the framework of SVMR, it can be extended to justify non-quadratic loss functions in any Maximum Likelihood or Maximum A Posteriori approach. It applies not only to the ILF, but to a much broader class of loss functions.
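For context, the two loss functions contrasted in the abstract have the following standard forms (textbook definitions of the ε-insensitive and quadratic losses, not reproduced from the paper itself); here f(x) denotes the regression estimate, y the observed target, and ε the insensitivity parameter:

```latex
% epsilon-insensitive loss (ILF) used in SVM regression,
% alongside the usual quadratic loss; standard definitions.
|y - f(x)|_{\varepsilon} \;=\;
\begin{cases}
0, & \text{if } |y - f(x)| \le \varepsilon,\\[2pt]
|y - f(x)| - \varepsilon, & \text{otherwise,}
\end{cases}
\qquad
L_{\mathrm{quad}}\bigl(y, f(x)\bigr) \;=\; \bigl(y - f(x)\bigr)^{2}.
```

The quadratic loss is the negative log-likelihood of zero-mean Gaussian additive noise (up to constants), which is what "well justified under the assumption of Gaussian additive noise" refers to; the paper's contribution is to exhibit the analogous generating noise model, with random mean and variance, whose negative log-likelihood yields the ILF.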