On the Noise Model of Support Vector Machines Regression
ALT '00 Proceedings of the 11th International Conference on Algorithmic Learning Theory
Support Vector Machines Regression (SVMR) is a regression technique introduced by V. Vapnik and his collaborators (Vapnik, 1995; Vapnik, Golowich and Smola, 1996). In SVMR the goodness of fit is measured not by the usual quadratic loss function (the mean square error) but by a different loss function, Vapnik's $\epsilon$-insensitive loss function, which is similar to the "robust" loss functions introduced by Huber (Huber, 1981). The quadratic loss function is well justified under the assumption of Gaussian additive noise, but the noise model underlying the choice of Vapnik's loss function is less clear. In this paper the use of Vapnik's loss function is shown to be equivalent to a model of additive Gaussian noise in which the variance and mean of the Gaussian are themselves random variables; the probability distributions of the variance and mean are stated explicitly. Although this work is presented in the framework of SVMR, it can be extended to justify non-quadratic loss functions in any Maximum Likelihood or Maximum A Posteriori approach, and it applies not only to Vapnik's loss function but to a much broader class of loss functions.
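To make the contrast concrete, the following is a minimal sketch (not the paper's code) comparing the quadratic loss with Vapnik's $\epsilon$-insensitive loss, $|r|_\epsilon = \max(0, |r| - \epsilon)$: residuals inside the $\epsilon$-tube incur no penalty, and residuals outside it grow linearly, as in Huber-style robust losses. The function names and the choice of $\epsilon = 0.1$ are illustrative assumptions.

```python
import numpy as np

def quadratic_loss(residuals):
    """Standard mean-square-error loss: penalizes every residual quadratically."""
    return residuals ** 2

def eps_insensitive_loss(residuals, eps=0.1):
    """Vapnik's epsilon-insensitive loss |r|_eps = max(0, |r| - eps):
    zero inside the eps-tube, linear (L1-like, hence robust) outside it."""
    return np.maximum(np.abs(residuals) - eps, 0.0)

# Illustrative residuals: one well inside the tube, two outside.
r = np.array([-0.5, 0.05, 0.3])
print(quadratic_loss(r))            # [0.25   0.0025 0.09  ]
print(eps_insensitive_loss(r))      # [0.4 0.  0.2]
```

The linear growth outside the tube is what makes the loss less sensitive to large outliers than the quadratic loss, which is the behavior the paper's noise model (a Gaussian with random variance and mean) accounts for.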