In this paper, we use a unified loss function, called the soft insensitive loss function, for Bayesian support vector regression. We follow standard Gaussian processes for regression to set up the Bayesian framework, in which the unified loss function is used in the likelihood evaluation. Under this framework, the maximum a posteriori estimate of the function values corresponds to the solution of an extended support vector regression problem. The overall approach retains the merits of support vector regression, such as convex quadratic programming and sparsity in the solution representation, while also gaining the advantages of Bayesian methods, namely model adaptation and error bars on predictions. Experimental results on simulated and real-world data sets indicate that the approach works well even on large data sets.
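For reference, the soft insensitive loss function on a residual delta = y - f(x) is usually defined as the piecewise form sketched below. The abstract does not give the formula, so this follows the definition common in the literature; the parameterization with insensitivity epsilon > 0 and smoothing parameter 0 < beta <= 1 is an assumption.

```latex
% Soft insensitive loss function (SILF) on the residual \delta = y - f(x).
% Assumed parameters: insensitivity \epsilon > 0, smoothing 0 < \beta \le 1.
\[
\ell_{\epsilon,\beta}(\delta) =
\begin{cases}
  0, & |\delta| < (1-\beta)\epsilon
     \quad \text{(flat insensitive zone)}\\[6pt]
  \dfrac{\bigl(|\delta| - (1-\beta)\epsilon\bigr)^{2}}{4\beta\epsilon},
     & (1-\beta)\epsilon \le |\delta| \le (1+\beta)\epsilon
     \quad \text{(quadratic transition)}\\[6pt]
  |\delta| - \epsilon, & |\delta| > (1+\beta)\epsilon
     \quad \text{(linear tails)}
\end{cases}
\]
```

Under this definition the loss is C1-continuous, which keeps the likelihood term well behaved inside the Gaussian-process framework; as beta tends to 0 it recovers Vapnik's epsilon-insensitive loss, and at beta = 1 the flat zone vanishes and it reduces to a Huber-type loss, which is the sense in which it "unifies" the standard regression losses.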