We consider a practical advantage of the Bayesian approach over maximum a posteriori methods: its ability to smooth the landscape of generalization-performance measures in the space of hyperparameters, which is vitally important for determining the optimal hyperparameters. A variational method is used to approximate the intractable distribution. Using the leave-one-out error of support vector regression as an example, we demonstrate a further advantage of this method: the leave-one-out error can be estimated analytically, without performing the cross-validation. Comparing our theory with simulations on both an artificial data set (the "sinc" function) and a benchmark data set (Boston Housing), we find good agreement.
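The analytic leave-one-out estimator described above is not reproduced here; as a minimal sketch of the baseline it replaces, the following computes the brute-force leave-one-out error of a support vector regressor on a noisy "sinc" toy problem, refitting once per held-out point. The use of scikit-learn's `SVR` and the specific hyperparameter values are illustrative assumptions, not the paper's setup (note also that `np.sinc(x)` is the normalized sin(πx)/(πx)).

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Noisy samples of the "sinc" function (illustrative toy data).
rng = np.random.default_rng(0)
X = rng.uniform(-5.0, 5.0, size=(80, 1))
y = np.sinc(X).ravel() + rng.normal(0.0, 0.1, size=80)

# Brute-force leave-one-out MSE: one refit per held-out point,
# i.e. 80 separate fits -- exactly the cost an analytic
# leave-one-out estimate would avoid.
model = SVR(kernel="rbf", C=10.0, epsilon=0.05)
scores = cross_val_score(model, X, y, cv=LeaveOneOut(),
                         scoring="neg_mean_squared_error")
loo_mse = -scores.mean()
print(f"leave-one-out MSE: {loo_mse:.4f}")
```

For n training points this requires n full refits, which is what makes a closed-form estimate of the leave-one-out error attractive for hyperparameter selection.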