Predictive Approaches for Choosing Hyperparameters in Gaussian Processes
Neural Computation
Gaussian processes are powerful regression models specified by parameterized mean and covariance functions. The standard approaches to choosing these parameters (known as hyperparameters) are maximum likelihood and maximum a posteriori. In this article, we propose and investigate predictive approaches based on Geisser's predictive sample reuse (PSR) methodology and the related cross-validation (CV) methodology of Stone. More specifically, we derive results for Geisser's surrogate predictive probability (GPP), Geisser's predictive mean square error (GPE), and the standard CV error, and make a comparative study. Under an approximation, we arrive at the generalized cross-validation (GCV) criterion and establish its relationship with the GPP and GPE approaches. These approaches are tested on a number of problems, and experimental results show that they are strongly competitive with existing approaches.
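To make the predictive criteria concrete, the following sketch computes leave-one-out (LOO) predictive quantities for a Gaussian process regressor with an RBF kernel, using the standard closed-form LOO identities (the LOO residual is α_i / [K⁻¹]_ii and the LOO predictive variance is 1 / [K⁻¹]_ii, with α = K⁻¹y). From these it forms a GPE-style mean squared LOO error and a GPP-style average log predictive probability. This is an illustrative sketch, not the authors' implementation; all function and variable names are assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale, signal_var):
    """Squared-exponential covariance between row vectors of X1 and X2."""
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return signal_var * np.exp(-0.5 * d2 / length_scale**2)

def loo_scores(X, y, length_scale, signal_var, noise_var):
    """GPE-style and GPP-style LOO scores via closed-form identities."""
    n = len(y)
    K = rbf_kernel(X, X, length_scale, signal_var) + noise_var * np.eye(n)
    Kinv = np.linalg.inv(K)          # fine for a sketch; use Cholesky in practice
    alpha = Kinv @ y
    diag = np.diag(Kinv)
    resid = alpha / diag             # y_i minus LOO predictive mean
    var = 1.0 / diag                 # LOO predictive variance
    gpe = np.mean(resid**2)          # predictive mean square error
    gpp = np.mean(-0.5 * np.log(2 * np.pi * var)
                  - 0.5 * resid**2 / var)  # avg log predictive probability
    return gpe, gpp

# Hyperparameter selection then reduces to optimizing these scores,
# e.g. a simple grid over the length scale:
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)
best = max((loo_scores(X, y, ls, 1.0, 0.01)[1], ls)
           for ls in [0.1, 0.3, 1.0, 3.0])
```

In practice one would optimize the criteria with gradients (as the article derives) rather than a grid, and factor K once per hyperparameter setting via a Cholesky decomposition instead of forming the explicit inverse.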