We consider the problem of calculating learning curves (i.e., average generalization performance) of Gaussian processes used for regression. On the basis of a simple expression for the generalization error, in terms of the eigenvalue decomposition of the covariance function, we derive a number of approximation schemes. We identify where these become exact and compare them with existing bounds on learning curves; the new approximations, which can be used for any input-space dimension, generally get substantially closer to the truth. We also study possible improvements to our approximations. Finally, we use a simple, exactly solvable learning scenario to show that there are in-principle limits on the quality of approximations and bounds expressible solely in terms of the eigenvalue spectrum of the covariance function.
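To make the eigenvalue-based viewpoint concrete, the following is a minimal sketch of one common self-consistent approximation of this family, in which the generalization error eps(n) solves eps = sum_i lambda_i / (1 + n * lambda_i / (sigma^2 + eps)), with lambda_i the covariance eigenvalues and sigma^2 the noise variance. The function name, the illustrative 1/i^2 spectrum, and the fixed-point iteration are assumptions for illustration, not the paper's exact schemes.

```python
import numpy as np

def learning_curve(eigvals, noise_var, n_values, tol=1e-10, max_iter=1000):
    """Self-consistent eigenvalue-based approximation to a GP learning curve.

    For each training-set size n, solves by fixed-point iteration
        eps = sum_i lam_i / (1 + n * lam_i / (noise_var + eps)).
    This is a sketch of one widely used approximation of this type,
    not a reproduction of the paper's specific schemes.
    """
    lam = np.asarray(eigvals, dtype=float)
    curve = []
    for n in n_values:
        eps = lam.sum()  # n = 0: error equals the prior variance sum
        for _ in range(max_iter):
            new = np.sum(lam / (1.0 + n * lam / (noise_var + eps)))
            if abs(new - eps) < tol:
                eps = new
                break
            eps = new
        curve.append(eps)
    return np.array(curve)

# Illustrative spectrum decaying as 1/i^2 (a hypothetical smooth 1-d kernel)
eigvals = 1.0 / np.arange(1, 201) ** 2
curve = learning_curve(eigvals, noise_var=0.1, n_values=[0, 10, 100, 1000])
```

The resulting curve starts at the summed prior variance for n = 0 and decreases monotonically as more training points are added, the qualitative behaviour the abstract's approximation schemes aim to capture quantitatively.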