Regularization theory and neural networks architectures. Neural Computation.
Prediction with Gaussian processes: from linear regression to linear prediction and beyond. In Learning in Graphical Models.
General bounds on Bayes errors for regression with Gaussian processes. In Proceedings of the 1998 Conference on Advances in Neural Information Processing Systems 11.
Learning curves for Gaussian process regression: approximations and bounds. Neural Computation.
Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables.
Fast inference in generalized linear models via expected log-likelihoods. Journal of Computational Neuroscience.
The equivalent kernel [1] is a way of understanding how Gaussian process regression works for large sample sizes, based on a continuum limit. In this paper we show how to approximate the equivalent kernel of the widely used squared exponential (or Gaussian) kernel and related kernels. This is easiest for uniform input densities, but we also discuss the generalization to the non-uniform case. We show further that the equivalent kernel can be used to understand the learning curves for Gaussian processes, and investigate how kernel smoothing using the equivalent kernel compares to full Gaussian process regression.
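
To make the comparison in the abstract concrete, here is a minimal numerical sketch in Python/NumPy. It assumes one-dimensional inputs drawn uniformly on [0, 1] and a squared exponential kernel; the sample size n, length-scale ell, noise variance noise_var, and the target function are hypothetical choices, not values from the paper. The sketch forms the full GP posterior mean, numerically inverts the Fourier-domain filter S(s) / (S(s) + sigma^2 / rho) to obtain the equivalent kernel for a uniform input density rho, and compares it against kernel smoothing with that kernel.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative (hypothetical) settings: sample size, SE length-scale, noise variance.
n, ell, noise_var = 200, 0.05, 0.1
x = np.sort(rng.uniform(0.0, 1.0, n))        # uniform input density, rho = n on [0, 1]
f_true = np.sin(2.0 * np.pi * 3.0 * x)       # hypothetical target function
y = f_true + np.sqrt(noise_var) * rng.normal(size=n)
x_star = np.linspace(0.0, 1.0, 500)

def se_kernel(r, ell):
    # Squared exponential (Gaussian) covariance as a function of distance r.
    return np.exp(-r**2 / (2.0 * ell**2))

# Full GP regression: posterior mean = k(x*, X) (K + sigma^2 I)^{-1} y.
K = se_kernel(x[:, None] - x[None, :], ell) + noise_var * np.eye(n)
gp_mean = se_kernel(x_star[:, None] - x[None, :], ell) @ np.linalg.solve(K, y)

# Equivalent kernel in the Fourier domain: h_tilde(s) = S(s) / (S(s) + sigma^2 / rho),
# where S(s) = ell * sqrt(2 pi) * exp(-2 pi^2 ell^2 s^2) is the SE power spectrum
# under the convention F(s) = int f(r) exp(-2 pi i s r) dr.
rho = float(n)                                # density of n uniform points on [0, 1]
s = np.linspace(0.0, 40.0 / ell, 4000)        # frequency grid; truncation is an assumption
S = ell * np.sqrt(2.0 * np.pi) * np.exp(-2.0 * np.pi**2 * ell**2 * s**2)
h_tilde = S / (S + noise_var / rho)

# Invert the (even) Fourier transform by quadrature on a grid of distances.
r_grid = np.linspace(0.0, 1.0, 2000)
h_grid = 2.0 * np.trapz(h_tilde * np.cos(2.0 * np.pi * np.outer(r_grid, s)), s, axis=1)

# Kernel smoothing with the equivalent kernel: f(x*) ~ (1/rho) sum_i h(x* - x_i) y_i.
h_vals = np.interp(np.abs(x_star[:, None] - x[None, :]), r_grid, h_grid)
ek_mean = h_vals @ y / rho

print("max |GP mean - EK smoother|:", np.abs(gp_mean - ek_mean).max())

Tabulating h on a distance grid and interpolating avoids forming a full test-by-train-by-frequency tensor. Since the equivalent kernel is derived under an infinite-domain (or large-domain) assumption, the two predictors should agree closely only away from the edges of [0, 1]; the discrepancy near the boundaries is expected.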