We present a new sparse Gaussian process (GP) model for regression. The key novel idea is to sparsify the spectral representation of the GP. This leads to a simple, practical algorithm for regression tasks. We compare the achievable trade-offs between predictive accuracy and computational requirements, and show that these are typically superior to existing state-of-the-art sparse approximations. We discuss both the weight-space and function-space representations, and note that the new construction implies priors over functions that are always stationary and can approximate any covariance function in this class.
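As a rough illustration of the idea, the sketch below approximates a stationary (here squared-exponential) covariance with a small number of trigonometric basis functions whose frequencies are sampled from the kernel's spectral density, and then performs Bayesian linear regression in that feature space. This is a minimal sketch of the general construction, not the paper's exact algorithm (in particular, the frequencies are fixed rather than optimized, and all function names and parameter choices here are illustrative assumptions):

```python
import numpy as np

def ssgp_fit_predict(X, y, X_star, m=50, lengthscale=1.0, signal_var=1.0,
                     noise_var=0.1, seed=0):
    """Sparse-spectrum GP regression sketch: replace a stationary kernel
    with m random trigonometric features, then do Bayesian linear
    regression on those features. All names/defaults are illustrative."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # The spectral density of an RBF kernel is Gaussian, so sample
    # m spectral frequencies from a Gaussian with matching scale.
    S = rng.normal(0.0, 1.0 / lengthscale, size=(m, d))

    def phi(Z):
        proj = Z @ S.T                      # (n, m) projections
        # cos/sin pairs, scaled so phi(x) @ phi(x')^T approximates k(x, x')
        return np.sqrt(signal_var / m) * np.hstack([np.cos(proj), np.sin(proj)])

    Phi = phi(X)                            # (n, 2m) design matrix
    # Bayesian linear regression with unit prior on the weights:
    # posterior mean solves (Phi^T Phi + noise_var I) w = Phi^T y.
    A = Phi.T @ Phi + noise_var * np.eye(2 * m)
    w_mean = np.linalg.solve(A, Phi.T @ y)

    Phi_star = phi(X_star)
    mean = Phi_star @ w_mean
    # Predictive variance: observation noise plus weight uncertainty.
    var = noise_var * (1.0 + np.einsum('ij,ji->i', Phi_star,
                                       np.linalg.solve(A, Phi_star.T)))
    return mean, var
```

The cost per prediction is governed by the number of basis functions 2m rather than the number of training points, which is the source of the accuracy/computation trade-off discussed in the abstract.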