Matrix computations (3rd ed.)
Constructive incremental learning from only local information. Neural Computation.
Sparse on-line Gaussian processes. Neural Computation.
On-Line Support Vector Machine Regression. ECML '02: Proceedings of the 13th European Conference on Machine Learning.
Sparse Online Greedy Support Vector Regression. ECML '02: Proceedings of the 13th European Conference on Machine Learning.
Locally Weighted Projection Regression: Incremental Real Time Learning in High Dimensional Space. ICML '00: Proceedings of the Seventeenth International Conference on Machine Learning.
Accurate on-line support vector regression. Neural Computation.
Kernel Methods for Pattern Analysis.
Algorithmic Learning in a Random World.
Incremental Online Learning in High Dimensions. Neural Computation.
Prediction, Learning, and Games.
Gaussian Processes for Machine Learning (Adaptive Computation and Machine Learning).
Fast Kernel Classifiers with Online and Active Learning. The Journal of Machine Learning Research.
Most likely heteroscedastic Gaussian process regression. Proceedings of the 24th International Conference on Machine Learning.
Measuring empirical computational complexity. Proceedings of the 6th Joint Meeting of the European Software Engineering Conference and the ACM SIGSOFT Symposium on the Foundations of Software Engineering.
Tracking the best hyperplane with a simple budget Perceptron. Machine Learning.
The Forgetron: A Kernel-Based Perceptron on a Budget. SIAM Journal on Computing.
A Library for Locally Weighted Projection Regression. The Journal of Machine Learning Research.
Bounded Kernel-Based Online Learning. The Journal of Machine Learning Research.
Sparse Spectrum Gaussian Process Regression. The Journal of Machine Learning Research.
An identity for kernel ridge regression. ALT '10: Proceedings of the 21st International Conference on Algorithmic Learning Theory.
On-line regression algorithms for learning mechanical models of robots: A survey. Robotics and Autonomous Systems.
The kernel recursive least-squares algorithm. IEEE Transactions on Signal Processing.
Improved Risk Tail Bounds for On-Line Algorithms. IEEE Transactions on Information Theory.
Novel applications in unstructured and non-stationary human environments require robots that learn from experience and adapt autonomously to changing conditions. Predictive models therefore not only need to be accurate, but should also be updated incrementally in real time and require minimal human intervention. Incremental Sparse Spectrum Gaussian Process Regression is an algorithm targeted specifically at this context. Rather than being developed from the ground up, the method builds on the thoroughly studied Gaussian Process Regression algorithm, ensuring a solid theoretical foundation. Non-linearity and bounded update complexity are achieved simultaneously by means of a finite-dimensional random feature mapping that approximates a kernel function; as a result, the computational cost of each update remains constant over time. Finally, algorithmic simplicity and support for automated hyperparameter optimization ensure convenience in practice. Empirical validation on a number of synthetic and real-life learning problems confirms that Incremental Sparse Spectrum Gaussian Process Regression outperforms the popular Locally Weighted Projection Regression, while its computational requirements are significantly lower. The method is therefore particularly suited to learning under real-time constraints or when computational resources are limited.
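The core idea of the abstract can be illustrated with a minimal sketch: an RBF kernel is approximated by a finite set of random Fourier features, which reduces GP regression to Bayesian linear regression in feature space, and a rank-one (Sherman-Morrison) update keeps the per-sample cost constant. This is an illustrative sketch, not the paper's implementation; the class name `ISSGPR`, the fixed hyperparameters, and all parameter values are assumptions for the example (the actual method also optimizes hyperparameters automatically).

```python
import numpy as np

class ISSGPR:
    """Sketch: sparse-spectrum GP regression with incremental updates.

    The RBF kernel is approximated with n_features random Fourier
    features; the model is then Bayesian linear regression in the
    2*n_features-dimensional feature space, updated in O(D^2) per
    sample via the Sherman-Morrison identity. Hyperparameters
    (lengthscale, noise) are fixed here for simplicity.
    """

    def __init__(self, dim, n_features=100, lengthscale=1.0, noise=0.1, seed=0):
        rng = np.random.default_rng(seed)
        # Spectral frequencies drawn from the RBF kernel's spectrum.
        self.W = rng.normal(scale=1.0 / lengthscale, size=(n_features, dim))
        D = 2 * n_features
        self.noise2 = noise ** 2
        # Inverse of A = Phi^T Phi + noise^2 * I, maintained incrementally.
        self.A_inv = np.eye(D) / self.noise2
        # Running sum b = Phi^T y.
        self.b = np.zeros(D)

    def _phi(self, x):
        # Random Fourier feature map approximating the RBF kernel.
        proj = self.W @ x
        return np.concatenate([np.cos(proj), np.sin(proj)]) / np.sqrt(len(self.W))

    def update(self, x, y):
        # Rank-one Sherman-Morrison update: constant cost per sample.
        phi = self._phi(x)
        Av = self.A_inv @ phi
        self.A_inv -= np.outer(Av, Av) / (1.0 + phi @ Av)
        self.b += phi * y

    def predict(self, x):
        # Posterior mean prediction: phi(x)^T A^{-1} b.
        return self._phi(x) @ (self.A_inv @ self.b)

# Hypothetical usage: learn sin(x) incrementally from streaming samples.
model = ISSGPR(dim=1, n_features=100, lengthscale=0.5, noise=0.05)
for x in np.linspace(-3.0, 3.0, 200):
    model.update(np.array([x]), np.sin(x))
```

Because `A_inv` has fixed size regardless of how many samples arrive, each `update` call costs the same no matter how long the robot has been running, which is the property the abstract emphasizes for real-time use.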