Diagnostics for use with regression recursive residuals. Technometrics.
Neural Computation.
The nature of statistical learning theory.
An Introduction to Variational Methods for Graphical Models. Machine Learning.
Proceedings of the 1998 conference on Advances in neural information processing systems II.
Kalman Filtering and Neural Networks.
Incremental Sparse Kernel Machine. ICANN '02 Proceedings of the International Conference on Artificial Neural Networks.
Variational Relevance Vector Machines. UAI '00 Proceedings of the 16th Conference on Uncertainty in Artificial Intelligence.
Sparse Bayesian learning and the relevance vector machine. The Journal of Machine Learning Research.
Constructing Bayesian formulations of sparse kernel learning methods. Neural Networks (2005 Special Issue: IJCNN 2005).
Hierarchical Bayesian Models for Regularization in Sequential Learning. Neural Computation.
Adaptive Learning of Polynomial Networks: Genetic Programming, Backpropagation and Bayesian Methods.
Variational Bayes for generalized autoregressive models. IEEE Transactions on Signal Processing.
The kernel recursive least-squares algorithm. IEEE Transactions on Signal Processing.
This paper presents a sequential Bayesian approach to kernel modelling of data that contain unusual observations and outliers. The noise is heavy-tailed, described by a one-dimensional mixture distribution of Gaussians. The development uses a factorised variational approximation to the posterior of all unknowns, which makes Bayesian inference tractable at two levels: (1) sequential estimation of the weights distribution (including its mean vector and covariance matrix); and (2) recursive updating of the noise distribution and batch evaluation of the weights prior distribution. These steps are repeated, and the free parameters of the non-Gaussian error distribution are adapted at the end of each cycle. The reported results show that this robust approach can outperform standard methods in regression and time-series forecasting.
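The two-level scheme in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's algorithm: it uses RBF kernel features, a fixed prior precision, and a two-component Gaussian mixture noise model (a narrow inlier component plus a broad outlier component) whose responsibilities re-weight each observation between cycles. All function names and parameter values here are hypothetical.

```python
import numpy as np

def rbf_features(x, centres, gamma=1.0):
    # Kernel design matrix: one RBF basis function per centre (hypothetical choice).
    return np.exp(-gamma * (x[:, None] - centres[None, :]) ** 2)

def sequential_robust_fit(X, y, centres, gamma=1.0, alpha=1.0,
                          var_in=0.05, var_out=5.0, pi_out=0.1, n_cycles=3):
    """Sketch of robust sequential Bayesian kernel regression.
    Level 1: recursive (Kalman-style) update of the Gaussian weight posterior,
             one observation at a time, each weighted by its effective precision.
    Level 2: batch re-evaluation of the mixture-noise responsibilities (EM-style),
             so outliers are down-weighted in the next cycle."""
    Phi = rbf_features(X, centres, gamma)
    D = Phi.shape[1]
    r = np.full(len(y), 1.0 - pi_out)       # inlier responsibility per point
    m, S = np.zeros(D), np.eye(D) / alpha   # placeholders before first cycle
    for _ in range(n_cycles):
        # Level 1: sequential update of the weight posterior (mean m, covariance S).
        S = np.eye(D) / alpha
        m = np.zeros(D)
        for phi, t, ri in zip(Phi, y, r):
            beta = ri / var_in + (1.0 - ri) / var_out   # effective noise precision
            Sphi = S @ phi
            k = beta * Sphi / (1.0 + beta * phi @ Sphi) # gain vector
            m = m + k * (t - phi @ m)
            S = S - np.outer(k, Sphi)                    # Sherman-Morrison downdate
        # Level 2: recompute responsibilities of the two noise components.
        res2 = (y - Phi @ m) ** 2
        p_in = (1 - pi_out) * np.exp(-0.5 * res2 / var_in) / np.sqrt(var_in)
        p_out = pi_out * np.exp(-0.5 * res2 / var_out) / np.sqrt(var_out)
        r = p_in / (p_in + p_out + 1e-300)
    return m, S, r
```

A short usage example: fitting a noisy sinusoid with a few gross outliers should leave the outliers with low inlier responsibility while the clean points drive the fit.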