We show that a hierarchical Bayesian modeling approach allows us to perform regularization in sequential learning. We identify three inference levels within this hierarchy: model selection, parameter estimation, and noise estimation. In environments where data arrive sequentially, techniques such as cross-validation for regularization or model selection are not applicable. The Bayesian approach, with extended Kalman filtering at the parameter estimation level, allows for regularization within a minimum-variance framework. A multilayer perceptron generates the nonlinear measurement mapping of the extended Kalman filter. We describe several algorithms at the noise estimation level that implement online regularization, and we show the theoretical links between adaptive noise estimation in extended Kalman filtering, multiple adaptive learning rates, and multiple smoothing regularization coefficients.
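The setting described above can be sketched in code. The following is a minimal illustration, not the authors' algorithm: the weights of a small multilayer perceptron are treated as the state of an extended Kalman filter, the network output serves as the nonlinear measurement mapping, and the measurement-noise variance `R` is adapted online with a simple innovation-based estimator (one common scheme; the specific forgetting factor `alpha`, network size, and noise floor are illustrative assumptions).

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny MLP: 1 input -> H hidden tanh units -> 1 linear output.
H = 8

def unpack(w):
    """Split the flat weight-state vector into layer parameters."""
    W1 = w[:H]                # input-to-hidden weights
    b1 = w[H:2 * H]           # hidden biases
    W2 = w[2 * H:3 * H]       # hidden-to-output weights
    b2 = w[3 * H]             # output bias
    return W1, b1, W2, b2

def forward(w, x):
    """Network output (the EKF nonlinear measurement mapping h(w, x))."""
    W1, b1, W2, b2 = unpack(w)
    a = np.tanh(W1 * x + b1)  # hidden activations
    return float(W2 @ a + b2), a

def jacobian(w, x, a):
    """Analytic row Jacobian dh/dw, flattened to match the state vector."""
    W1, b1, W2, b2 = unpack(w)
    da = 1.0 - a ** 2         # derivative of tanh
    return np.concatenate([W2 * da * x,   # d/dW1
                           W2 * da,       # d/db1
                           a,             # d/dW2
                           [1.0]])        # d/db2

n = 3 * H + 1
w = 0.1 * rng.standard_normal(n)   # weight state
P = np.eye(n)                      # weight covariance (prior)
Q = 1e-6 * np.eye(n)               # process noise: acts as a regularizer
R = 1.0                            # measurement-noise variance, adapted online
alpha = 0.02                       # forgetting factor for the R estimator

f = lambda x: np.sin(2 * np.pi * x)
for t in range(2000):
    x = rng.uniform(-0.5, 0.5)
    y = f(x) + 0.05 * rng.standard_normal()

    yhat, a = forward(w, x)
    Hk = jacobian(w, x, a)         # 1 x n measurement Jacobian
    PH = P @ Hk
    S = Hk @ PH + R                # innovation variance
    K = PH / S                     # Kalman gain
    e = y - yhat                   # innovation

    w = w + K * e                  # parameter-estimation level
    P = P - np.outer(K, PH) + Q

    # Noise-estimation level: since E[e^2] = H P H' + R, match R to the
    # innovation statistics with an exponentially forgotten average.
    R = max(1e-3, (1 - alpha) * R + alpha * (e ** 2 - Hk @ PH))
```

Because the Kalman gain scales each weight update by the ratio of parameter uncertainty to `S`, the adapted `R` effectively modulates per-weight learning rates, which is the link to multiple adaptive learning rates mentioned in the abstract.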