A Bayesian ensemble learning method is introduced for the unsupervised extraction of dynamic processes from noisy data. The data are assumed to be generated by an unknown nonlinear mapping from unknown factors, and the dynamics of the factors are modeled with a nonlinear state-space model. The nonlinear mappings in the model are represented by multilayer perceptron networks. The proposed method is computationally demanding, but it allows the use of higher-dimensional nonlinear latent-variable models than other existing approaches do. Experiments with chaotic data show that the new method can blindly estimate both the factors and the dynamic process that generated the data, and that it clearly outperforms currently available nonlinear prediction techniques on this very difficult test problem.
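
As a rough illustration of the model structure, the sketch below simulates data from a nonlinear state-space model of the kind described above: latent factors s(t) evolve as s(t) = g(s(t-1)) + process noise, and observations are x(t) = f(s(t)) + observation noise, with f and g represented by multilayer perceptron networks. The one-hidden-layer tanh architecture, the layer sizes, and the noise levels are illustrative assumptions rather than the paper's exact settings, and the Bayesian ensemble (variational) learning procedure itself is not shown.

    import numpy as np

    rng = np.random.default_rng(0)

    def mlp(x, W1, b1, W2, b2):
        # One-hidden-layer MLP with tanh nonlinearity; the exact
        # architecture used in the paper is an assumption here.
        return W2 @ np.tanh(W1 @ x + b1) + b2

    def init_mlp(n_in, n_hidden, n_out, scale=0.5):
        # Random weights, zero biases (illustrative initialization).
        return (scale * rng.standard_normal((n_hidden, n_in)),
                np.zeros(n_hidden),
                scale * rng.standard_normal((n_out, n_hidden)),
                np.zeros(n_out))

    # Dimensions: latent factors s(t) in R^d_s, observations x(t) in R^d_x.
    d_s, d_x, n_hidden, T = 3, 10, 20, 500

    f_params = init_mlp(d_s, n_hidden, d_x)   # observation mapping f
    g_params = init_mlp(d_s, n_hidden, d_s)   # dynamics mapping g

    # Simulate the generative model:
    #   s(t) = g(s(t-1)) + process noise,   x(t) = f(s(t)) + observation noise.
    s = np.zeros((T, d_s))
    x = np.zeros((T, d_x))
    s[0] = rng.standard_normal(d_s)
    for t in range(1, T):
        s[t] = mlp(s[t - 1], *g_params) + 0.05 * rng.standard_normal(d_s)
    for t in range(T):
        x[t] = mlp(s[t], *f_params) + 0.1 * rng.standard_normal(d_x)

In the actual method, ensemble learning fits a tractable posterior approximation over the unknown MLP weights, the factors s(t), and the noise parameters by minimizing its misfit to the true posterior; the sketch above covers only the generative side of the model.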