An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods
Bayesian parameter estimation via variational methods
Statistics and Computing
Sparse Bayesian learning and the relevance vector machine
The Journal of Machine Learning Research
Inference in Hidden Markov Models (Springer Series in Statistics)
Variational Bayes for Continuous Hidden Markov Models and Its Application to Active Learning
IEEE Transactions on Pattern Analysis and Machine Intelligence
Pattern Recognition and Machine Learning (Information Science and Statistics)
Hidden Conditional Random Fields
IEEE Transactions on Pattern Analysis and Machine Intelligence
Hidden Markov models with stick-breaking priors
IEEE Transactions on Signal Processing
Music Analysis Using Hidden Markov Mixture Models
IEEE Transactions on Signal Processing
Survey: Reservoir computing approaches to recurrent neural network training
Computer Science Review
A tighter bound for the echo state property
IEEE Transactions on Neural Networks
In this work, we propose a novel approach to sequential data modeling that leverages the strengths of hidden Markov models and echo state networks (ESNs) in the context of non-parametric Bayesian inference. We introduce a non-stationary hidden Markov model whose time-dependent state transition probabilities are driven by a high-dimensional signal encoding the whole history of the modeled observations, namely the state vector of a postulated observations-driven ESN reservoir. We derive an efficient inference algorithm for our model under the variational Bayesian paradigm, and we examine the efficacy of our approach on a number of sequential data modeling applications.
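The core mechanism described above — an observations-driven ESN reservoir whose state vector modulates the HMM's transition probabilities at each time step — can be sketched as follows. This is a minimal illustration, not the paper's exact parameterization: the weight shapes, the spectral-radius rescaling, and the softmax link from reservoir state to transition logits are all assumptions made for concreteness.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: K hidden states, D reservoir units, d observation dims.
K, D, d = 3, 50, 2

# Fixed random reservoir weights, rescaled to spectral radius < 1
# (a common sufficient condition for the echo state property).
W = rng.standard_normal((D, D))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.standard_normal((D, d))

# Illustrative weights mapping the reservoir state to per-row transition logits.
V = rng.standard_normal((K, K, D)) * 0.1

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

h = np.zeros(D)                      # reservoir state: encodes observation history
observations = rng.standard_normal((10, d))
for y in observations:
    h = np.tanh(W @ h + W_in @ y)    # observations-driven reservoir update
    A_t = softmax(V @ h)             # time-dependent K x K transition matrix
```

Because the reservoir weights are fixed and random, only the mapping from reservoir state to transition logits would need to be learned; in the paper this learning is carried out under the variational Bayesian paradigm rather than by the point estimation implied here.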