On-line Stochastic Matching compensation for non-stationary noise
Computer Speech and Language
Sequential or online hidden Markov model (HMM) signal processing schemes are derived, and their performance is illustrated by simulation. The online algorithms are sequential expectation maximization (EM) schemes, derived by using stochastic approximations to maximize the Kullback-Leibler information measure. The schemes can be implemented either as filters or as fixed-lag or sawtooth-lag smoothers. They yield estimates of the HMM parameters, including transition probabilities, Markov state levels, and noise variance. In contrast to the offline EM algorithm (Baum-Welch scheme), which uses the fixed-interval forward-backward scheme, the online schemes have significantly reduced memory requirements and improved convergence, and they can estimate HMM parameters that vary slowly with time or undergo infrequent jump changes. Similar techniques are used to derive online schemes for extracting finite-state Markov chains embedded in a mixture of white Gaussian noise (WGN) and deterministic signals of known functional form with unknown parameters.
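To make the idea concrete, the following is a minimal sketch (not the authors' exact algorithm) of a filter-based recursive EM scheme for a scalar Gaussian-emission HMM. It blends per-sample sufficient statistics into running averages with a decreasing step size, as in stochastic approximation, and re-maximizes the transition probabilities, state levels, and noise variance at each step. All function and variable names here are illustrative choices, and the particular step-size schedule `1/(t+2)` is an assumption.

```python
import numpy as np

def gauss_lik(y, levels, var):
    # Gaussian emission likelihoods N(y; q_j, var) for each state level q_j
    return np.exp(-0.5 * (y - levels) ** 2 / var) / np.sqrt(2 * np.pi * var)

def online_hmm_em(ys, levels0, var0, A0, burn_in=20):
    """Filter-based recursive EM sketch for a Gaussian-emission HMM.

    Estimates transition probabilities A, state levels q, and noise
    variance var from a scalar observation stream, using a decreasing
    step size (stochastic approximation) to fold new sufficient
    statistics into running averages. Illustrative only.
    """
    n = len(levels0)
    A, levels, var = A0.copy(), levels0.astype(float).copy(), float(var0)
    alpha = np.full(n, 1.0 / n)           # filtered state posterior
    S_trans = A / n                       # running transition statistics
    S_occ = np.full(n, 1.0 / n)           # running state occupancy
    S_y = levels * S_occ                  # running sum of alpha * y
    S_yy = (levels ** 2 + var) * S_occ    # running sum of alpha * y^2
    for t, y in enumerate(ys):
        b = gauss_lik(y, levels, var)
        # E-step (filtered): joint responsibility over (x_{t-1}, x_t)
        xi = (alpha[:, None] * A) * b[None, :]
        xi /= xi.sum()
        alpha = xi.sum(axis=0)
        # Stochastic-approximation blend of sufficient statistics
        g = 1.0 / (t + 2)
        S_trans = (1 - g) * S_trans + g * xi
        S_occ = (1 - g) * S_occ + g * alpha
        S_y = (1 - g) * S_y + g * alpha * y
        S_yy = (1 - g) * S_yy + g * alpha * y * y
        # M-step: re-maximize parameters after a short burn-in
        if t >= burn_in:
            A = S_trans / S_trans.sum(axis=1, keepdims=True)
            levels = S_y / S_occ
            var = max(float(np.sum(S_yy - 2 * levels * S_y
                                   + levels ** 2 * S_occ)), 1e-6)
    return A, levels, var
```

In contrast to Baum-Welch, no forward-backward pass over the whole record is stored: only the filtered posterior and a fixed set of averaged statistics are kept, so memory is constant in the record length, and replacing the decreasing step size with a small constant one would let the estimator track slowly varying parameters.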