The Spherical Hidden Markov Self-Organizing Map for Learning Time Series Data
ICANN'12 Proceedings of the 22nd international conference on Artificial Neural Networks and Machine Learning - Volume Part I
The self-organizing mixture model (SOMM) was proposed as an expectation-maximization (EM) algorithm that yields topology-preserving maps of data based on probabilistic mixture models. Compared with self-organizing maps, the SOMM algorithm has a clear interpretation: it maximizes the sum of the data log-likelihood and a penalty term that enforces self-organization. The objective of this paper is to extend the SOMM algorithm to multivariate time series. The standard SOMM algorithm assumes that the data are independent and identically distributed (i.i.d.) samples; this assumption is clearly inappropriate for time series. In this paper we therefore propose an extension of the SOMM algorithm to multivariate time series, which we call self-organizing hidden Markov models (SOHMMs), by assuming that the time series is generated by hidden Markov models (HMMs).
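To make the SOMM idea concrete, the following is a minimal illustrative sketch (not the paper's exact algorithm): one EM-style iteration for a one-dimensional Gaussian mixture whose units live on a grid. The neighborhood smoothing of the responsibilities plays the role of the penalty term that enforces self-organization; all function names, the data, and the parameter values are assumptions made for illustration.

```python
import math

def neighborhood(i, j, sigma=1.0):
    """Gaussian neighborhood weight between grid positions i and j (illustrative)."""
    return math.exp(-((i - j) ** 2) / (2 * sigma ** 2))

def somm_step(data, means, sigma_units=0.5, sigma_grid=1.0):
    """One EM-style SOMM-like update on 1-D data with equal unit priors (sketch)."""
    K = len(means)
    # E-step: responsibility of each unit for each data point.
    resp = []
    for x in data:
        lik = [math.exp(-((x - m) ** 2) / (2 * sigma_units ** 2)) for m in means]
        z = sum(lik)
        resp.append([l / z for l in lik])
    # Neighborhood smoothing: each unit shares responsibility with its grid
    # neighbors; this is the self-organization penalty in effect.
    smoothed = []
    for r in resp:
        row = []
        for k in range(K):
            w = [neighborhood(k, j, sigma_grid) for j in range(K)]
            row.append(sum(wj * rj for wj, rj in zip(w, r)) / sum(w))
        smoothed.append(row)
    # M-step: update each unit mean as a responsibility-weighted average.
    new_means = []
    for k in range(K):
        num = sum(s[k] * x for s, x in zip(smoothed, data))
        den = sum(s[k] for s in smoothed)
        new_means.append(num / den)
    return new_means

# Toy run: three units on a 1-D grid, data in three loose clusters.
data = [0.0, 0.1, 1.0, 1.1, 2.0, 2.1]
means = [0.0, 1.0, 2.0]
for _ in range(10):
    means = somm_step(data, means)
# The unit means stay ordered along the grid, reflecting topology preservation.
```

The SOHMM extension replaces the per-point i.i.d. likelihood in the E-step with an HMM likelihood over the whole sequence, so temporal dependence enters through the hidden state transitions rather than through the mixture alone.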