Self-organizing hidden Markov models

  • Authors: Nobuhiko Yamaguchi
  • Affiliations: Faculty of Science and Engineering, Saga University, Saga-shi, Japan
  • Venue: ICONIP'10 Proceedings of the 17th international conference on Neural information processing: models and applications - Volume Part II
  • Year: 2010


Abstract

Self-organizing mixture models (SOMMs) were proposed as an expectation-maximization (EM) algorithm that yields topology-preserving maps of data based on probabilistic mixture models. Compared to self-organizing maps, the SOMM algorithm has a clear interpretation: it maximizes the sum of the data log likelihood and a penalty term that enforces self-organization. The objective of this paper is to extend the SOMM algorithm to multivariate time series. The standard SOMM algorithm assumes that the data are independent and identically distributed (i.i.d.) samples; this assumption is clearly inappropriate for time series. We therefore propose an extension of the SOMM algorithm to multivariate time series, which we call self-organizing hidden Markov models (SOHMMs), by assuming that the time series is generated by hidden Markov models (HMMs).
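The objective sketched in the abstract (data log likelihood plus a penalty that enforces self-organization) is commonly realized as an EM iteration whose E-step responsibilities are smoothed over a neighborhood on a low-dimensional grid of units. The following is a minimal sketch of that idea for the i.i.d. (non-HMM) case, using a Gaussian mixture on a 1-D grid; the fixed variance, equal mixing weights, and Gaussian neighborhood kernel are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def neighborhood(K, sigma):
    # Gaussian neighborhood kernel over a 1-D grid of K units;
    # rows are normalized to sum to 1.
    idx = np.arange(K)
    H = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / sigma) ** 2)
    return H / H.sum(axis=1, keepdims=True)

def somm_em_step(X, means, var, H):
    # E-step: Gaussian responsibilities (equal weights, shared variance)
    d2 = ((X[:, None, :] - means[None, :, :]) ** 2).sum(-1)
    logp = -0.5 * d2 / var
    r = np.exp(logp - logp.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)
    # Self-organization: smooth each responsibility vector over the
    # grid neighborhood, then renormalize. This step is where the
    # penalty term enters in practice.
    r = r @ H.T
    r /= r.sum(axis=1, keepdims=True)
    # M-step: update component means from the smoothed responsibilities
    means = (r.T @ X) / r.sum(axis=0)[:, None]
    return means, r

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))      # toy i.i.d. data
K = 5                              # grid units / mixture components
means = rng.normal(size=(K, 2))
H = neighborhood(K, sigma=1.0)
for _ in range(20):
    means, r = somm_em_step(X, means, 1.0, H)
```

Because the smoothing couples the responsibilities of neighboring grid units, adjacent components are pulled toward similar regions of data space, which is what produces the topology-preserving map. The SOHMM extension replaces the per-sample Gaussian likelihoods in the E-step with HMM likelihoods of whole sequences, dropping the i.i.d. assumption.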