A recurrent log-linearized Gaussian mixture network

  • Authors:
  • T. Tsuji, Nan Bu, O. Fukuda, M. Kaneko

  • Affiliations:
  • Dept. of Artificial Complex Syst. Eng., Hiroshima Univ., Higashi, Japan

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2003


Abstract

Temporal context is one of the most useful and interesting characteristics of time series for machine learning; in some cases, dynamic characteristics are the only basis on which classification is possible. This paper proposes a novel neural network for time-series classification, named the recurrent log-linearized Gaussian mixture network (R-LLGMN). The structure of this network is based on the hidden Markov model (HMM), which has been well developed in the field of speech recognition. R-LLGMN can also be interpreted as an extension of a probabilistic neural network based on a log-linearized Gaussian mixture model, into which recurrent connections are incorporated to exploit temporal information. Simulation experiments are carried out to compare R-LLGMN with a traditional HMM-based classifier, and pattern classification experiments on EEG signals are then conducted. These experiments indicate that R-LLGMN can successfully classify not only artificial data but also real biological data such as EEG signals.
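The combination the abstract describes — a log-linearized Gaussian mixture classifier whose per-step outputs are chained by recurrent connections, in the manner of the HMM forward algorithm — can be illustrated with a minimal sketch. This is an assumption-laden simplification, not the paper's actual R-LLGMN: the function names (`quadratic_expand`, `r_llgmn_forward`) and the single-state-per-class structure are hypothetical, and the real network uses multiple states and mixture components per class with trainable weights.

```python
import numpy as np

def quadratic_expand(x):
    # LLGMN-style nonlinear expansion of an input vector:
    # a bias term, the raw components, and all quadratic cross terms,
    # so that a linear layer over these features can express
    # log-linearized Gaussian mixture scores.
    d = len(x)
    quad = [x[i] * x[j] for i in range(d) for j in range(i, d)]
    return np.concatenate(([1.0], x, quad))

def r_llgmn_forward(X, W):
    # Hedged sketch of an HMM-like forward recursion:
    # at each time step, a softmax over log-linear scores is
    # weighted by the previous step's class activations and
    # renormalized, so temporal context accumulates over the series.
    # X: (T, d) time series; W: (C, 1 + d + d*(d+1)//2) weights.
    C = W.shape[0]
    alpha = np.full(C, 1.0 / C)          # uniform prior over classes
    for x_t in X:
        scores = W @ quadratic_expand(x_t)
        probs = np.exp(scores - scores.max())
        probs /= probs.sum()             # per-step softmax posterior
        alpha = alpha * probs            # recurrent (forward) weighting
        alpha /= alpha.sum()             # keep a proper distribution
    return alpha                         # class posteriors for the series
```

In this reading, the recurrent multiplication plays the role of the HMM forward variable: evidence from each frame reweights the running class distribution rather than being classified frame by frame in isolation.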