An efficient EM approach to parameter learning of the mixture of Gaussian processes

  • Authors:
  • Yan Yang; Jinwen Ma

  • Affiliations:
  • Department of Information Science, School of Mathematical Sciences & LMAM, Peking University, Beijing, P.R. China (both authors)

  • Venue:
  • ISNN'11 Proceedings of the 8th international conference on Advances in neural networks - Volume Part II
  • Year:
  • 2011


Abstract

The mixture of Gaussian processes (MGP) is an important probabilistic model that is often applied to the regression and classification of temporal data. However, the existing EM algorithms for its parameter learning encounter a serious difficulty in computing the expectations of the assignment variables (the hidden variables). In this paper, we utilize the leave-one-out cross-validation decomposition of the conditional probability and develop an efficient EM algorithm for the MGP model in which the expectations of the assignment variables can be computed directly in the E-step. In the M-step, a conjugate gradient method under a standard Wolfe-Powell line search is implemented to learn the parameters. Furthermore, the proposed EM algorithm can be carried out in a hard-cutting manner: each data point is assigned to the GP expert with the highest posterior in the E-step, and the parameters of each GP expert are then learned from its assigned data points in the M-step. It therefore has a potential advantage over soft-cutting methods in handling large datasets. The experimental results demonstrate that the proposed EM algorithm is both effective and efficient.
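The abstract does not give the algorithm's details, but the overall hard-cutting EM loop can be sketched as follows. This is a rough, self-contained illustration, not the authors' implementation: it uses the standard closed-form leave-one-out (LOO) predictive densities of a zero-mean GP with a squared-exponential kernel (Rasmussen & Williams, Eqs. 5.10–5.12) as a stand-in for the paper's LOO cross-validation decomposition, and SciPy's conjugate-gradient optimizer (which applies a Wolfe-type line search internally) in place of the paper's specific Wolfe-Powell implementation. All function names and hyperparameter choices here are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X, lengthscale, signal_var):
    # Squared-exponential kernel: k(x, x') = s^2 * exp(-(x - x')^2 / (2 l^2)).
    d2 = (X[:, None] - X[None, :]) ** 2
    return signal_var * np.exp(-0.5 * d2 / lengthscale ** 2)

def loo_log_density(X, y, theta):
    # Per-point LOO predictive log-densities of a zero-mean GP
    # (Rasmussen & Williams, Eqs. 5.10-5.12). theta = log(l, s^2, noise_var);
    # the clip is a crude numerical safeguard, not part of the method.
    l, s2, n2 = np.exp(np.clip(theta, -5.0, 5.0))
    K = rbf_kernel(X, l, s2) + n2 * np.eye(len(X))
    Kinv = np.linalg.inv(K)
    alpha = Kinv @ y
    diag = np.diag(Kinv)
    mu = y - alpha / diag           # LOO predictive means
    var = 1.0 / diag                # LOO predictive variances
    return -0.5 * np.log(2 * np.pi * var) - 0.5 * (y - mu) ** 2 / var

def fit_expert(X, y, theta0):
    # M-step for one expert: maximize the summed LOO log-density with
    # SciPy's conjugate-gradient optimizer (Wolfe line search inside).
    obj = lambda th: -np.sum(loo_log_density(X, y, th))
    return minimize(obj, theta0, method="CG", options={"maxiter": 20}).x

def hard_em_mgp(X, y, n_experts=2, n_iter=10, seed=0):
    rng = np.random.default_rng(seed)
    z = rng.integers(n_experts, size=len(X))       # random initial assignment
    thetas = [np.zeros(3) for _ in range(n_experts)]
    for _ in range(n_iter):
        # M-step: refit each expert on its currently assigned points.
        for k in range(n_experts):
            if np.sum(z == k) > 2:
                thetas[k] = fit_expert(X[z == k], y[z == k], thetas[k])
        # E-step (hard cutting): assign each point to the expert under
        # whose hyperparameters it has the highest LOO predictive density.
        logp = np.stack([loo_log_density(X, y, th) for th in thetas])
        z = np.argmax(logp, axis=0)
    return z, thetas
```

Evaluating every point's LOO density under each expert's hyperparameters, as done here, is a simplification: the paper's decomposition defines the assignment posteriors more carefully, and a soft version would replace the `argmax` with normalized responsibilities. The hard assignment is what gives the claimed scalability, since each expert's kernel matrix is built only from its own subset of the data in the M-step.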