On convergence properties of the EM algorithm for Gaussian mixtures
Neural Computation
Asymptotic Convergence Rate of the EM Algorithm for Gaussian Mixtures
Neural Computation
On the correct convergence of the EM algorithm for Gaussian mixtures
Pattern Recognition
Asymptotic convergence properties of the EM algorithm for mixture of experts
Neural Computation
Accelerating EM: an empirical study
UAI'99 Proceedings of the Fifteenth conference on Uncertainty in artificial intelligence
For Gaussian mixtures, a comparative analysis is made of the convergence rates of the Expectation-Maximization (EM) algorithm and two of its modifications. One is a variant of the EM algorithm (denoted VEM) that uses the old value of the mean vectors, rather than the latest updated one, when updating the covariance matrices. The other, called the Momentum EM algorithm (MEM), is obtained by adding a momentum term to the EM updating equation. Upper bounds on their convergence rates are obtained, extending and modifying those given in Xu & Jordan (1996). It is shown that the EM algorithm and VEM are equivalent in their local convergence behavior and rates, and that MEM can speed up the convergence of the EM algorithm if a suitable amount of momentum is added. Moreover, a theoretical guide on how to choose the momentum is proposed, and a possible approach for further speeding up the convergence is suggested.
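The momentum modification described in the abstract can be sketched as follows. This is an illustrative NumPy implementation for a two-component one-dimensional Gaussian mixture, not the paper's own code: each iteration computes the plain EM update as a target and then moves the parameter vector by the EM step plus `gamma` times the previous step, so `gamma = 0` recovers standard EM. The function name, the quantile-based initialization, and the fixed choice of `gamma` are assumptions for the sketch; the paper's theoretical guide for choosing the momentum is not reproduced here.

```python
import numpy as np

def em_gmm_momentum(x, n_iter=100, gamma=0.0):
    """EM for a two-component 1-D Gaussian mixture with optional momentum (MEM).

    x       : 1-D data array
    n_iter  : number of EM iterations
    gamma   : momentum coefficient; gamma = 0 gives the standard EM algorithm.
    Returns (weights, means, variances). Illustrative sketch only.
    """
    # Deterministic initialization: separated quantiles of the data.
    mu = np.quantile(x, [0.25, 0.75])
    var = np.full(2, x.var())
    pi = np.full(2, 0.5)
    theta = np.concatenate([pi, mu, var])   # flat parameter vector
    prev_step = np.zeros_like(theta)

    for _ in range(n_iter):
        pi, mu, var = theta[:2], theta[2:4], theta[4:]
        # E-step: posterior responsibilities r[n, k].
        dens = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
               / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        nk = r.sum(axis=0)
        # M-step: the plain EM update, used as the target point.
        new_pi = nk / len(x)
        new_mu = (r * x[:, None]).sum(axis=0) / nk
        new_var = (r * (x[:, None] - new_mu) ** 2).sum(axis=0) / nk
        target = np.concatenate([new_pi, new_mu, new_var])
        # MEM update: EM step plus a momentum term on the previous step.
        step = (target - theta) + gamma * prev_step
        theta = theta + step
        prev_step = step
        # Keep variances positive; momentum can overshoot.
        theta[4:] = np.maximum(theta[4:], 1e-6)

    return theta[:2], theta[2:4], theta[4:]
```

Because the mixing weights produced by the M-step always sum to one, the momentum steps in the weight coordinates sum to zero, so the weights remain normalized without an explicit projection. A suitable small positive `gamma` accelerates convergence toward the same fixed point as EM; too large a value can overshoot, which is why the paper's guidance on choosing the momentum matters.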