Accelerating Expectation-Maximization Algorithms with Frequent Updates

  • Authors:
  • Jiangtao Yin, Yanfeng Zhang, Lixin Gao

  • Venue:
  • CLUSTER '12: Proceedings of the 2012 IEEE International Conference on Cluster Computing
  • Year:
  • 2012

Abstract

Expectation-Maximization (EM) is a popular approach for parameter estimation in many applications, such as image understanding, document classification, and genome data analysis. Despite the popularity of EM algorithms, it is challenging to implement them efficiently in a distributed environment. In particular, EM algorithms that update the parameters frequently have been shown to converge much faster than their counterparts with concurrent updates, yet frequent updates are difficult to parallelize. Accordingly, we propose two approaches to parallelizing such EM algorithms in a distributed environment so as to scale to massive data sets. We prove that both approaches maintain the convergence properties of the EM algorithms. Based on these approaches, we design and implement a distributed framework, FreEM, to support the implementation of frequent updates for EM algorithms. We demonstrate its efficiency through three well-known EM applications: k-means clustering, fuzzy c-means clustering, and parameter estimation for Gaussian mixture models. We evaluate the framework on both a local cluster of machines and the Amazon EC2 cloud. Our evaluation shows that EM algorithms with frequent updates implemented on FreEM run much faster than implementations with traditional concurrent updates.
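
To make the contrast between the two update styles concrete, the following is a minimal single-machine sketch of k-means (one of the three applications the paper evaluates), comparing a concurrent (batch) update pass with a frequent (block-wise) update pass. This is an illustration only, assuming a simple online-mean update rule; it is not the FreEM framework or the paper's exact scheme, and the function names, block size, and toy data are assumptions made for the example.

```python
# A minimal, single-machine sketch contrasting concurrent (batch) k-means
# updates with frequent (block-wise) updates. Illustrative only: this is not
# the FreEM framework, and the running-mean rule below is a simplified
# stand-in for the paper's frequent-update scheme.
import numpy as np


def kmeans_concurrent(points, centers, iters=10):
    """Concurrent updates: centers are recomputed only after every point
    in the data set has been assigned (one E-step, then one M-step)."""
    centers = centers.copy()
    for _ in range(iters):
        dists = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = dists.argmin(axis=1)
        for k in range(len(centers)):
            members = points[labels == k]
            if len(members) > 0:
                centers[k] = members.mean(axis=0)
    return centers


def kmeans_frequent(points, centers, iters=10, block=32):
    """Frequent updates: centers are refreshed after each small block of
    points, so later blocks in the same pass already see newer parameters."""
    centers = centers.copy()
    counts = np.ones(len(centers))  # start at 1 to avoid division by zero
    for _ in range(iters):
        for start in range(0, len(points), block):
            chunk = points[start:start + block]
            dists = ((chunk[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
            labels = dists.argmin(axis=1)
            for x, k in zip(chunk, labels):
                counts[k] += 1
                centers[k] += (x - centers[k]) / counts[k]  # running mean
    return centers


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = np.vstack([rng.normal(m, 0.3, size=(200, 2)) for m in (0.0, 3.0, 6.0)])
    init = data[rng.choice(len(data), 3, replace=False)]
    print("concurrent:", kmeans_concurrent(data, init))
    print("frequent:  ", kmeans_frequent(data, init))
```

The point of the sketch is that the frequent-update variant refreshes the cluster centers after every small block of points, so points processed later in the same pass are assigned using newer parameters; this is the property that makes frequent updates converge in fewer passes, and that the paper's two parallelization approaches aim to preserve in a distributed setting.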