Motivated by the poor performance (linear complexity) of the EM algorithm in clustering large data sets, and inspired by the successful accelerated versions of related algorithms like k-means, we derive an accelerated variant of the EM algorithm for Gaussian mixtures that: (1) offers speedups that are at least linear in the number of data points, (2) ensures convergence by strictly increasing a lower bound on the data log-likelihood in each learning step, and (3) allows ample freedom in the design of other accelerated variants. We also derive a similar accelerated algorithm for greedy mixture learning, where very satisfactory results are obtained. The core idea is to define a lower bound on the data log-likelihood based on a grouping of data points. The bound is maximized by computing in turn (i) optimal assignments of groups of data points to the mixture components, and (ii) optimal re-estimation of the model parameters based on average sufficient statistics computed over groups of data points. The proposed method naturally generalizes to mixtures of other members of the exponential family. Experimental results show the potential of the proposed method over other state-of-the-art acceleration techniques.
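The two alternating steps described above — (i) assigning shared responsibilities to whole groups of points and (ii) re-estimating parameters from per-group average sufficient statistics — can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes spherical Gaussian components, contiguous (and roughly homogeneous) groups, and a simple farthest-point initialisation over group means; the function names `group_stats` and `grouped_em` are ours.

```python
import numpy as np

def group_stats(X, n_groups):
    """Per-group sufficient statistics: (count, mean, mean squared norm).

    The E- and M-steps below touch only these statistics, so their cost
    scales with the number of groups rather than the number of points.
    """
    chunks = np.array_split(np.arange(len(X)), n_groups)
    return [(len(c), X[c].mean(0), (X[c] ** 2).sum(1).mean()) for c in chunks]

def grouped_em(stats, K, n_iter=100):
    """EM over group statistics for a mixture of K spherical Gaussians."""
    d = stats[0][1].shape[0]
    ns = np.array([n for n, _, _ in stats], float)   # group sizes
    ms = np.stack([m for _, m, _ in stats])          # group means
    qs = np.array([q for _, _, q in stats])          # group E[|x|^2]
    N = ns.sum()
    # Farthest-point initialisation over group means (an assumption of
    # this sketch, chosen for determinism).
    mu = ms[[0]]
    for _ in range(1, K):
        dist = ((ms[:, None, :] - mu[None]) ** 2).sum(-1).min(1)
        mu = np.vstack([mu, ms[dist.argmax()]])
    var = np.ones(K)
    pi = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # E-step: one shared responsibility vector per group, from the
        # group-averaged expected log-likelihood under each component.
        d2 = qs[:, None] - 2 * ms @ mu.T + (mu ** 2).sum(1)   # (S, K)
        logr = np.log(pi) - 0.5 * d * np.log(2 * np.pi * var) - d2 / (2 * var)
        logr -= logr.max(1, keepdims=True)
        r = np.exp(logr)
        r /= r.sum(1, keepdims=True)
        # M-step: parameters from group statistics weighted by group size.
        w = ns[:, None] * r                                   # effective counts
        Nk = w.sum(0)
        pi = Nk / N
        mu = (w.T @ ms) / Nk[:, None]
        d2 = qs[:, None] - 2 * ms @ mu.T + (mu ** 2).sum(1)   # with new means
        var = np.maximum((w * d2).sum(0) / (d * Nk), 1e-6)
    return pi, mu, var
```

Because each iteration processes group statistics rather than individual points, the per-iteration cost depends on the number of groups; the assignment step never splits a group across components, which is what makes the objective a lower bound on the data log-likelihood rather than the exact EM objective.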