More influence means less work: fast latent Dirichlet allocation by influence scheduling

  • Authors:
  • Mirwaes Wahabzada; Kristian Kersting; Anja Pilz; Christian Bauckhage

  • Affiliations:
  • Fraunhofer Institute for Intelligent Analysis and Information Systems, Sankt Augustin, Germany (all authors)

  • Venue:
  • Proceedings of the 20th ACM international conference on Information and knowledge management
  • Year:
  • 2011


Abstract

There have recently been considerable advances in fast inference for (online) latent Dirichlet allocation (LDA). While it is widely recognized that the scheduling of documents in stochastic optimization, and in turn in LDA, may have significant consequences, this issue remains largely unexplored. Instead, practitioners schedule documents essentially uniformly at random, perhaps because this is easy to implement and because clear guidelines for scheduling documents are lacking. In this work, we address this issue and propose to schedule documents that exert a disproportionately large influence on the topics of the corpus for updates before less influential ones. More precisely, we justify forming mini-batches by sampling documents with probability biased towards those with higher norms. On several real-world datasets, including 3M articles from Wikipedia and 8M from PubMed, we demonstrate that the resulting influence-scheduled LDA can handily analyze massive document collections and finds topic models as good as or better than those found with online LDA, often in a fraction of the time.
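
The scheduling rule described in the abstract, sampling documents into mini-batches with probability biased towards higher-norm documents rather than uniformly at random, can be illustrated with a short sketch. The snippet below is a minimal illustration under assumed details (the L2 norm of each document's bag-of-words vector as the influence proxy, NumPy for the biased sampling); it is not the authors' implementation and omits the online LDA update step itself.

    import numpy as np

    def sample_influence_minibatch(doc_term_counts, batch_size, rng=None):
        # Influence-scheduling sketch (assumed details): draw document
        # indices with probability proportional to each document's norm,
        # so high-influence documents are scheduled for updates earlier
        # and more often than low-influence ones.
        rng = np.random.default_rng() if rng is None else rng
        norms = np.linalg.norm(doc_term_counts, axis=1)  # influence proxy
        probs = norms / norms.sum()
        return rng.choice(len(doc_term_counts), size=batch_size,
                          replace=False, p=probs)

    # Toy usage: five documents over a four-term vocabulary.
    docs = np.array([[1, 0, 2, 0],
                     [5, 3, 4, 1],
                     [0, 1, 0, 0],
                     [2, 2, 2, 2],
                     [9, 0, 0, 7]], dtype=float)
    batch = sample_influence_minibatch(docs, batch_size=2)
    # The returned indices would then be fed to an online LDA update.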