The Dirichlet process mixture (DPM) model is one of the most important Bayesian nonparametric models, owing to its efficient inference and its flexibility across applications. A fundamental assumption of the DPM model is that all data items are generated from a single, shared DP. This assumption, however, is restrictive in many practical settings where samples are generated from a collection of dependent DPs, each associated with a point in some covariate space. For example, documents in the proceedings of a conference are organized by year, and photos may be tagged and recorded with GPS locations. We present a general method for constructing dependent Dirichlet processes (DPs) on an arbitrary covariate space. The approach is based on restricting and projecting a DP defined on a space of continuous functions with varying domains, which yields a collection of dependent random measures, each associated with a point in the covariate space and marginally DP-distributed. The constructed collection of dependent DPs can serve as a nonparametric prior for infinite dynamic mixture models, which allow each mixture component to appear, disappear, and vary within a subspace of the covariate space. Furthermore, we discuss choices of base distributions over functions as a flexible means of controlling dependencies in a variety of settings. In addition, we develop an efficient Gibbs sampler for model inference in which all underlying random measures are integrated out. Finally, experimental results on temporal and spatial modeling datasets demonstrate the effectiveness of the method for modeling dynamic mixtures over different types of covariates.
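The restrict-and-project idea can be sketched in a minimal, hypothetical form: draw a truncated stick-breaking DP whose atoms each carry a random interval in a one-dimensional covariate (e.g. time) space; at a covariate value `t`, restrict to atoms whose interval covers `t` and renormalize the surviving weights, so each local measure is again a (truncated) discrete random measure. All names and distributional choices below (interval domains, exponential lengths, the truncation level `K`) are illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def truncated_dp_weights(alpha, K, rng):
    # Stick-breaking weights for a DP with concentration alpha,
    # truncated at K atoms (weights sum to slightly less than 1).
    v = rng.beta(1.0, alpha, size=K)
    return v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))

# Global draw: atom locations from a Gaussian base measure, plus a
# random interval [start, end] in covariate space for each atom,
# marking where that mixture component "lives".
alpha, K = 2.0, 50
weights = truncated_dp_weights(alpha, K, rng)
means = rng.normal(0.0, 3.0, size=K)
starts = rng.uniform(0.0, 1.0, size=K)
ends = starts + rng.exponential(0.3, size=K)

def local_measure(t):
    # Restrict: keep atoms whose domain covers covariate t.
    # Project: renormalize the surviving weights to sum to 1.
    alive = (starts <= t) & (t <= ends)
    w = weights[alive]
    return means[alive], w / w.sum()
```

Calling `local_measure` at nearby covariate values shares most atoms (hence the dependence between the local measures), while atoms appear and disappear as `t` crosses their interval endpoints, mirroring the appear/disappear behavior of mixture components described above.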