We develop a hidden Markov mixture model based on a Dirichlet process (DP) prior, for representing the statistics of sequential data for which a single hidden Markov model (HMM) may not be sufficient. The DP prior has an intrinsic clustering property that encourages parameter sharing, and this naturally reveals the proper number of mixture components. The evaluation of posterior distributions for all model parameters is achieved in two ways: 1) via a rigorous Markov chain Monte Carlo method; and 2) approximately and efficiently via a variational Bayes formulation. Using DP HMM mixture models in a Bayesian setting, we propose a novel scheme for music analysis, highlighting the effectiveness of the DP HMM mixture model. Music is treated as a time-series data sequence, and each music piece is represented as a mixture of HMMs. We approximate the similarity of two music pieces by computing the distance between the associated HMM mixtures. Experimental results are presented for synthesized sequential data and for classical music clips. Music similarities computed using DP HMM mixture modeling are compared to those computed from Gaussian mixture modeling, for which the mixture modeling is also performed using a DP. The results show that the performance of DP HMM mixture modeling exceeds that of DP Gaussian mixture modeling.
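The clustering property of the DP prior that the abstract relies on can be illustrated with the standard stick-breaking construction: component weights decay quickly, so draws concentrate on a small number of components, which is what "naturally reveals the proper number of mixture components." The sketch below is a minimal, generic illustration of that property, not the paper's inference procedure; the concentration parameter `alpha`, the truncation level, and the sample size are illustrative choices.

```python
import random

def stick_breaking(alpha, truncation, rng):
    """Truncated stick-breaking weights for a DP with concentration alpha.

    Each Beta(1, alpha) draw takes a fraction of the remaining stick;
    the leftover mass is assigned to a final tail component.
    """
    weights = []
    remaining = 1.0
    for _ in range(truncation):
        v = rng.betavariate(1.0, alpha)
        weights.append(remaining * v)
        remaining *= (1.0 - v)
    weights.append(remaining)  # residual tail mass so weights sum to 1
    return weights

def sample_assignments(weights, n, rng):
    """Draw n component assignments from the DP mixture weights."""
    return [rng.choices(range(len(weights)), weights=weights)[0]
            for _ in range(n)]

rng = random.Random(0)
w = stick_breaking(alpha=1.0, truncation=20, rng=rng)
z = sample_assignments(w, n=100, rng=rng)
print(abs(sum(w) - 1.0) < 1e-9)  # weights form a valid distribution
print(len(set(z)))               # far fewer than 21 components are actually used
```

In a DP HMM mixture, each of these components would index a full HMM (rather than, say, a single Gaussian), and the assignments would cluster music segments to a small, data-driven number of HMMs.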