Mixture models, such as the Gaussian mixture model (GMM), have been widely used to model data in many applications. The GMM assumes that all data points are generated from a set of Gaussian components sharing a single set of mixture weights. A natural extension of the GMM is the probabilistic latent semantic analysis (PLSA) model, which assigns a separate set of mixture weights to each data point, making PLSA more flexible than the GMM. As a tradeoff, however, PLSA often suffers from overfitting. In this paper, we propose a regularized probabilistic latent semantic analysis (RPLSA) model, which adjusts the amount of model flexibility so that the training data are fit well while the model remains robust against overfitting. We conduct an empirical study on the task of speaker identification to show the effectiveness of the new model. Experimental results on the NIST speaker recognition dataset indicate that RPLSA substantially outperforms both the GMM and PLSA models. The principle behind RPLSA, appropriately adjusting model flexibility, extends naturally to other applications and other types of mixture models.
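The key distinction in the abstract — a single shared weight vector (GMM) versus per-data-point weights (PLSA) — can be illustrated with a minimal likelihood computation. The sketch below is purely illustrative and assumes univariate Gaussian components; the function names and the two-component toy data are hypothetical, and the RPLSA regularization itself is not shown.

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    # Univariate Gaussian density, serving as the mixture component.
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def component_densities(X, mus, sigmas):
    # densities[i, k] = N(x_i | mu_k, sigma_k)
    return np.array([[gaussian_pdf(x, m, s) for m, s in zip(mus, sigmas)]
                     for x in X])

def gmm_log_likelihood(X, mus, sigmas, weights):
    # GMM: one weight vector shared by every data point.
    dens = component_densities(X, mus, sigmas)
    return float(np.sum(np.log(dens @ weights)))

def plsa_style_log_likelihood(X, mus, sigmas, per_point_weights):
    # PLSA-style: data point i has its own weight vector w_i, so the
    # model is more flexible -- and more prone to overfitting, since
    # each point can concentrate weight on its best-fitting component.
    dens = component_densities(X, mus, sigmas)
    return float(np.sum(np.log(np.sum(dens * per_point_weights, axis=1))))

# Hypothetical toy data: two points, each sitting on one component mean.
X = np.array([-1.0, 1.0])
mus, sigmas = np.array([-1.0, 1.0]), np.array([1.0, 1.0])

ll_gmm = gmm_log_likelihood(X, mus, sigmas, np.array([0.5, 0.5]))
ll_plsa = plsa_style_log_likelihood(X, mus, sigmas,
                                    np.array([[1.0, 0.0], [0.0, 1.0]]))
```

With per-point weights free to concentrate on the best-fitting component, the PLSA-style likelihood is at least as high as the GMM likelihood on the training data, which is exactly the extra flexibility (and overfitting risk) that RPLSA's regularization is designed to control.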