We present a Bayesian method for mixture model training that jointly addresses the feature selection and model selection problems. The method integrates a mixture model formulation that accounts for the saliency of the features with a Bayesian approach to mixture learning that estimates the number of mixture components. The proposed learning algorithm follows the variational framework and simultaneously optimizes over the number of components, the feature saliencies, and the parameters of the mixture model. Experimental results on high-dimensional artificial and real data illustrate the effectiveness of the method.
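The abstract does not include the algorithm itself. As a rough, hedged illustration of the model-selection side only (inferring the number of mixture components under a variational framework), the sketch below uses scikit-learn's `BayesianGaussianMixture`, a standard variational mixture learner. This is an assumption-laden stand-in, not the paper's method: it omits the feature-saliency term entirely, and the data, priors, and thresholds are illustrative choices.

```python
# Sketch: variational Bayesian mixture learning prunes unneeded components.
# This is NOT the paper's algorithm (no feature-saliency modeling); it only
# illustrates that variational Bayes can estimate the number of components.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Synthetic data: three well-separated 2-D clusters (illustrative choice).
X = np.vstack([
    rng.normal(loc=c, scale=0.3, size=(200, 2))
    for c in ([0.0, 0.0], [5.0, 5.0], [-5.0, 5.0])
])

# Overprovision with 10 components; a small Dirichlet concentration prior
# drives the weights of superfluous components toward zero during fitting.
bgm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior=1e-2,
    max_iter=500,
    random_state=0,
).fit(X)

# Count components whose posterior weight is non-negligible (threshold is
# an illustrative cutoff, not a principled criterion).
effective = int(np.sum(bgm.weights_ > 0.02))
print(effective)
```

On this toy data the learner should concentrate mass on roughly three components, mirroring the idea that the number of components need not be fixed in advance. The paper's contribution goes further by also inferring per-feature saliencies within the same variational optimization, which this library-based sketch does not attempt.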