In unsupervised learning, mixtures of Gaussians have become a popular tool for statistical modeling. For this class of generative models, we present a complexity control scheme that provides an effective means of avoiding the overfitting usually encountered with unconstrained (mixtures of) Gaussians in high dimensions. Given a prespecified level of resolution, as implied by a fixed-variance noise model, the scheme automatically selects the dimensionalities of the local signal subspaces by maximum likelihood estimation. Together with a resolution-based control scheme for adjusting the number of mixture components, this yields an incremental model refinement procedure within a common deterministic annealing framework, enabling an efficient exploration of the model space. The advantages of the resolution-based framework are illustrated by experimental results on synthetic and high-dimensional real-world data.
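The core idea of resolution-based dimensionality selection can be sketched as follows: under a fixed-variance (isotropic) noise model, the maximum-likelihood choice keeps exactly those principal directions of a component's local covariance whose eigenvalues exceed the noise level σ². This is a minimal illustrative sketch, not the paper's implementation; the function name `select_subspace_dim` and the threshold value are assumptions for the example.

```python
import numpy as np

def select_subspace_dim(X, noise_var):
    """Pick the local signal-subspace dimensionality: retain principal
    directions whose variance exceeds the fixed noise variance sigma^2."""
    Xc = X - X.mean(axis=0)                       # center the local data
    cov = Xc.T @ Xc / len(X)                      # sample covariance
    eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]  # descending eigenvalues
    return int(np.sum(eigvals > noise_var))      # count directions above noise floor

# Toy data: a 3-dimensional signal embedded in 10 dimensions plus isotropic noise.
rng = np.random.default_rng(0)
signal = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 10)) * 2.0
X = signal + rng.normal(scale=0.5, size=(500, 10))

# The prespecified resolution sigma^2 acts as the eigenvalue threshold;
# 1.0 is a hypothetical value chosen well above the 0.25 noise variance.
dim = select_subspace_dim(X, noise_var=1.0)
```

In the mixture setting, this selection would be applied per component on its responsibility-weighted covariance, with the same σ² governing both the subspace dimensions and, via the annealing schedule, the number of components.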