Most current approaches to mixture modeling draw their components from a small set of probability distribution families, in particular the Gaussian family. The reason for this preference lies in the training algorithms, typically variants of the Expectation-Maximization (EM) method: the re-estimation equations this method requires become very complex as the mixture components depart from the simplest cases. Here we propose a stochastic approximation method for probabilistic mixture learning. Under this method it is straightforward to train mixtures whose components come from a wide range of distribution families, making it a flexible alternative for mixture learning. Experimental results demonstrate the probability density estimation and missing-value estimation capabilities of our proposal.
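The abstract does not spell out the update rule, but the general idea of stochastic approximation for mixture learning can be illustrated with online gradient ascent on the mixture log-likelihood, using a Robbins-Monro decaying step size. The sketch below is a hypothetical illustration for a one-dimensional Gaussian mixture (the function name, parameterization, and step-size schedule are assumptions, not the paper's actual method); the same per-sample gradient updates would apply to components from other families by swapping in their score functions.

```python
import numpy as np

def sga_mixture_1d(data, n_components=2, n_epochs=20, seed=0):
    """Stochastic gradient ascent on the log-likelihood of a 1-D Gaussian
    mixture. Parameters: softmax logits for the mixing weights, component
    means, and log standard deviations (so all updates are unconstrained).
    """
    rng = np.random.default_rng(seed)
    logits = np.zeros(n_components)
    # initialize means at evenly spaced data quantiles
    means = np.quantile(data, (np.arange(n_components) + 0.5) / n_components)
    log_sd = np.zeros(n_components)
    t = 0
    for _ in range(n_epochs):
        for x in rng.permutation(data):
            t += 1
            eta = 1.0 / (10.0 + t) ** 0.6          # Robbins-Monro step size
            w = np.exp(logits - logits.max())
            w /= w.sum()
            sd = np.exp(log_sd)
            diff = x - means
            # component densities and responsibilities r_k = p(k | x)
            dens = w * np.exp(-0.5 * (diff / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
            r = dens / (dens.sum() + 1e-300)
            # gradients of log p(x) w.r.t. each parameter block
            logits += eta * (r - w)                # softmax-weight gradient
            means += eta * r * diff / sd ** 2
            log_sd += eta * r * ((diff / sd) ** 2 - 1.0)
    w = np.exp(logits - logits.max())
    w /= w.sum()
    return w, means, np.exp(log_sd)
```

Because each update only needs the per-sample gradient of the component log-density, replacing the Gaussian with, say, a Student's-t component changes only the `dens` and gradient lines, which is the kind of flexibility the abstract argues for.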