Probability Density Estimation Using Adaptive Activation Function Neurons
Neural Processing Letters
We address the problem of probability density function estimation using a Gaussian mixture model updated with the expectation-maximization (EM) algorithm. To deal with the case of an unknown number of mixing kernels, we define a new measure for Gaussian mixtures, called total kurtosis, which is based on the weighted sample kurtoses of the kernels. This measure provides an indication of how well the Gaussian mixture fits the data. We then propose a new dynamic algorithm for Gaussian mixture density estimation which monitors the total kurtosis at each step of the EM algorithm in order to decide dynamically on the correct number of kernels and, possibly, to escape from local maxima. We show the potential of our technique in approximating unknown densities through a series of examples involving several density estimation problems.
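The abstract does not reproduce the exact definition of total kurtosis, so the following is only a minimal 1-D sketch of the idea it describes: each kernel's sample kurtosis is computed with the EM responsibilities as weights, and the per-kernel deviations from the Gaussian value of 3 are combined using the mixing weights. The function names (`em_step`, `total_kurtosis`) and the precise combination formula are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def em_step(x, w, mu, var):
    """One EM iteration for a 1-D Gaussian mixture; returns updated
    parameters and the responsibilities gamma (n points x k kernels)."""
    diff = x[:, None] - mu[None, :]
    log_p = -0.5 * (diff ** 2 / var + np.log(2.0 * np.pi * var))
    p = w * np.exp(log_p)                      # joint density p(x_i, j)
    gamma = p / p.sum(axis=1, keepdims=True)   # E-step: responsibilities
    nj = gamma.sum(axis=0)                     # effective kernel counts
    w = nj / len(x)                            # M-step updates
    mu = (gamma * x[:, None]).sum(axis=0) / nj
    var = (gamma * (x[:, None] - mu) ** 2).sum(axis=0) / nj
    return w, mu, var, gamma

def total_kurtosis(x, w, mu, var, gamma):
    """Responsibility-weighted sample kurtosis per kernel, combined into one
    score.  A kernel that truly fits Gaussian data has kurtosis near 3, so
    the mixing-weight-averaged deviation from 3 signals misfit (assumed form)."""
    z = (x[:, None] - mu) / np.sqrt(var)
    k = (gamma * z ** 4).sum(axis=0) / gamma.sum(axis=0)
    return float(np.sum(w * np.abs(k - 3.0)))

# Demo: bimodal data.  A single kernel is a poor (platykurtic) fit, so its
# total kurtosis deviates strongly from 3; two well-placed kernels fit well.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-4.0, 1.0, 200), rng.normal(4.0, 1.0, 200)])

w1, mu1, var1 = np.array([1.0]), np.array([x.mean()]), np.array([x.var()])
for _ in range(20):
    w1, mu1, var1, g1 = em_step(x, w1, mu1, var1)
tk_one = total_kurtosis(x, w1, mu1, var1, g1)

w2, mu2, var2 = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([4.0, 4.0])
for _ in range(20):
    w2, mu2, var2, g2 = em_step(x, w2, mu2, var2)
tk_two = total_kurtosis(x, w2, mu2, var2, g2)

print(tk_one, tk_two)  # the one-kernel fit scores much worse
```

In the dynamic algorithm the abstract describes, a score like `tk_one` computed during EM would trigger a change in the number of kernels; the split/merge policy itself is not specified in the abstract and is omitted here.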