Evolutionary algorithms in theory and practice: evolution strategies, evolutionary programming, genetic algorithms
Bayesian Ying-Yang machine, clustering and number of clusters
Pattern Recognition Letters - special issue on pattern recognition in practice V
Learning mixture models using a genetic version of the EM algorithm
Pattern Recognition Letters
How to solve it: modern heuristics
Unsupervised Learning of Finite Mixture Models
IEEE Transactions on Pattern Analysis and Machine Intelligence
Efficient greedy learning of Gaussian mixture models
Neural Computation
BYY harmony learning, structural RPCL, and topological self-organizing on mixture models
Neural Networks - New developments in self-organizing maps
Learning Mixtures of Gaussians
FOCS '99 Proceedings of the 40th Annual Symposium on Foundations of Computer Science
Pattern Classification (2nd Edition)
SMEM Algorithm for Mixture Models
Neural Computation
On convergence properties of the EM algorithm for Gaussian mixtures
Neural Computation
Clustering in image space for place recognition and visual annotations for human-robot interaction
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
An integer-coded evolutionary approach for mixture maximum likelihood clustering
Pattern Recognition Letters
Spatial Fuzzy Clustering Using Varying Coefficients
ADMA '07 Proceedings of the 3rd international conference on Advanced Data Mining and Applications
On-line Arabic handwriting recognition system based on visual encoding and genetic algorithm
Engineering Applications of Artificial Intelligence
Evolutionary maximum likelihood image compression
Proceedings of the 11th Annual conference on Genetic and evolutionary computation
Pattern Recognition Letters
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics - Special issue on human computing
The use of vanishing point for the classification of reflections from foreground mask in videos
IEEE Transactions on Image Processing
Learning mixture models via component-wise parameter smoothing
Computational Statistics & Data Analysis
Learning the number of Gaussians using hypothesis test
IJCNN'09 Proceedings of the 2009 international joint conference on Neural Networks
Data classification with a generalized Gaussian components based density estimation algorithm
IJCNN'09 Proceedings of the 2009 international joint conference on Neural Networks
Learning Gaussian mixture models with entropy-based criteria
IEEE Transactions on Neural Networks
Improving motif refinement using hybrid expectation maximization and random projection
ISB '10 Proceedings of the International Symposium on Biocomputing
A fast nonparametric noncausal MRF-based texture synthesis scheme using a novel FKDE algorithm
IEEE Transactions on Image Processing
PAKDD'08 Proceedings of the 12th Pacific-Asia conference on Advances in knowledge discovery and data mining
Quantization-based clustering algorithm
Pattern Recognition
Energy-efficient estimation of clock offset for inactive nodes in wireless sensor networks
IEEE Transactions on Information Theory
Morphological evolution of freeform robots
Proceedings of the 12th annual conference on Genetic and evolutionary computation
Image change detection using Gaussian mixture model and genetic algorithm
Journal of Visual Communication and Image Representation
Large margin learning of Bayesian classifiers based on Gaussian mixture models
ECML PKDD'10 Proceedings of the 2010 European conference on Machine learning and knowledge discovery in databases: Part III
Dual stream speech recognition using articulatory syllable models
International Journal of Speech Technology
Learning latent variable models from distributed and abstracted data
Information Sciences: an International Journal
Genetic algorithm for finding cluster hierarchies
DEXA'11 Proceedings of the 22nd international conference on Database and expert systems applications - Volume Part I
ACO-based BW algorithm for parameter estimation of hidden Markov models
International Journal of Computer Applications in Technology
Identification of nuclear magnetic resonance signals via Gaussian mixture decomposition
IDA'11 Proceedings of the 10th international conference on Advances in intelligent data analysis X
Maximum likelihood estimation of Gaussian mixture models using stochastic search
Pattern Recognition
ICDM'06 Proceedings of the 6th Industrial Conference on Data Mining conference on Advances in Data Mining: applications in Medicine, Web Mining, Marketing, Image and Signal Mining
Image segmentation for robots: fast self-adapting Gaussian mixture model
ICIAR'10 Proceedings of the 7th international conference on Image Analysis and Recognition - Volume Part I
VoCS'08 Proceedings of the 2008 international conference on Visions of Computer Science: BCS International Academic Conference
Remote sensing image change detection method based on contextual information
IScIDE'11 Proceedings of the Second Sino-foreign-interchange conference on Intelligent Science and Intelligent Data Engineering
Random swap EM algorithm for Gaussian mixture models
Pattern Recognition Letters
Using evolutionary algorithms for model-based clustering
Pattern Recognition Letters
Probability-based text clustering algorithm by alternately repeating two operations
Journal of Information Science
Learning symbolic representations of hybrid dynamical systems
The Journal of Machine Learning Research
We propose a genetic-based expectation-maximization (GA-EM) algorithm for learning Gaussian mixture models from multivariate data. The algorithm selects the number of mixture components using the minimum description length (MDL) criterion. Our approach combines the strengths of genetic algorithms (GAs) and the EM algorithm in a single procedure: the population-based stochastic search of the GA explores the search space more thoroughly than EM alone, so the algorithm can escape locally optimal solutions and is less sensitive to its initialization. GA-EM is elitist, which preserves the monotonic convergence property of the EM algorithm. Experiments on simulated and real data show that GA-EM outperforms EM in two respects: (1) it achieves a better MDL score under exactly the same termination condition for both algorithms, and (2) it identifies the number of components used to generate the underlying data more often than the EM algorithm.
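The idea in the abstract can be illustrated with a toy sketch: a population of candidate Gaussian mixtures, each refined by a few EM iterations, scored by MDL, with elitism keeping the best-ever model so the best MDL score never worsens across generations. This is not the authors' implementation — the paper's GA uses proper genetic operators on encoded component sets, whereas here fresh random restarts stand in for crossover and mutation, the mixture is 1-D, and all function names are hypothetical.

```python
# Toy GA-EM sketch (1-D GMM): hypothetical names, random restarts in
# place of real genetic operators; for illustration only.
import math, random

def em_step(data, weights, means, variances):
    # One EM iteration. E-step: per-point responsibilities; M-step: update parameters.
    K, n = len(means), len(data)
    resp = []
    for x in data:
        p = [w * math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)
             for w, m, v in zip(weights, means, variances)]
        s = sum(p) or 1e-300
        resp.append([pi / s for pi in p])
    nk = [max(sum(r[k] for r in resp), 1e-12) for k in range(K)]  # guard empty components
    weights = [nk[k] / n for k in range(K)]
    means = [sum(r[k] * x for r, x in zip(resp, data)) / nk[k] for k in range(K)]
    variances = [max(sum(r[k] * (x - means[k]) ** 2 for r, x in zip(resp, data)) / nk[k], 1e-6)
                 for k in range(K)]
    return weights, means, variances

def log_likelihood(data, weights, means, variances):
    ll = 0.0
    for x in data:
        p = sum(w * math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)
                for w, m, v in zip(weights, means, variances))
        ll += math.log(p or 1e-300)
    return ll

def mdl(data, weights, means, variances):
    # MDL = -log L + (P/2) log n, with P = (K-1) weights + K means + K variances.
    K, n = len(means), len(data)
    P = 3 * K - 1
    return -log_likelihood(data, weights, means, variances) + 0.5 * P * math.log(n)

def random_model(data, K, rng):
    # Uniform weights, means drawn from the data, unit variances.
    return ([1.0 / K] * K, [rng.choice(data) for _ in range(K)], [1.0] * K)

def ga_em(data, max_K=4, pop_size=6, generations=10, em_steps=5, seed=0):
    rng = random.Random(seed)
    pop = [random_model(data, rng.randint(1, max_K), rng) for _ in range(pop_size)]
    best, best_score = None, float("inf")
    for _ in range(generations):
        scored = []
        for w, m, v in pop:
            for _ in range(em_steps):          # refine each candidate with EM
                w, m, v = em_step(data, w, m, v)
            scored.append((mdl(data, w, m, v), (w, m, v)))
        scored.sort(key=lambda t: t[0])
        if scored[0][0] < best_score:          # elitism: best-ever MDL never worsens
            best_score, best = scored[0]
        survivors = [model for _, model in scored[: pop_size // 2]]
        pop = [best] + survivors + [random_model(data, rng.randint(1, max_K), rng)
                                    for _ in range(pop_size - len(survivors) - 1)]
    return best, best_score
```

Because candidates vary in their number of components, MDL (rather than raw likelihood) is the fitness: the `(P/2) log n` penalty keeps larger mixtures from winning merely by having more free parameters.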