This paper addresses the problem of estimating a density with either compact support or support bounded at one end, exploiting a general and natural form of a finite mixture of distributions. Given the importance of the concept of multimodality in the mixture framework, unimodal beta and gamma densities are used as mixture components, leading to a flexible modeling approach. Accordingly, a mode-based parameterization of the components is provided. A partitional clustering method, named $k$-bumps, is also proposed; it serves as an ad hoc initialization strategy for the EM algorithm used to obtain maximum likelihood estimates of the mixture parameters. The performance of the $k$-bumps algorithm as an initialization tool, compared with other common initialization strategies, is evaluated in simulation experiments. Finally, two real applications are presented.
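The abstract's mode-based parameterization is not spelled out here, but for a beta component one standard reparameterization works as follows: a Beta$(\alpha,\beta)$ density with $\alpha,\beta>1$ is unimodal with mode $m=(\alpha-1)/(\alpha+\beta-2)$, so the pair $(\alpha,\beta)$ can be replaced by a mode $m\in(0,1)$ and a concentration $c=\alpha+\beta-2>0$. The sketch below (an illustration of this generic idea, not necessarily the paper's exact parameterization; the function names are ours) converts $(m,c)$ back to shape parameters and evaluates the log-density.

```python
import math

def beta_logpdf(x, a, b):
    # Log density of Beta(a, b) at x in (0, 1), via log-gamma
    # to avoid overflow for large shape parameters.
    return ((a - 1.0) * math.log(x) + (b - 1.0) * math.log(1.0 - x)
            + math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b))

def mode_to_shape(m, c):
    # Map (mode m in (0,1), concentration c > 0) to beta shape
    # parameters (a, b). Both exceed 1, so the component is
    # guaranteed unimodal with mode exactly m:
    #   m = (a - 1) / (a + b - 2),  c = a + b - 2.
    return 1.0 + m * c, 1.0 + (1.0 - m) * c

# Example: a component with mode 0.3 and concentration 10
a, b = mode_to_shape(0.3, 10.0)   # -> shapes (4.0, 8.0)
```

Parameterizing each component directly by its mode makes the unimodality constraint trivial to enforce during EM, since any $(m, c)$ in the admissible ranges yields a valid unimodal component.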