Cluster ensembles --- a knowledge reuse framework for combining multiple partitions. The Journal of Machine Learning Research.
Combining Pattern Classifiers: Methods and Algorithms.
Solving cluster ensemble problems by bipartite graph partitioning. ICML '04: Proceedings of the Twenty-First International Conference on Machine Learning.
Clustering with Bregman Divergences. The Journal of Machine Learning Research.
Statistical Comparisons of Classifiers over Multiple Data Sets. The Journal of Machine Learning Research.
Classifier ensembles: Select real-world applications. Information Fusion.
Consensus-based ensembles of soft clusterings. Applied Artificial Intelligence.
Introduction to Semi-Supervised Learning. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery.
Combining supervised and unsupervised models via unconstrained probabilistic embedding. Information Sciences: an International Journal.
The combination of multiple classifiers into a single classifier has been shown to be very useful in practice. Similarly, several efforts have shown that cluster ensembles can improve the quality of results compared to a single clustering solution. These observations suggest that ensembles containing both classifiers and clusterers are potentially useful as well. Specifically, clusterers provide supplementary constraints that can improve the generalization capability of the resulting classifier. This paper introduces a new algorithm, named C3E, that combines ensembles of classifiers and clusterers. Our experimental evaluation of C3E shows that it provides good classification accuracy on eleven tasks derived from three real-world applications. In addition, C3E produces better results than the recently introduced Bipartite Graph-based Consensus Maximization (BGCM) algorithm, which combines multiple supervised and unsupervised models and is the algorithm most closely related to C3E.
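To make the core idea concrete, here is a minimal sketch of how cluster assignments can act as supplementary constraints on classifier outputs. This is an illustration, not the paper's actual C3E optimization (which minimizes a Bregman-divergence objective): it simply blends each point's class-probability vector with the average posterior of its cluster, so that points grouped together by the clusterer receive similar labels. The function name, the blending weight `alpha`, and the averaging scheme are all assumptions made for this example.

```python
import numpy as np

def refine_with_clusters(posteriors, cluster_ids, alpha=0.5):
    """Blend classifier posteriors with per-cluster consensus (illustrative only).

    posteriors  -- (n, k) class-probability matrix from a classifier (ensemble).
    cluster_ids -- length-n cluster assignment from a clusterer (ensemble).
    alpha       -- weight kept on the original posterior (1.0 ignores clustering).
    """
    posteriors = np.asarray(posteriors, dtype=float)
    refined = np.empty_like(posteriors)
    for c in np.unique(cluster_ids):
        members = np.where(cluster_ids == c)[0]
        # Consensus posterior of this cluster: mean over its members.
        cluster_mean = posteriors[members].mean(axis=0)
        # Pull each member's posterior toward the cluster consensus.
        refined[members] = alpha * posteriors[members] + (1 - alpha) * cluster_mean
    # Renormalize rows so each remains a probability distribution.
    return refined / refined.sum(axis=1, keepdims=True)

# Example: a low-confidence point is pulled toward its cluster's consensus.
p = np.array([[0.9, 0.1], [0.8, 0.2], [0.55, 0.45]])
clusters = np.array([0, 0, 0])
print(refine_with_clusters(p, clusters, alpha=0.5))
```

With `alpha=0.5`, the third point's probability for class 0 rises from 0.55 toward the cluster consensus of 0.75, showing how clustering constraints can sharpen uncertain predictions without overriding confident ones.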