In this paper, we propose a novel cluster-oriented ensemble classifier generation method, together with a Genetic Algorithm-based approach to optimizing its parameters. In the proposed method, the data set is partitioned into a variable number of clusters at different layers, and base classifiers are trained on the clusters of each layer. Because the number of clusters varies from layer to layer, the cluster compositions in one layer differ from those in another; this difference in cluster contents makes the base classifiers trained at different layers diverse from one another. A test pattern is classified by the base classifier of its nearest cluster at each layer, and the decisions from the different layers are fused by majority voting. The accuracy of the proposed method depends on the number of layers and on the number of clusters at each layer, so a Genetic Algorithm-based search is incorporated to obtain the optimal numbers of layers and clusters. The Genetic Algorithm is evaluated under three different objective functions: optimizing (i) accuracy, (ii) diversity, and (iii) accuracy × diversity. We have conducted a number of experiments to evaluate the effectiveness of the different objective functions.
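The layered scheme described above can be sketched in code. The sketch below is illustrative, not the paper's implementation: it assumes k-means as the clustering step and uses a per-cluster majority-class predictor as a stand-in for an arbitrary base classifier; the `LayeredClusterEnsemble` name and the fixed `clusters_per_layer` list are hypothetical (in the paper, the number of layers and clusters is what the Genetic Algorithm searches over).

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Minimal k-means: returns centroids and the cluster label of each point."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        labels = np.argmin(((X[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                centroids[j] = pts.mean(axis=0)
    return centroids, labels

class LayeredClusterEnsemble:
    """Each layer partitions the data into a different number of clusters and
    trains one base classifier per cluster (here: the cluster's majority class).
    A test pattern is classified by the nearest cluster's classifier at each
    layer, and the per-layer decisions are fused by majority voting."""

    def __init__(self, clusters_per_layer):
        self.clusters_per_layer = clusters_per_layer  # e.g. [2, 3, 5]

    def fit(self, X, y):
        self.layers = []
        global_majority = np.bincount(y).argmax()
        for k in self.clusters_per_layer:
            centroids, labels = kmeans(X, k)
            # "Train" one base classifier per cluster: its majority class.
            cluster_pred = np.array([
                np.bincount(y[labels == j]).argmax()
                if (labels == j).any() else global_majority
                for j in range(k)
            ])
            self.layers.append((centroids, cluster_pred))
        return self

    def predict(self, X):
        votes = []
        for centroids, cluster_pred in self.layers:
            # Route each test pattern to its nearest cluster in this layer.
            nearest = np.argmin(((X[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
            votes.append(cluster_pred[nearest])
        votes = np.stack(votes)  # shape: (n_layers, n_samples)
        # Majority vote across layers for each sample.
        return np.array([np.bincount(col).argmax() for col in votes.T])
```

A GA wrapper would then evolve `clusters_per_layer` (the chromosome) and score each candidate with one of the three objective functions — accuracy, diversity, or their product — on validation data.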