Ensemble techniques have been widely used to improve classification performance, including in GP-based systems. They improve classification accuracy by combining the responses of different classifiers through voting strategies. However, even reducing the number of classifiers in the ensemble, by selecting only those that are suitably "diverse" according to a given measure, gives no guarantee of significant improvements in either classification accuracy or generalization capacity. This paper presents a novel approach for combining GP-based ensembles by means of a Bayesian network. The proposed system learns and combines decision tree ensembles effectively using two strategies: in the first, decision tree ensembles are learned by means of a boosted GP algorithm; in the second, the responses of the ensemble are combined using a Bayesian network, which also implements a selection strategy to reduce the number of classifiers. Experiments on several data sets show that the approach achieves comparable or better accuracy than other methods proposed in the literature, while considerably reducing the number of classifiers used. In addition, a comparison with similar approaches confirmed the effectiveness of our method and its superiority over other diversity-based selection techniques.
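The baseline voting combination the abstract refers to can be sketched as follows (a minimal illustration of plurality voting over classifier responses, not the paper's Bayesian-network combiner; the classifier outputs below are made up):

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-classifier predictions for one sample by plurality vote."""
    return Counter(predictions).most_common(1)[0][0]

# Hypothetical responses of a 5-classifier ensemble for 3 samples.
ensemble_responses = [
    ["A", "A", "B", "A", "B"],  # sample 1: A wins 3-2
    ["B", "B", "B", "A", "A"],  # sample 2: B wins 3-2
    ["A", "B", "A", "A", "A"],  # sample 3: A wins 4-1
]
combined = [majority_vote(r) for r in ensemble_responses]
print(combined)  # ['A', 'B', 'A']
```

The proposed method replaces this uniform vote with a learned Bayesian network over the classifiers' outputs, which both weights their responses and selects a smaller subset of them.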