Adaptation in natural and artificial systems
Machine Learning
Neural network design
Optimal linear combinations of neural networks
Neural Networks
Ensemble learning via negative correlation
Neural Networks
How to solve it: modern heuristics
Foundations of Neural Networks, Fuzzy Systems, and Knowledge Engineering
Ensembling neural networks: many could be better than all
Artificial Intelligence
Ensemble Methods in Machine Learning
MCS '00 Proceedings of the First International Workshop on Multiple Classifier Systems
Dynamic Weighted Majority: A New Ensemble Method for Tracking Concept Drift
ICDM '03 Proceedings of the Third IEEE International Conference on Data Mining
Ensemble selection from libraries of models
ICML '04 Proceedings of the twenty-first international conference on Machine learning
Pruning in ordered bagging ensembles
ICML '06 Proceedings of the 23rd international conference on Machine learning
Managing Diversity in Regression Ensembles
The Journal of Machine Learning Research
Classifier ensemble selection using hybrid genetic algorithms
Pattern Recognition Letters
An Analysis of Ensemble Pruning Techniques Based on Ordered Aggregation
IEEE Transactions on Pattern Analysis and Machine Intelligence
Introduction to Genetic Algorithms
Neural network architecture selection: can function complexity help?
Neural Processing Letters
A new architecture selection method based on tabu search for artificial neural networks
Expert Systems with Applications: An International Journal
Margin distribution based bagging pruning
Neurocomputing
Evolutionary ensembles with negative correlation learning
IEEE Transactions on Evolutionary Computation
Learning Ensembles of Neural Networks by Means of a Bayesian Artificial Immune System
IEEE Transactions on Neural Networks
A genetic algorithm for designing neural network ensembles
Proceedings of the 14th annual conference on Genetic and evolutionary computation
In recent decades, ensemble learning has established itself as a valuable strategy within the computational intelligence and machine learning communities. Ensemble learning is a paradigm in which multiple models combine their decisions, their learning algorithms, or different subsets of the data to improve prediction performance, with the aim of increasing the generalization ability and reliability of the system. The key factors in ensemble systems are the diversity, training, and combination of ensemble members. Since there is no unified procedure that addresses all of these issues, this work proposes and compares Genetic Algorithm and Simulated Annealing based approaches for the automatic development of neural network ensembles for regression problems. The main contribution of this work is a set of optimization techniques that select the best subset of models to aggregate while accounting for all the key factors of ensemble systems (diversity, training of ensemble members, and the combination strategy). Experiments on two well-known data sets are reported to evaluate the effectiveness of the proposed methodologies. The results show that the proposed approaches outperform other methods, including Simple Bagging, Negative Correlation Learning (NCL), AdaBoost, and GASEN, in terms of generalization ability.
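As a rough illustration of the GA-based subset selection the abstract describes (a sketch under stated assumptions, not the paper's actual implementation), the snippet below evolves a binary membership mask over a pool of pre-trained regressors, using the validation MSE of the simple-average combination as the fitness. The random-Fourier-feature least-squares models are hypothetical stand-ins for the paper's neural networks, and the GA operators (truncation selection with elitism, one-point crossover, bit-flip mutation) are textbook choices, not the ones reported in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: noisy samples of a 1-D target, split train/validation.
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=200)
Xtr, Xva, ytr, yva = X[:150], X[150:], y[:150], y[150:]

def train_model(seed):
    """Least-squares fit on random Fourier features: a cheap stand-in
    for one trained neural network in the ensemble pool."""
    r = np.random.default_rng(seed)
    W = r.normal(size=(1, 20))
    b = r.uniform(0, 2 * np.pi, size=20)
    w = np.linalg.lstsq(np.cos(Xtr @ W + b), ytr, rcond=None)[0]
    return lambda Xq: np.cos(Xq @ W + b) @ w

models = [train_model(s) for s in range(12)]
val_preds = np.stack([m(Xva) for m in models])    # (n_models, n_val)

def fitness(mask):
    """Validation MSE of the simple average of the selected members."""
    if mask.sum() == 0:
        return np.inf                             # forbid the empty ensemble
    sub = val_preds[mask.astype(bool)].mean(axis=0)
    return float(np.mean((sub - yva) ** 2))

# Minimal generational GA over binary membership masks.
pop = rng.integers(0, 2, size=(30, len(models)))
for _ in range(40):
    order = np.argsort([fitness(ind) for ind in pop])
    elite = pop[order[:10]]                       # truncation selection + elitism
    children = []
    for _ in range(len(pop) - len(elite)):
        p1, p2 = elite[rng.integers(0, len(elite), size=2)]
        cut = rng.integers(1, len(models))        # one-point crossover
        child = np.concatenate([p1[:cut], p2[cut:]])
        flip = rng.random(len(models)) < 0.05     # bit-flip mutation
        children.append(np.where(flip, 1 - child, child))
    pop = np.vstack([elite, children])

best = pop[np.argmin([fitness(ind) for ind in pop])]
print("selected members:", np.flatnonzero(best))
print("subset MSE:", fitness(best),
      "| all-models MSE:", fitness(np.ones(len(models), dtype=int)))
```

The Simulated Annealing variant would explore the same mask space by flipping one bit at a time and accepting worse masks with a temperature-dependent probability; the diversity and combination-strategy factors enter through whatever the fitness function measures.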