Ensemble methods learn from examples by generating a set of hypotheses that are then combined into a single decision. We propose an algorithm for constructing an ensemble for regression estimation. Our proposal generates the hypotheses sequentially using a simple procedure in which the target map to be learned by the base learner at each step is modified as a function of the error of the previous step. We state a theorem relating an upper bound on the error of the composite hypothesis obtained by this procedure to the training errors of the individual hypotheses. We also show that the procedure yields a learning functional that enforces a weighted form of Negative Correlation with respect to the previous hypotheses. In addition, we incorporate resampling so that the ensemble can control the influence of highly leveraged data points, and we show that this component significantly improves its ability to generalize from the training examples. We describe experiments evaluating our technique on real and synthetic datasets with neural networks as base learners. The results show that our technique achieves considerably lower prediction errors than the Negative Correlation (NC) method and is very competitive with the Bagging and AdaBoost algorithms for regression estimation.
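The sequential construction described above can be sketched in a few lines. This is a minimal illustration only, with several assumptions: the target modification is taken to be plain residual fitting (the paper's exact learning functional, with its weighted Negative Correlation penalty, is not reproduced here), regression stumps stand in for the neural-network base learners, and bootstrap resampling plays the role of the influence-controlling resampling component.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression problem: a noisy sine wave.
X = rng.uniform(0.0, 2 * np.pi, size=200)
y = np.sin(X) + rng.normal(scale=0.1, size=200)

def fit_stump(x, t):
    """Fit a regression stump (best single-threshold piecewise-constant
    model) to targets t; a stand-in for the neural base learner."""
    best = None
    for s in np.quantile(x, np.linspace(0.05, 0.95, 19)):
        left, right = t[x <= s], t[x > s]
        if len(left) == 0 or len(right) == 0:
            continue
        ml, mr = left.mean(), right.mean()
        sse = ((left - ml) ** 2).sum() + ((right - mr) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, s, ml, mr)
    _, s, ml, mr = best
    return lambda xq: np.where(xq <= s, ml, mr)

# Sequential construction: each hypothesis is trained on a target
# modified by the previous step's error (here, the current residual),
# computed on a bootstrap resample of the data.
hypotheses = []
F = np.zeros_like(y)                 # running ensemble prediction
mse_first = None
for step in range(25):
    residual = y - F                 # error-modified target
    idx = rng.integers(0, len(X), len(X))  # bootstrap resample
    h = fit_stump(X[idx], residual[idx])
    hypotheses.append(h)
    F = F + h(X)                     # combine hypotheses additively
    if mse_first is None:
        mse_first = np.mean((y - F) ** 2)

mse_final = np.mean((y - F) ** 2)
print(mse_first, mse_final)          # training MSE shrinks as hypotheses accumulate
```

Under these assumptions the loop exhibits the qualitative behavior the abstract describes: the training error of the composite hypothesis decreases as hypotheses are added, and the bootstrap step limits the pull of any single influential point on a given base learner.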