Neural network ensembles are widely used for classification and regression problems as an alternative to isolated networks. In many applications, an ensemble has been shown to outperform any single network. In this paper we present a new approach to neural network ensembles that we call “cascade ensembles”. The approach is based on two ideas: (i) the ensemble is built constructively, and (ii) the output of each network is fed as an additional input to the subsequent networks, forming a cascade of networks. The method is compared with standard ensembles on several classification problems, with excellent performance.
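The two ideas above can be sketched in code. The following is a minimal illustration, not the paper's implementation: each "network" is reduced to a single logistic unit trained by gradient descent, members are added one at a time (constructive growth), and each new member receives the original features plus the outputs of all earlier members. How the paper combines member outputs into a final prediction is not specified in the abstract; here it is assumed, for illustration, that the last member in the cascade produces the ensemble output.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_member(X, y, epochs=500, lr=0.5, seed=0):
    # A single logistic unit stands in for one member network.
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)
        grad = p - y                      # gradient of log-loss w.r.t. logits
        w -= lr * (X.T @ grad) / len(y)
        b -= lr * grad.mean()
    return w, b

def build_cascade(X, y, n_members=3):
    """Constructively grow the cascade: each new member sees the
    original inputs plus the outputs of all previously trained members."""
    members = []
    feats = X
    for i in range(n_members):
        w, b = train_member(feats, y, seed=i)
        members.append((w, b))
        out = sigmoid(feats @ w + b)
        feats = np.hstack([feats, out[:, None]])  # widen inputs for the next member
    return members

def cascade_predict(members, X):
    feats = X
    out = None
    for w, b in members:
        out = sigmoid(feats @ w + b)
        feats = np.hstack([feats, out[:, None]])
    # Assumption: the last member's output is taken as the ensemble decision.
    return (out >= 0.5).astype(int)

if __name__ == "__main__":
    # Toy two-class problem (hypothetical data, for illustration only).
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(-1.0, size=(50, 2)),
                   rng.normal(+1.0, size=(50, 2))])
    y = np.array([0] * 50 + [1] * 50)
    members = build_cascade(X, y)
    print("training accuracy:", (cascade_predict(members, X) == y).mean())
```

Note the contrast with a standard ensemble: there, members are trained independently on the same inputs and their outputs are averaged or voted; here, the input dimensionality grows by one with every member added, which is what makes the structure a cascade.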