Graph-Based model-selection framework for large ensembles
HAIS'10 Proceedings of the 5th international conference on Hybrid Artificial Intelligence Systems - Volume Part I
Ensembles constitute one of the most prominent classes of hybrid prediction models. The basic assumption is that different models compensate for each other's errors if they are combined in an appropriate way. Often, a large number of diverse prediction models is available; however, many of them may share similar error characteristics, which strongly weakens the error-compensation effect. The selection of an appropriate subset of models is therefore crucial. In this paper, we address this issue. As our major contribution, for the case in which a large number of models is present, we propose a network-based framework for model selection that pays special attention to the interaction effects between models. Within this framework, we introduce four ensemble techniques and compare them to the state of the art in experiments on publicly available real-world data.
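To make the subset-selection idea concrete, the sketch below shows a simple greedy forward selection over candidate models: at each step, the model whose inclusion most reduces the validation error of the averaged ensemble is added. This is a generic baseline in the spirit of classical ensemble selection, not the graph-based framework proposed in the paper; all model names and toy data are illustrative assumptions.

```python
# Hedged sketch: greedy forward selection of an ensemble subset.
# This is a generic baseline, NOT the paper's network-based framework.
# The toy predictions below are invented for illustration only.

def mse(pred, truth):
    """Mean squared error between a prediction vector and the truth."""
    return sum((p - t) ** 2 for p, t in zip(pred, truth)) / len(truth)

def average(preds):
    """Element-wise mean of a list of prediction vectors."""
    return [sum(vals) / len(vals) for vals in zip(*preds)]

def greedy_select(model_preds, truth, max_size=None):
    """Greedily add the model whose inclusion most reduces the
    validation MSE of the averaged ensemble; stop when no model helps."""
    max_size = max_size or len(model_preds)
    selected, best_err = [], float("inf")
    while len(selected) < max_size:
        candidate, cand_err = None, best_err
        for i, preds in enumerate(model_preds):
            if i in selected:
                continue
            trial = [model_preds[j] for j in selected] + [preds]
            err = mse(average(trial), truth)
            if err < cand_err:
                candidate, cand_err = i, err
        if candidate is None:  # no remaining model improves the ensemble
            break
        selected.append(candidate)
        best_err = cand_err
    return selected, best_err

# Toy example: models 0 and 1 make mirror-image errors (they compensate),
# while model 2 duplicates model 0's error profile and adds nothing.
truth = [1.0, 2.0, 3.0, 4.0]
model_preds = [
    [1.2, 1.8, 3.2, 3.8],  # overshoots and undershoots alternately
    [0.8, 2.2, 2.8, 4.2],  # mirror-image errors of model 0
    [1.2, 1.8, 3.2, 3.8],  # redundant copy of model 0
]
selected, err = greedy_select(model_preds, truth)
print(selected, err)
```

Note how the redundant third model is never selected: because its errors coincide with those of an already-chosen model, adding it cannot improve error compensation — the same phenomenon that motivates accounting for interaction effects between models.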