Original Contribution: Stacked generalization
Neural Networks
Machine Learning
Data mining: practical machine learning tools and techniques with Java implementations
Using a Neural Network to Approximate an Ensemble of Classifiers
Neural Processing Letters
Machine Learning
Knowledge Acquisition from Examples Via Multiple Models
ICML '97 Proceedings of the Fourteenth International Conference on Machine Learning
Bayesian Averaging of Classifiers and the Overfitting Problem
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
Ensemble selection from libraries of models
ICML '04 Proceedings of the Twenty-First International Conference on Machine Learning
Naive Bayes models for probability estimation
ICML '05 Proceedings of the Twenty-Second International Conference on Machine Learning
Constructing diverse classifier ensembles using artificial training examples
IJCAI '03 Proceedings of the 18th International Joint Conference on Artificial Intelligence
Structure compilation: trading structure for features
Proceedings of the 25th International Conference on Machine Learning
Neighborhood-Based Local Sensitivity
ECML '07 Proceedings of the 18th European Conference on Machine Learning
Artificial neural network reduction through oracle learning
Intelligent Data Analysis
A method for evaluation of learning components
Automated Software Engineering
Often the best performing supervised learning models are ensembles of hundreds or thousands of base-level classifiers. Unfortunately, the space required to store this many classifiers, and the time required to execute them at run-time, prohibits their use in applications where test sets are large (e.g. Google), where storage space is at a premium (e.g. PDAs), and where computational power is limited (e.g. hearing aids). We present a method for "compressing" large, complex ensembles into smaller, faster models, usually without significant loss in performance.
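The compression idea the abstract describes can be sketched as follows: train a large ensemble on the labeled data, use it to label a pool of unlabeled examples, then train a small, fast model to mimic the ensemble on those pseudo-labeled points. This is a minimal illustration using scikit-learn and a synthetic dataset; the model choices, pool construction, and hyperparameters are assumptions for the sketch, not the paper's exact procedure (which generates synthetic unlabeled data).

```python
# Minimal sketch of ensemble "compression" (model compression).
# Assumptions: scikit-learn, a synthetic dataset standing in for real data,
# and a held-out split standing in for the large unlabeled pool.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, y_train = X[:1000], y[:1000]
X_unlabeled = X[1000:]  # stands in for the (much larger) unlabeled pool

# 1. Train the large, slow ensemble on the labeled data.
ensemble = RandomForestClassifier(n_estimators=200, random_state=0)
ensemble.fit(X_train, y_train)

# 2. Use the ensemble to label the unlabeled pool.
pseudo_labels = ensemble.predict(X_unlabeled)

# 3. Train a small, fast model to reproduce the ensemble's labels.
compressed = DecisionTreeClassifier(max_depth=8, random_state=0)
compressed.fit(X_unlabeled, pseudo_labels)

# Fidelity: how often the compressed model agrees with its teacher.
agreement = (compressed.predict(X_unlabeled) == pseudo_labels).mean()
```

At run-time only the single small tree is stored and executed, rather than the 200-tree ensemble, which is the storage/latency saving the abstract is after.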