Stacked generalization
Neural Networks
Machine Learning
Combining Nearest Neighbor Classifiers Through Multiple Feature Subsets
ICML '98 Proceedings of the Fifteenth International Conference on Machine Learning
Stochastic Attribute Selection Committees
AI '98 Selected papers from the 11th Australian Joint Conference on Artificial Intelligence on Advanced Topics in Artificial Intelligence
Classification and regression by combining models
Neural Computation
Solving multiclass learning problems via error-correcting output codes
Journal of Artificial Intelligence Research
Ensembles of Learning Machines
WIRN VIETRI 2002 Proceedings of the 13th Italian Workshop on Neural Nets-Revised Papers
Feature Ranking Ensembles for Facial Action Unit Classification
ANNPR '08 Proceedings of the 3rd IAPR workshop on Artificial Neural Networks in Pattern Recognition
Ensemble Approaches to Facial Action Unit Classification
CIARP '08 Proceedings of the 13th Iberoamerican congress on Pattern Recognition: Progress in Pattern Recognition, Image Analysis and Applications
A Wrapper Method for Feature Selection in Multiple Classes Datasets
IWANN '09 Proceedings of the 10th International Work-Conference on Artificial Neural Networks: Part I: Bio-Inspired Systems: Computational and Ambient Intelligence
MCS '09 Proceedings of the 8th International Workshop on Multiple Classifier Systems
Bootstrap feature selection for ensemble classifiers
ICDM'10 Proceedings of the 10th industrial conference on Advances in data mining: applications and theoretical aspects
Observations on boosting feature selection
MCS'05 Proceedings of the 6th international conference on Multiple Classifier Systems
Decoding visual brain states from fMRI using an ensemble of classifiers
Pattern Recognition
KES'05 Proceedings of the 9th international conference on Knowledge-Based Intelligent Information and Engineering Systems - Volume Part III
ReinSel: A class-based mechanism for feature selection in ensemble of classifiers
Applied Soft Computing
Random subspace method and genetic algorithm applied to a LS-SVM ensemble
ICANN'12 Proceedings of the 22nd international conference on Artificial Neural Networks and Machine Learning - Volume Part II
Hybrid random subsample classifier ensemble for high dimensional data sets
International Journal of Hybrid Intelligent Systems
Using an ensemble of classifiers instead of a single classifier has been shown to improve generalization performance in many machine learning problems [4, 16]. However, the extent of such improvement depends greatly on the amount of correlation among the errors of the base classifiers [1, 14]. Consequently, reducing those correlations while keeping the base classifiers' performance levels high is a promising research direction. In this paper, we describe input decimation, a method that decouples the base classifiers by training each of them on a different subset of the input features. In past work [15], we showed the theoretical benefits of input decimation and presented its application to a handful of real data sets. In this paper, we provide a systematic study of input decimation on synthetic data sets and analyze how the interaction between correlation and performance in base classifiers affects ensemble performance.
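The idea of training each base classifier on its own feature subset and combining their votes can be sketched as follows. This is a minimal illustrative sketch, not the authors' code: the helper names are hypothetical, the base learner is a simple nearest-centroid classifier, and the feature subsets are drawn at random for brevity (the paper's input decimation instead selects, for each base classifier, the features most correlated with the class labels).

```python
# Minimal sketch of a feature-subset ensemble (hypothetical helper names).
# Each base classifier sees only a subset of the input features, which tends
# to decorrelate the base classifiers' errors; predictions are combined by
# majority vote. NOTE: random subsets are used here for brevity; the paper's
# input decimation chooses features by their correlation with the class label.
import random

def train_centroid(X, y, feats):
    # Nearest-centroid base classifier restricted to the feature subset `feats`.
    by_class = {}
    for xi, yi in zip(X, y):
        by_class.setdefault(yi, []).append([xi[f] for f in feats])
    # One centroid (mean vector over the selected features) per class.
    return {c: [sum(col) / len(col) for col in zip(*rows)]
            for c, rows in by_class.items()}

def predict_centroid(centroids, x, feats):
    # Predict the class whose centroid is nearest in squared distance.
    xf = [x[f] for f in feats]
    return min(centroids,
               key=lambda c: sum((a - b) ** 2 for a, b in zip(centroids[c], xf)))

def train_ensemble(X, y, n_models=5, subset_size=2, seed=0):
    # Train n_models base classifiers, each on its own feature subset.
    rng = random.Random(seed)
    n_feats = len(X[0])
    models = []
    for _ in range(n_models):
        feats = rng.sample(range(n_feats), subset_size)
        models.append((train_centroid(X, y, feats), feats))
    return models

def predict_ensemble(models, x):
    # Combine the base classifiers' decisions by majority vote.
    votes = [predict_centroid(c, x, feats) for c, feats in models]
    return max(set(votes), key=votes.count)
```

For example, on a toy two-class problem with four features, `train_ensemble(X, y, n_models=3, subset_size=2)` builds three base classifiers, each seeing only two of the four features, and `predict_ensemble` takes their majority vote.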