Ensemble methods improve classification accuracy at the expense of testing complexity, which increases computational cost in real-world applications. Developing a utility-based framework, we construct two novel cost-conscious ensembles: the first determines a subset of classifiers, and the second dynamically selects a single classifier. Both ensembles successfully switch between classifiers according to the accuracy-cost trade-off of the application.
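The selection idea above can be illustrated with a minimal sketch. Everything here is an assumption for illustration, not the paper's actual method: a linear utility `accuracy - lam * cost`, per-classifier accuracy/cost estimates, and a greedy forward pass for subset construction are all stand-ins for whatever the utility-based framework actually uses.

```python
# Hypothetical sketch of cost-conscious classifier selection.
# Each classifier is described by a validation accuracy and a per-query
# test cost; "lam" is a user-chosen weight on cost (an assumed linear
# trade-off, not necessarily the paper's utility function).

def utility(acc, cost, lam):
    """Linear accuracy-cost trade-off (assumed form)."""
    return acc - lam * cost

def select_single(classifiers, lam):
    """Pick the one classifier with the highest utility."""
    return max(classifiers, key=lambda c: utility(c["acc"], c["cost"], lam))

def select_subset(classifiers, lam):
    """Greedy forward selection: keep adding the next-best classifier
    while the (roughly estimated) ensemble utility still improves.
    Subset accuracy is approximated by the mean member accuracy and
    cost is assumed additive -- crude stand-ins for a validation-set
    estimate of the combined model."""
    ranked = sorted(classifiers,
                    key=lambda c: utility(c["acc"], c["cost"], lam),
                    reverse=True)
    chosen, best = [], float("-inf")
    for c in ranked:
        trial = chosen + [c]
        acc = sum(m["acc"] for m in trial) / len(trial)
        cost = sum(m["cost"] for m in trial)
        u = utility(acc, cost, lam)
        if u <= best:
            break
        chosen, best = trial, u
    return chosen

# Toy pool of classifiers (made-up numbers).
pool = [
    {"name": "tree", "acc": 0.88, "cost": 1.0},
    {"name": "svm",  "acc": 0.93, "cost": 5.0},
    {"name": "knn",  "acc": 0.90, "cost": 3.0},
]

# When cost matters a lot, the cheap tree wins; when it barely
# matters, the accurate SVM wins -- the "switching" the abstract
# describes.
print(select_single(pool, lam=0.02)["name"])   # cost-sensitive regime
print(select_single(pool, lam=0.005)["name"])  # accuracy-dominated regime
```

The two regimes of `lam` show how a single utility knob makes the ensemble trade accuracy against test cost per application.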