The overproduce-and-choose strategy, which is divided into overproduction and selection phases, has traditionally focused on finding the most accurate subset of classifiers at the selection phase and using it to predict the class of all samples in the test data set; it is therefore a static classifier ensemble selection strategy. In this paper, we propose a dynamic overproduce-and-choose strategy that combines optimization and dynamic selection in a two-level selection phase, allowing the most confident subset of classifiers to be selected to label each test sample individually. The optimization level generates a population of highly accurate candidate classifier ensembles, while the dynamic selection level applies measures of confidence to reveal the candidate ensemble with the highest degree of confidence in the current decision. Experiments comparing the proposed method to a static overproduce-and-choose strategy and a classical dynamic classifier selection approach demonstrate that our method outperforms both of these selection-based methods, and also performs better than combining the decisions of all classifiers in the initial pool.
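The dynamic selection level described above can be illustrated with a minimal sketch. The sketch below is not the paper's actual method: it assumes the candidate ensembles from the optimization level are simply lists of trained classifiers, and it uses the majority-vote margin (fraction of classifiers agreeing on the winning label) as a stand-in confidence measure; the hypothetical helpers `majority_vote` and `dynamic_select_and_label` are named for illustration only.

```python
from collections import Counter

def majority_vote(ensemble, x):
    """Return (label, confidence): the majority label for sample x and the
    fraction of classifiers in the ensemble that voted for it."""
    votes = Counter(clf(x) for clf in ensemble)
    label, count = votes.most_common(1)[0]
    return label, count / len(ensemble)

def dynamic_select_and_label(candidate_ensembles, x):
    """Label one test sample with the candidate ensemble whose vote is most
    confident (assumed confidence measure: majority-vote agreement)."""
    label, _conf = max(
        (majority_vote(ens, x) for ens in candidate_ensembles),
        key=lambda lc: lc[1],
    )
    return label

# Toy pool of threshold classifiers on a scalar input (illustration only).
clf_a = lambda x: int(x > 0.3)
clf_b = lambda x: int(x > 0.5)
clf_c = lambda x: int(x > 0.7)

# Two candidate ensembles, standing in for the optimization level's output.
ensembles = [[clf_a, clf_b], [clf_a, clf_b, clf_c]]

print(dynamic_select_and_label(ensembles, 0.6))
```

For the sample `x = 0.6`, the two-classifier ensemble agrees unanimously on label 1 while the three-classifier ensemble splits 2-to-1, so the first (more confident) ensemble is chosen to label this particular sample.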