Dynamic classifier ensemble selection (DCES) plays a strategic role in the field of multiple classifier systems. Because real-world data often contain a large amount of noise, it is important to study the noise immunity of DCES strategies. This paper introduces the group method of data handling (GMDH) into DCES and proposes a novel dynamic classifier ensemble selection strategy, GDES-AD, which considers both accuracy and diversity during ensemble selection. We experimentally compare GDES-AD with six other ensemble strategies on 30 UCI data sets under three conditions: no artificial noise, class noise, and attribute noise. Statistical analysis shows that GDES-AD has stronger noise immunity than the other strategies. We also find that Random Subspace suits GDES-AD better than Bagging does. Finally, bias-variance decomposition of the classification errors of the various strategies shows that the stronger noise immunity of GDES-AD stems mainly from its greater ability to reduce the bias component of the error.
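The core idea of accuracy-and-diversity-driven dynamic selection can be illustrated with a minimal sketch. This is not the paper's GDES-AD (which uses GMDH); it is a generic DCES scheme in which, for each test point, candidate classifiers are scored by their accuracy on the k nearest validation points and by their disagreement (diversity) with the classifiers already chosen. All function names, the greedy selection, and the 0.7/0.3 weighting are assumptions for illustration only.

```python
# Illustrative DCES sketch: per-test-point greedy selection by
# local accuracy + pairwise disagreement, then majority voting.
import math
from collections import Counter

def knn_indices(x, X_val, k):
    """Indices of the k validation points nearest to x (Euclidean)."""
    d = [math.dist(x, xv) for xv in X_val]
    return sorted(range(len(X_val)), key=lambda i: d[i])[:k]

def local_accuracy(clf, idx, X_val, y_val):
    """Fraction of the k local validation points clf classifies correctly."""
    return sum(clf(X_val[i]) == y_val[i] for i in idx) / len(idx)

def disagreement(clf, chosen, idx, X_val):
    """Mean local disagreement between clf and the already-chosen classifiers."""
    if not chosen:
        return 0.0
    diffs = [clf(X_val[i]) != c(X_val[i]) for c in chosen for i in idx]
    return sum(diffs) / len(diffs)

def select_and_vote(x, pool, X_val, y_val, k=5, n_select=3, w_acc=0.7):
    """Greedily pick n_select classifiers maximizing
    w_acc * local accuracy + (1 - w_acc) * diversity, then vote on x."""
    idx = knn_indices(x, X_val, k)
    chosen, candidates = [], list(pool)
    while candidates and len(chosen) < n_select:
        best = max(candidates, key=lambda c:
                   w_acc * local_accuracy(c, idx, X_val, y_val)
                   + (1 - w_acc) * disagreement(c, chosen, idx, X_val))
        chosen.append(best)
        candidates.remove(best)
    return Counter(c(x) for c in chosen).most_common(1)[0][0]

# Toy usage: three threshold "classifiers" on 1-D points.
pool = [lambda p: int(p[0] > 0.3),
        lambda p: int(p[0] > 0.5),
        lambda p: int(p[0] > 2.0)]
X_val = [(0.1,), (0.2,), (0.4,), (0.6,), (0.8,), (0.9,)]
y_val = [0, 0, 0, 1, 1, 1]
print(select_and_vote((0.7,), pool, X_val, y_val))  # prints 1
```

Because the selection is recomputed from the local neighborhood of each test point, a classifier corrupted by noise in one region can still be chosen in regions where it remains locally accurate, which is the intuition behind the noise immunity studied in the abstract.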