Multiple classifier systems (MCSs), or simply classifier ensembles, which combine the outputs of a set of base classifiers, have recently emerged as a way to build more accurate classification systems. Constructing an ensemble raises two fundamental issues: first, how to create a set of base classifiers whose combination yields a successful ensemble; and second, how to combine the outputs of those base classifiers. This paper addresses the first issue, ensemble creation. It proposes a new method, named Combinational Classifiers using Heuristic Retraining (CCHR), whose main idea is the heuristic retraining of classifiers as a new way of generating diversity in the ensemble pool: a base classifier is trained first, and then, focusing on the drawbacks of this base classifier, further classifiers are retrained heuristically. Experimental results show that MCSs using the proposed method as the constructor of ensemble components outperform those using an alternative constructor in terms of testing accuracy.
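The abstract does not specify how "focusing on the drawbacks" of the first classifier is operationalized. The sketch below is a minimal illustration, not the paper's actual CCHR procedure: it assumes the base learners are one-dimensional decision stumps and models the heuristic retraining step by up-weighting the training examples the previous member misclassified (similar in spirit to boosting), so that each subsequent member is pushed to differ. All names (`Stump`, `build_ensemble`, `vote`) are illustrative.

```python
# Illustrative sketch only: sequential ensemble creation where each new
# member is trained with extra weight on the previous member's errors.
# This is an assumption about "heuristic retraining", not the published
# CCHR algorithm.

class Stump:
    """Threshold classifier on one scalar feature: sign * (+1 if x >= t else -1)."""

    def fit(self, xs, ys, weights):
        best = None  # (weighted error, threshold, sign)
        for t in sorted(set(xs)):
            for sign in (+1, -1):
                err = sum(w for x, y, w in zip(xs, ys, weights)
                          if (sign if x >= t else -sign) != y)
                if best is None or err < best[0]:
                    best = (err, t, sign)
        self.t, self.sign = best[1], best[2]
        return self

    def predict(self, x):
        return self.sign if x >= self.t else -self.sign


def build_ensemble(xs, ys, n_members=3):
    """Create a diverse pool: retrain each new member with the weights of
    the previous member's misclassified examples doubled."""
    weights = [1.0] * len(xs)
    members = []
    for _ in range(n_members):
        clf = Stump().fit(xs, ys, weights)
        members.append(clf)
        # "Heuristic retraining" step (assumed): emphasize the examples
        # the latest member got wrong so the next member differs.
        weights = [w * 2.0 if clf.predict(x) != y else w
                   for w, x, y in zip(weights, xs, ys)]
    return members


def vote(members, x):
    """Combine the pool by unweighted majority vote."""
    return 1 if sum(m.predict(x) for m in members) >= 0 else -1
```

On a non-separable toy set such as `xs = [1, 2, 3, 4, 5]`, `ys = [-1, 1, -1, 1, 1]`, the reweighting makes the second stump choose a different threshold from the first, giving the diversity in the pool that the combination step then exploits.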