The diversity of application domains in pattern recognition makes it difficult to find a single classification algorithm that is highly reliable across sufficiently interesting tasks. In this paper we propose a new combining method, which harnesses the local confidence of each classifier in the combining process. Our method lies at the confluence of the two main streams of combining multiple classifiers: classifier fusion and classifier selection. It learns the local confidence of each classifier from training data and, when an unknown sample is presented, uses this learned knowledge to evaluate the outputs of the individual classifiers. An empirical evaluation on five real data sets shows that the method achieves promising performance and outperforms both the best single classifiers and the other known combining methods we tried.
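To make the idea concrete, below is a minimal sketch of local-confidence weighting, assuming (as one plausible instantiation; the abstract does not specify how local confidence is learned) that a classifier's confidence at a query point is estimated as its accuracy over the k nearest training samples. The class name LocalConfidenceCombiner, the parameter k, and the weighted-vote rule are all illustrative assumptions, not the authors' actual implementation.

    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    class LocalConfidenceCombiner:
        """Hypothetical sketch: combine pre-trained classifiers by
        weighting each one with its estimated accuracy in the
        neighbourhood of the query point."""

        def __init__(self, classifiers, k=10):
            # classifiers are assumed to be already fitted
            self.classifiers = classifiers
            self.k = k

        def fit(self, X_train, y_train):
            # Index the training set for neighbour lookups.
            self.X_train = np.asarray(X_train)
            self.y_train = np.asarray(y_train)
            self.nn = NearestNeighbors(n_neighbors=self.k).fit(self.X_train)
            # Cache each classifier's correctness on every training point;
            # shape: (n_classifiers, n_train).
            self.correct = np.stack(
                [clf.predict(self.X_train) == self.y_train
                 for clf in self.classifiers])
            return self

        def predict(self, X):
            X = np.asarray(X)
            _, idx = self.nn.kneighbors(X)  # k neighbours of each query
            # Each classifier's prediction for each query;
            # shape: (n_classifiers, n_queries).
            preds = np.stack([clf.predict(X) for clf in self.classifiers])
            out = []
            for j in range(X.shape[0]):
                # Local confidence = fraction of the k neighbours each
                # classifier labelled correctly.
                conf = self.correct[:, idx[j]].mean(axis=1)
                # Confidence-weighted vote over the classifiers' outputs.
                votes = {}
                for c, p in zip(conf, preds[:, j]):
                    votes[p] = votes.get(p, 0.0) + c
                out.append(max(votes, key=votes.get))
            return np.array(out)

Note how this sketch reflects the "confluence" the abstract describes: when one classifier dominates the local confidences the weighted vote behaves like classifier selection, while near-uniform confidences reduce it to ordinary classifier fusion.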