Multiple Classifier Systems (MCSs), which combine the outputs of a set of base classifiers, have been proposed as a way to build more accurate classification systems. A fundamental issue is how to combine the base classifiers. In this paper, a new dynamic fusion method for MCSs, the Localized Generalization Error Model Fusion Method (LFM), is proposed. The Localized Generalization Error Model (L-GEM) is used to estimate the local competence of the base classifiers: it provides a generalization error bound for unseen samples located within neighborhoods of the testing samples, and base classifiers with lower bounds are assigned higher weights. In contrast to existing dynamic fusion methods, LFM estimates the local competence of base classifiers using not only the training error but also the sensitivity of the classifier outputs. The effect of this sensitivity term on model performance and the time complexity of LFM are analyzed. Experimental results show that MCSs using LFM as the combination method outperform those using 21 other dynamic fusion methods in terms of both testing accuracy and computation time.
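The core idea — weighting each base classifier inversely to its local generalization error bound and fusing the weighted outputs — can be sketched as below. This is a minimal illustration, not the paper's exact algorithm: the function name, the inverse-bound weighting, and the toy numbers are all assumptions, since the abstract does not specify how L-GEM bounds are converted into weights.

```python
import numpy as np

def lgem_weighted_fusion(prob_outputs, error_bounds, eps=1e-12):
    """Fuse base-classifier probability outputs with weights that
    decrease as a classifier's local generalization error bound grows.

    prob_outputs: (n_classifiers, n_classes) class-probability outputs
                  for one test sample
    error_bounds: (n_classifiers,) hypothetical L-GEM bounds estimated
                  over the test sample's neighborhood
    """
    bounds = np.asarray(error_bounds, dtype=float)
    # Inverse-bound weighting: one plausible choice, not necessarily
    # the scheme used in the paper.
    weights = 1.0 / (bounds + eps)
    weights /= weights.sum()          # normalize to a convex combination
    fused = weights @ np.asarray(prob_outputs, dtype=float)
    return fused, weights

# Toy example: three base classifiers on a 2-class problem.
probs = [[0.9, 0.1], [0.4, 0.6], [0.7, 0.3]]
bounds = [0.05, 0.30, 0.10]           # lower bound -> higher weight
fused, w = lgem_weighted_fusion(probs, bounds)
```

In this sketch the first classifier, having the smallest bound, dominates the fused decision; a static fusion rule would instead fix the weights once for all test samples, whereas here they change per neighborhood.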