A Method of Combining Multiple Experts for the Recognition of Unconstrained Handwritten Numerals
IEEE Transactions on Pattern Analysis and Machine Intelligence
Boosting a weak learning algorithm by majority
Information and Computation
Machine Learning
A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th annual ACM symposium on the theory of computing (STOC '94), May 23–25, 1994, and second annual European conference on computational learning theory (EuroCOLT '95), March 13–15, 1995
The Random Subspace Method for Constructing Decision Forests
IEEE Transactions on Pattern Analysis and Machine Intelligence
Improving OCR performance using character degradation models and boosting algorithm
Pattern Recognition Letters - special issue on pattern recognition in practice V
Machine Learning
Ensembling neural networks: many could be better than all
Artificial Intelligence
Using Correspondence Analysis to Combine Classifiers
Machine Learning
IEEE Transactions on Pattern Analysis and Machine Intelligence
Extracting symbolic rules from trained neural network ensembles
AI Communications - Special issue on Artificial intelligence advances in China
Pattern Classification (2nd Edition)
Combining Pattern Classifiers: Methods and Algorithms
A Theoretical and Experimental Analysis of Linear Combiners for Multiple Classifier Systems
IEEE Transactions on Pattern Analysis and Machine Intelligence
Rotation Forest: A New Classifier Ensemble Method
IEEE Transactions on Pattern Analysis and Machine Intelligence
Data Mining: Practical Machine Learning Tools and Techniques, Second Edition (Morgan Kaufmann Series in Data Management Systems)
Classification of seismic signals by integrating ensembles of neural networks
IEEE Transactions on Signal Processing
The combination of classifiers has long been proposed as a method to improve on the accuracy achieved in isolation by a single classifier. Most extant work focuses on how to generate a group of "good" base classifiers, as in AdaBoost and Bagging. We are instead interested in the method of combining multiple classifiers. In contrast to such commonly used methods as voting, we regard the classifier combination problem as a classification problem in its own right. From the perspective of pattern recognition, the base classifiers can themselves be viewed as a feature extraction step. In theory, any classifier can be used to combine the base classifiers, as long as it can process their outputs. More generally, the combination model can also handle other machine learning problems, including clustering and regression tasks; we call this the Learner Combination via Learner (LCL) model. A large empirical study shows that, compared with majority voting (Bagging) and weighted majority voting (AdaBoost), CCC significantly improves performance.
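The core idea of the abstract — treating the outputs of the base classifiers as features for a second-level learner — can be illustrated with a minimal sketch. The toy data, the threshold-rule base classifiers, and the table-based meta-learner below are illustrative assumptions for exposition, not the paper's actual algorithm.

```python
# Sketch of combining classifiers via a classifier: base-classifier
# outputs become the feature vector for a second-level learner.
# All rules and data here are toy assumptions, not the paper's method.
from collections import Counter

# Toy 1-D dataset: the true label is 1 when x >= 5.
X = list(range(10))
y = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]

# Three weak base classifiers (fixed threshold rules).
base_classifiers = [
    lambda x: int(x >= 3),
    lambda x: int(x >= 5),
    lambda x: int(x >= 7),
]

def meta_features(x):
    """Base-classifier outputs form the meta-level feature vector."""
    return tuple(clf(x) for clf in base_classifiers)

# "Train" a simple meta-learner: for each observed output pattern,
# remember the majority true label seen with it (a table learner).
table = {}
for x, label in zip(X, y):
    table.setdefault(meta_features(x), Counter())[label] += 1

def combined_predict(x):
    pattern = meta_features(x)
    return table[pattern].most_common(1)[0][0]

print([combined_predict(x) for x in X])  # reproduces y on this toy set
```

In this framing, any learner able to consume the base outputs could replace the lookup table as the combiner, which is what the abstract means by combining learners via a learner.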