In this paper, we propose a Single Classifier-based Multiple Classification Scheme (SMCS) that uses only one classifier to generate multiple classifications for a given test data point. The SMCS does not require multiple classifiers; instead, it generates diversity by creating pseudo test samples. This pseudo test sample generation mechanism allows the SMCS to adapt to dynamic environments without retraining multiple classifiers. Moreover, because multiple classifications are available, classification combination schemes such as majority voting can be applied, so the mechanism may improve the recognition rate in a manner similar to that of Multiple Classifier Systems (MCS). The experimental results confirm that the proposed SMCS is applicable to many classification systems. Even without parameter selection, the average performance of the SMCS is comparable to that of Bagging or Boosting. Moreover, the SMCS and traditional MCS schemes are not mutually exclusive: the SMCS can be applied alongside traditional MCS methods such as Bagging and Boosting.
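The core idea can be sketched as follows. This is a minimal illustration, not the authors' implementation: the abstract does not specify how pseudo test samples are generated, so Gaussian perturbation of the test point is an assumption here, and a simple 1-nearest-neighbor rule stands in for the single base classifier.

```python
import random
from collections import Counter

def classify(x, train):
    """Single base classifier (illustrative 1-NN): return the label of
    the training point nearest to x."""
    return min(train,
               key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))[1]

def smcs_predict(x, train, n_pseudo=20, noise=0.1, seed=0):
    """SMCS-style prediction: classify the test point plus several pseudo
    test samples (here, Gaussian perturbations of x -- an assumed
    generation mechanism), then combine the results by majority vote."""
    rng = random.Random(seed)
    votes = [classify(x, train)]          # classification of the original point
    for _ in range(n_pseudo):
        x_p = [a + rng.gauss(0, noise) for a in x]  # pseudo test sample
        votes.append(classify(x_p, train))
    return Counter(votes).most_common(1)[0][0]     # majority voting
```

Note that only one trained classifier is ever consulted; the ensemble-like diversity comes entirely from the perturbed copies of the test point, which is what lets the scheme coexist with Bagging or Boosting applied to the same base learner.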