A method for multiple-classifier selection and combination is presented. Classifiers are selected sequentially, on-line, according to a context-specific (data-driven) formulation of classifier optimality. Only a finite subset of a large (or infinite) set of classifiers is used for classification, yielding not only a computational saving but also a boost in classification performance. Experiments were carried out with single-class binary classifiers on multi-class classification problems. Classifier outputs are combined using a Bayesian approach, and the results show a significant improvement in classification accuracy over the AdaBoost.MH method.
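The abstract does not spell out the combination rule, but the idea of fusing single-class (one-vs-rest) binary classifier outputs into a multi-class decision can be sketched as follows. This is a minimal illustration, not the paper's exact method: it assumes each binary classifier emits a posterior estimate P(class i | x) and that the classifier outputs are conditionally independent, so the combined score for class i is p_i multiplied by (1 - p_j) for every other class j.

```python
import math

def combine_one_vs_rest(posteriors):
    """Combine one-vs-rest binary posteriors into multi-class probabilities.

    posteriors[i] is the i-th binary classifier's estimate of
    P(class i | x).  Under a (strong, illustrative) independence
    assumption, the score for class i is:

        p_i * prod_{j != i} (1 - p_j)

    Computed in the log domain for numerical stability, then
    normalized to sum to one.
    """
    eps = 1e-12  # guard against log(0)
    log_p = [math.log(max(p, eps)) for p in posteriors]
    log_q = [math.log(max(1.0 - p, eps)) for p in posteriors]
    total_q = sum(log_q)
    # For class i: log p_i + sum of log(1 - p_j) over all j != i
    log_scores = [lp + (total_q - lq) for lp, lq in zip(log_p, log_q)]
    m = max(log_scores)  # subtract max before exp to avoid underflow
    scores = [math.exp(s - m) for s in log_scores]
    z = sum(scores)
    return [s / z for s in scores]

# Three one-vs-rest classifiers, one per class:
probs = combine_one_vs_rest([0.9, 0.2, 0.6])
predicted = max(range(len(probs)), key=lambda i: probs[i])  # class 0
```

The log-domain product avoids underflow when many classifiers are combined; the naive-independence product rule used here is a standard baseline for classifier fusion and stands in for whatever Bayesian rule the paper actually derives.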