Overriding the Experts: A Stacking Method for Combining Marginal Classifiers
Proceedings of the Thirteenth International Florida Artificial Intelligence Research Society Conference
The use of multiple features by a classifier often reduces the probability of error, but designing an optimal Bayesian classifier for multiple features requires estimating multidimensional joint probability density functions, and the design sample size needed for this estimation in general grows exponentially with the number of dimensions. The classification method described in this paper makes decisions by combining the decisions of multiple marginal Bayesian classifiers using an additional classifier that estimates the joint probability densities of the decision space rather than those of the feature space. A proof is presented for the restricted case of two classes and two features, showing that the method's probability of error is always less than or equal to that of the marginal classifier with the lowest probability of error.
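The two-stage scheme the abstract describes can be illustrated with a minimal sketch. This is not the paper's implementation: the synthetic data, the use of per-class 1-D Gaussians as the marginal classifiers, and the 2-D Gaussian model over the decision space are all assumptions chosen for brevity. Each marginal classifier sees one feature and outputs a posterior; the second-stage classifier then models the joint density of those posteriors (the decision space) instead of the joint density of the raw features.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: two classes, two features (illustrative only).
n = 500
X = np.vstack([rng.normal(0.0, 1.0, size=(n, 2)),
               rng.normal(1.5, 1.0, size=(n, 2))])
y = np.array([0] * n + [1] * n)

def fit_gaussian_1d(x, y):
    """Fit per-class 1-D Gaussians: one marginal Bayesian classifier."""
    return {c: (x[y == c].mean(), x[y == c].std()) for c in (0, 1)}

def posterior_1d(params, x):
    """P(class = 1 | x) under equal priors."""
    like = np.array([np.exp(-0.5 * ((x - mu) / sd) ** 2) / sd
                     for mu, sd in (params[0], params[1])])
    return like[1] / like.sum(axis=0)

# Stage 1: one marginal classifier per feature; stack their posteriors
# to form the decision space Z.
marginals = [fit_gaussian_1d(X[:, j], y) for j in range(2)]
Z = np.column_stack([posterior_1d(m, X[:, j]) for j, m in enumerate(marginals)])

# Stage 2: estimate per-class joint densities over the decision space
# (here modeled as 2-D Gaussians, a simplifying assumption).
means = {c: Z[y == c].mean(axis=0) for c in (0, 1)}
covs = {c: np.cov(Z[y == c].T) + 1e-6 * np.eye(2) for c in (0, 1)}

def density(z, mu, cov):
    """Unnormalized 2-D Gaussian density at each row of z."""
    d = z - mu
    inv = np.linalg.inv(cov)
    mahal = np.einsum('ij,jk,ik->i', d, inv, d)
    return np.exp(-0.5 * mahal) / np.sqrt(np.linalg.det(cov))

# Combined decision: compare class-conditional densities in decision space.
pred_combined = (density(Z, means[1], covs[1]) >
                 density(Z, means[0], covs[0])).astype(int)
pred_marginals = [(Z[:, j] > 0.5).astype(int) for j in range(2)]

acc = lambda p: (p == y).mean()
print('marginal accuracies:', [round(acc(p), 3) for p in pred_marginals])
print('combined accuracy:  ', round(acc(pred_combined), 3))
```

On well-separated synthetic data like this, the combined classifier's training-set accuracy is typically at least as high as either marginal classifier's, consistent with the bound the paper proves; the second stage only needs to estimate a density over the low-dimensional decision space rather than the full feature space.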