Introduction to statistical pattern recognition (2nd ed.)
An introduction to signal detection and estimation (2nd ed.)
Arbitrarily Tight Upper and Lower Bounds on the Bayesian Probability of Error. IEEE Transactions on Pattern Analysis and Machine Intelligence.
A Tight Upper Bound on the Bayesian Probability of Error. IEEE Transactions on Pattern Analysis and Machine Intelligence.
On a New Class of Bounds on Bayes Risk in Multihypothesis Pattern Recognition. IEEE Transactions on Computers.
On information and distance measures, error bounds, and feature selection. Information Sciences: An International Journal.
Distribution of noncentral indefinite quadratic forms in complex normal variables. IEEE Transactions on Information Theory.
Probability of error, equivocation, and the Chernoff bound. IEEE Transactions on Information Theory.
An optimal sensing framework based on spatial RSS-profile in cognitive radio networks. SECON'09 Proceedings of the 6th Annual IEEE Communications Society Conference on Sensor, Mesh and Ad Hoc Communications and Networks.
It is well known that the error probability of the binary Gaussian classification problem with different class covariance matrices cannot, in general, be evaluated exactly because no closed-form expression exists. This points to the need for a tight upper bound on the error probability. The problem has been studied for more than 50 years and is still of interest, yet none of the derived upper bounds is free of flaws: they may be loose, computationally inefficient (particularly in high-dimensional settings), or excessively time consuming when a high degree of accuracy is desired. In this paper, a new technique is developed to estimate a tight upper bound for the error probability of the well-known binary Gaussian classification problem with different covariance matrices. The basic idea is to replace the optimal Bayes decision boundary with suboptimal boundaries that yield an easy-to-calculate upper bound on the error probability. In particular, three types of decision boundaries are investigated: planes, elliptic cylinders, and cones. The new decision boundaries are selected so as to provide the tightest possible upper bound. The proposed technique is found to give an upper bound tighter than many of the commonly used bounds, such as the Chernoff bound [H. Chernoff, A measure of asymptotic efficiency for tests of a hypothesis based on the sum of observations, Ann. Math. Statist. 23 (1952) 493-507] and the Bayesian-distance bound. In addition, the computation time of the proposed bound is much less than that required by the Monte-Carlo simulation technique. When applied to real-world classification problems obtained from the UCI repository, the proposed bound was found to provide a tight bound on the analytical error probability of the quadratic discriminant analysis (QDA) classifier and a good approximation to its empirical error probability.
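As a rough illustration of the quantities the abstract compares (not the paper's own bound), the sketch below computes the classical Chernoff upper bound on the Bayes error for two Gaussian classes with different covariance matrices, and contrasts it with a Monte-Carlo estimate of the QDA (Bayes, equal-priors) error probability. The means, covariances, grid search over s, and function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def chernoff_bound(mu1, S1, mu2, S2, p1=0.5):
    """Chernoff upper bound on the Bayes error for two Gaussian classes,
    minimized over s in (0, 1) by a simple grid search (illustrative choice)."""
    dmu = mu2 - mu1
    best = np.inf
    for s in np.linspace(0.01, 0.99, 99):
        S = s * S1 + (1 - s) * S2
        # Chernoff distance k(s) between N(mu1, S1) and N(mu2, S2)
        k = (s * (1 - s) / 2) * dmu @ np.linalg.solve(S, dmu) \
            + 0.5 * np.log(np.linalg.det(S)
                           / (np.linalg.det(S1) ** s * np.linalg.det(S2) ** (1 - s)))
        best = min(best, p1 ** s * (1 - p1) ** (1 - s) * np.exp(-k))
    return best

def qda_mc_error(mu1, S1, mu2, S2, n=200_000, seed=0):
    """Monte-Carlo estimate of the QDA error probability (equal priors,
    true parameters known, so QDA is the Bayes classifier here)."""
    rng = np.random.default_rng(seed)
    P1, P2 = np.linalg.inv(S1), np.linalg.inv(S2)
    ld1, ld2 = np.log(np.linalg.det(S1)), np.log(np.linalg.det(S2))

    def loglik(x, mu, P, ld):  # log-density up to a shared additive constant
        d = x - mu
        return -0.5 * (np.einsum('ij,jk,ik->i', d, P, d) + ld)

    x1 = rng.multivariate_normal(mu1, S1, n)
    x2 = rng.multivariate_normal(mu2, S2, n)
    err1 = np.mean(loglik(x1, mu1, P1, ld1) < loglik(x1, mu2, P2, ld2))
    err2 = np.mean(loglik(x2, mu2, P2, ld2) < loglik(x2, mu1, P1, ld1))
    return 0.5 * (err1 + err2)

# Hypothetical two-class problem with unequal covariance matrices.
mu1, S1 = np.array([0.0, 0.0]), np.array([[1.0, 0.3], [0.3, 1.0]])
mu2, S2 = np.array([2.0, 1.0]), np.array([[2.0, -0.4], [-0.4, 0.5]])
bound = chernoff_bound(mu1, S1, mu2, S2)
mc = qda_mc_error(mu1, S1, mu2, S2)
print(f"Chernoff bound: {bound:.4f}, Monte-Carlo Bayes error: {mc:.4f}")
```

The Monte-Carlo estimate should fall below the Chernoff bound, mirroring the abstract's point that the bound, while cheap relative to exhaustive simulation, can be loose.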