The quality, in terms of bias and variance, of estimates of the Bhattacharyya coefficient computed from n training samples drawn from two classes described by multivariate Gaussian distributions is considered. Both the case where the classes share a common covariance matrix and the case where each class has its own covariance matrix are analyzed. Expressions for the bias and the variance of the estimates of the Bhattacharyya coefficient are derived, and numerical examples are used to show the relationship between these quantities, the number of training samples, and the dimensionality of the observation space.
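The plug-in estimator studied here can be sketched as follows: sample means and covariances replace the true Gaussian parameters in the closed-form expression for the Bhattacharyya distance, D = (1/8)(μ₁−μ₂)ᵀΣ⁻¹(μ₁−μ₂) + (1/2)ln(|Σ|/√(|Σ₁||Σ₂|)) with Σ = (Σ₁+Σ₂)/2, and the coefficient is ρ = exp(−D). The function name and NumPy-based implementation below are illustrative, not the authors' code:

```python
import numpy as np

def bhattacharyya_coefficient(x1, x2):
    """Plug-in estimate of the Bhattacharyya coefficient between two
    classes, assuming each class is multivariate Gaussian: sample means
    and covariances are substituted into the closed-form Gaussian
    expression for the Bhattacharyya distance D, and rho = exp(-D).

    x1, x2 : arrays of shape (n_samples, n_features), one row per sample.
    """
    mu1, mu2 = x1.mean(axis=0), x2.mean(axis=0)
    s1 = np.cov(x1, rowvar=False)   # per-class sample covariances
    s2 = np.cov(x2, rowvar=False)
    s = 0.5 * (s1 + s2)             # pooled (average) covariance
    diff = mu1 - mu2

    # Mahalanobis-type term: (1/8) d' S^{-1} d
    term1 = 0.125 * diff @ np.linalg.solve(s, diff)

    # Covariance-mismatch term: (1/2) ln( |S| / sqrt(|S1| |S2|) ),
    # computed via log-determinants for numerical stability.
    _, logdet_s = np.linalg.slogdet(s)
    _, logdet_s1 = np.linalg.slogdet(s1)
    _, logdet_s2 = np.linalg.slogdet(s2)
    term2 = 0.5 * (logdet_s - 0.5 * (logdet_s1 + logdet_s2))

    return np.exp(-(term1 + term2))
```

Repeating this estimate over many independent draws of n samples (for varying n and feature dimensionality) and comparing the mean and spread of the estimates against the true coefficient reproduces the kind of bias/variance study the abstract describes: both terms of D depend nonlinearly on the sample covariances, so the estimator is biased even though the underlying parameter estimates are not.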