On a New Class of Bounds on Bayes Risk in Multihypothesis Pattern Recognition
IEEE Transactions on Computers
Note on discrimination information and variation (Corresp.)
IEEE Transactions on Information Theory
Non-iterative Heteroscedastic Linear Dimension Reduction for Two-Class Data
Proceedings of the Joint IAPR International Workshop on Structural, Syntactic, and Statistical Pattern Recognition
A discriminant analysis using composite features for classification problems
Pattern Recognition
Image thresholding based on Ali-Silvey distance measures
Pattern Recognition
Pattern classification using composite features
Proceedings of the 16th International Conference on Artificial Neural Networks (ICANN'06), Part II
Generalized error bounds in pattern recognition
Pattern Recognition Letters
Exploiting Fisher and Fukunaga-Koontz transforms in Chernoff dimensionality reduction
ACM Transactions on Knowledge Discovery from Data (TKDD)
Information and distance measures are useful for bounding the probability of error, especially when the exact error probability is unavailable or too difficult to compute. They are also useful for feature selection and ordering in pattern recognition, so that the error probability is indirectly minimized. This paper provides a fairly complete list of information and distance measures, including a new average conditional cubic entropy proposed by the author. Major problem areas, such as computation with these measures, are examined, and approaches to the unresolved problems are suggested. Error bounds for feature subset selection and for one-dimensional Laplacian and Gaussian densities are also considered.
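As an illustration of how a distance measure yields an error bound of the kind the abstract describes, the sketch below computes the standard Bhattacharyya bound on the Bayes error for two one-dimensional Gaussian densities. This is a generic example, not the paper's own derivation; the function names are the author's of this sketch, and only the well-known relations B = (1/4)(mu1-mu2)^2/(v1+v2) + (1/2)ln((v1+v2)/(2*sqrt(v1*v2))) and P_e <= sqrt(p1*p2)*exp(-B) are assumed.

```python
import math


def bhattacharyya_distance_gauss(mu1, var1, mu2, var2):
    """Bhattacharyya distance between two 1-D Gaussian densities."""
    return (0.25 * (mu1 - mu2) ** 2 / (var1 + var2)
            + 0.5 * math.log((var1 + var2) / (2.0 * math.sqrt(var1 * var2))))


def bhattacharyya_error_bound(p1, mu1, var1, mu2, var2):
    """Upper bound on the Bayes error: P_e <= sqrt(p1*p2) * exp(-B)."""
    p2 = 1.0 - p1
    b = bhattacharyya_distance_gauss(mu1, var1, mu2, var2)
    return math.sqrt(p1 * p2) * math.exp(-b)


def exact_bayes_error_equal_var(mu1, mu2, sigma):
    """Exact Bayes error for equal priors and equal variances:
    P_e = Phi(-|mu1 - mu2| / (2*sigma))."""
    phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return phi(-abs(mu1 - mu2) / (2.0 * sigma))


# Example: mu1 = 0, mu2 = 2, unit variances, equal priors.
bound = bhattacharyya_error_bound(0.5, 0.0, 1.0, 2.0, 1.0)
exact = exact_bayes_error_equal_var(0.0, 2.0, 1.0)
# The bound is loose but never below the exact Bayes error.
assert exact <= bound
```

In this equal-variance case the bound evaluates to 0.5*exp(-0.5), roughly 0.303, against an exact Bayes error of about 0.159, showing the typical looseness that motivates comparing several distance measures.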