Fano's inequality is an important result in Shannon's information theory and has found application in numerous convergence proofs. It also provides a lower bound on the symbol error probability of a communication channel in terms of Shannon's definitions of entropy and mutual information. The result is significant for classification as well, since it suggests how performance is influenced by the amount of information transferred through the classifier. We previously extended Fano's lower bound on the probability of error to a family of lower and upper bounds based on Renyi's definitions of entropy and mutual information. Despite their theoretical appeal, however, those bounds were practically incomputable. In this paper, we present modifications to these bounds that make them usable in practical situations. The significance of the modified bounds is threefold: they illustrate a theoretical use of Renyi's definition of information, they extend Fano's result to an upper bound on the probability of classification error, and they provide insight into how the information transfer through a classifier affects its performance. The performance of the modified bounds is investigated in various numerical examples, including applications to digital communication channels, designed to highlight the main conclusions.
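For context, the classical form of Fano's inequality that this work generalizes bounds the residual uncertainty about a channel input X given the output Y (a standard information-theoretic statement, not quoted from the paper):

    H(X|Y) \le h_b(P_e) + P_e \log_2(M - 1),

where M is the input alphabet size, P_e the probability of error, and h_b the binary entropy function. Since H(X|Y) = H(X) - I(X;Y), larger mutual information through the channel or classifier lowers the floor this inequality places on P_e. Below is a minimal numerical sketch of inverting this classical bound by bisection, assuming a discrete channel given by an input prior p(x) and a transition matrix p(y|x); the function and variable names are illustrative, not taken from the paper.

import numpy as np

def binary_entropy(p):
    # h_b(p) in bits, with h_b(0) = h_b(1) = 0 by convention.
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * np.log2(p) - (1.0 - p) * np.log2(1.0 - p)

def conditional_entropy(prior, channel):
    # H(X|Y) in bits for input prior p(x) and channel matrix p(y|x).
    joint = prior[:, None] * channel            # p(x, y)
    p_y = joint.sum(axis=0)                     # p(y)
    with np.errstate(divide="ignore", invalid="ignore"):
        post = joint / p_y                      # p(x|y)
        terms = np.where(joint > 0, -joint * np.log2(post), 0.0)
    return terms.sum()

def fano_lower_bound(prior, channel, tol=1e-10):
    # Smallest P_e consistent with h_b(P_e) + P_e*log2(M-1) >= H(X|Y).
    # The left-hand side is increasing on [0, (M-1)/M], so bisect there.
    M = len(prior)
    target = conditional_entropy(prior, channel)
    lo, hi = 0.0, (M - 1) / M
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if binary_entropy(mid) + mid * np.log2(M - 1) < target:
            lo = mid
        else:
            hi = mid
    return lo

# Example: 4-ary symmetric channel with crossover probability 0.1.
# Fano's bound is tight for this channel, so the result is close to 0.1.
M, eps = 4, 0.1
channel = np.full((M, M), eps / (M - 1))
np.fill_diagonal(channel, 1.0 - eps)
prior = np.full(M, 1.0 / M)
print(fano_lower_bound(prior, channel))

The Renyi-entropy bounds developed in the paper generalize this construction; the sketch above covers only the classical Shannon case.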