Lower and Upper Bounds for Misclassification Probability Based on Renyi's Information

  • Authors:
  • Deniz Erdogmus; Jose C. Principe

  • Affiliations:
  • Computational NeuroEngineering Laboratory, NEB 454, University of Florida, Gainesville, FL 32611, USA (both authors)

  • Venue:
  • Journal of VLSI Signal Processing Systems
  • Year:
  • 2004

Abstract

Fano's inequality is an important result of Shannon's information theory that has found application in numerous convergence proofs. It also provides a lower bound on the symbol error probability in a communication channel, expressed in terms of Shannon's definitions of entropy and mutual information. The result is significant in that it suggests how classification performance is influenced by the amount of information transferred through the classifier. We have previously extended Fano's lower bound on the probability of error to a family of lower and upper bounds based on Renyi's definitions of entropy and mutual information. Despite their theoretical appeal, however, these bounds were practically incomputable. In this paper, we present modifications to these bounds that allow them to be used in practical situations. The significance of the new bounds is threefold: they illustrate a theoretical use of Renyi's definition of information, they extend Fano's result with an upper bound on the probability of classification error, and they provide insight into how information transfer through a classifier affects its performance. The performance of the modified bounds is investigated in several numerical examples, including applications to digital communication channels, designed to highlight the main conclusions.
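
As background for the abstract (standard textbook statements, not quoted from the paper itself): for an M-ary symbol X and a decision Y, Fano's inequality reads

    h_b(P_e) + P_e \log_2(M - 1) \;\ge\; H(X \mid Y),

which, using h_b(P_e) \le 1 bit and H(X \mid Y) = H(X) - I(X; Y), yields the lower bound

    P_e \;\ge\; \frac{H(X) - I(X; Y) - 1}{\log_2(M - 1)}, \qquad M > 2.

Renyi's entropy of order \alpha, on which the paper's generalized bounds are built, is defined as

    H_\alpha(X) = \frac{1}{1 - \alpha} \log_2 \sum_k p_k^\alpha, \qquad \alpha > 0,\ \alpha \ne 1,

and recovers Shannon's entropy in the limit \alpha \to 1.

The following minimal Python sketch evaluates the classical (Shannon-based) Fano lower bound for a toy M-ary symmetric channel. It does not reproduce the modified Renyi bounds introduced in the paper; the channel, error rate, and function names are illustrative assumptions.

    import numpy as np

    def shannon_entropy(p):
        """Shannon entropy in bits; zero-probability entries are skipped."""
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def renyi_entropy(p, alpha):
        """Renyi entropy of order alpha (alpha > 0, alpha != 1), in bits."""
        return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

    def fano_lower_bound(joint):
        """Classical Fano lower bound P_e >= (H(X|Y) - 1) / log2(M - 1),
        for M > 2, computed from a joint pmf P(X, Y) given as an array."""
        px, py = joint.sum(axis=1), joint.sum(axis=0)
        h_x = shannon_entropy(px)
        mi = h_x + shannon_entropy(py) - shannon_entropy(joint.ravel())  # I(X;Y)
        M = joint.shape[0]
        return max(0.0, (h_x - mi - 1.0) / np.log2(M - 1))

    # Toy example: 4-ary symmetric channel with symbol error rate eps and a
    # uniform input distribution (eps and M are illustrative choices).
    M, eps = 4, 0.4
    channel = np.full((M, M), eps / (M - 1))   # P(Y=j | X=i) for i != j
    np.fill_diagonal(channel, 1.0 - eps)
    joint = channel / M                        # joint pmf with P(X=i) = 1/M

    print("Renyi entropy H_2(X):   ", renyi_entropy(joint.sum(axis=1), 2.0))
    print("Fano lower bound on P_e:", fano_lower_bound(joint))
    # For this channel the MAP decision achieves P_e = eps = 0.4, and the
    # Fano bound evaluates to about 0.38, consistent with the inequality.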