On information and distance measures, error bounds, and feature selection

  • Authors:
  • C. H. Chen

  • Venue:
  • Information Sciences: an International Journal
  • Year:
  • 1976

Abstract

Information and distance measures are useful for bounding the probability of error, especially when the exact error probability is unavailable or too difficult to compute. They are also useful for feature selection and ordering in pattern recognition, where they serve to minimize the error probability indirectly. This paper provides a fairly complete list of information and distance measures, including a new average conditional cubic entropy proposed by the author. Major problem areas, such as computation with these measures, are examined, and approaches to the unresolved problems are suggested. Error bounds on feature subset selection and on one-dimensional Laplacian and Gaussian densities are also considered.
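
As an illustration of the kind of bound the abstract describes (a sketch, not taken from the paper itself): the Python snippet below computes the Bhattacharyya distance between two one-dimensional Gaussian class-conditional densities and the resulting upper bound on the two-class Bayes error, P_e <= sqrt(P1*P2)*exp(-B). The function names and numerical parameters are illustrative assumptions, and the Bhattacharyya measure stands in for whichever of the surveyed measures the paper actually applies.

    import math

    def bhattacharyya_gaussian(mu1, var1, mu2, var2):
        # Bhattacharyya distance B between two 1-D Gaussian densities:
        # B = (mu1 - mu2)^2 / (4 (var1 + var2))
        #     + 0.5 ln((var1 + var2) / (2 sqrt(var1 var2)))
        return ((mu1 - mu2) ** 2 / (4.0 * (var1 + var2))
                + 0.5 * math.log((var1 + var2) / (2.0 * math.sqrt(var1 * var2))))

    def bayes_error_upper_bound(b, p1=0.5):
        # Two-class Bhattacharyya bound: P_e <= sqrt(p1 p2) exp(-B)
        p2 = 1.0 - p1
        return math.sqrt(p1 * p2) * math.exp(-b)

    # Illustrative parameters (not from the paper): equal priors,
    # N(0, 1) versus N(2, 1.5) class-conditional densities.
    b = bhattacharyya_gaussian(0.0, 1.0, 2.0, 1.5)
    print("Bhattacharyya distance:", round(b, 4))
    print("Error upper bound:", round(bayes_error_upper_bound(b), 4))

Ranking candidate features by such a distance, largest first, is one way a measure of this kind supports the feature ordering the abstract mentions: a larger distance gives a smaller error bound, so the error probability is minimized indirectly.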