Is Entropy Suitable to Characterize Data and Signals for Cognitive Informatics?

  • Authors: W. Kinsner
  • Affiliations: University of Manitoba
  • Venue: ICCI '04, Proceedings of the Third IEEE International Conference on Cognitive Informatics
  • Year: 2004

Abstract

This paper provides a review of Shannon and other entropy measures for evaluating the quality of materials used in perception, cognition and learning processes. Energy-based metrics are not suitable for cognition, as energy itself does not carry information. Instead, morphological (structural and contextual) as well as entropy-based metrics should be considered in cognitive informatics. The data- and signal-transformation processes are defined and discussed in a perceptual framework, followed by various classes of information and entropies suitable for characterizing data, signals and distortion. Other entropies are also described, including the Rényi generalized entropy spectrum, the Kolmogorov complexity measure, the Kolmogorov-Sinai entropy, and the Prigogine entropy for evolutionary dynamical systems. Although such entropy-based measures are suitable for many signals, they are not sufficient for scale-invariant (fractal and multifractal) signals without complementary measures.
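As a point of reference for the measures named in the abstract, the sketch below computes the textbook Shannon entropy H = -Σ p_i log p_i and the Rényi entropy of order q, H_q = (1/(1-q)) log Σ p_i^q, which reduces to the Shannon entropy as q → 1. This is a minimal illustration of a Rényi entropy "spectrum" over several orders q, not the paper's own formulation or data; the function names and example distributions are illustrative only.

```python
import numpy as np

def shannon_entropy(p, base=2.0):
    """Shannon entropy H = -sum(p * log p); zero-probability bins are ignored."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(base)

def renyi_entropy(p, q, base=2.0):
    """Renyi entropy of order q; reduces to the Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return shannon_entropy(p, base)
    return np.log(np.sum(p ** q)) / ((1.0 - q) * np.log(base))

# Example: entropy spectrum of a skewed vs. a uniform 4-bin distribution.
# The uniform case stays at log2(4) = 2 bits for every order q, while the
# skewed case decreases with q, which is what the spectrum is meant to expose.
p_skewed = np.array([0.5, 0.25, 0.125, 0.125])
p_uniform = np.full(4, 0.25)
for q in (0.0, 1.0, 2.0, 5.0):
    print(f"q={q}: skewed={renyi_entropy(p_skewed, q):.3f} bits, "
          f"uniform={renyi_entropy(p_uniform, q):.3f} bits")
```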