Possibilistic information theory: a coding theoretic approach

  • Authors: Andrea Sgarro
  • Affiliation: Department of Mathematical Sciences (DSM), University of Trieste, 34100 Trieste, Italy
  • Venue: Fuzzy Sets and Systems - Possibility theory and fuzzy logic
  • Year: 2002


Abstract

We define information measures that pertain to possibility theory and have a coding-theoretic meaning. We put forward a model for information sources and transmission channels which is possibilistic rather than probabilistic. For source coding without distortion we define a notion of possibilistic entropy, which is connected to the so-called Hartley measure; we also tackle source coding with distortion. For channel coding we define a notion of possibilistic capacity, which is connected to a combinatorial notion called graph capacity. In the probabilistic case the Hartley measure and graph capacity are relevant quantities only when the allowed decoding error probability is strictly equal to zero, whereas in the possibilistic case they are relevant for any value of the allowed decoding error possibility: as the allowed error possibility becomes larger, the possibilistic entropy decreases (one can reliably compress data to smaller sizes), while the possibilistic capacity increases (one can reliably transmit data at a higher rate). We put forward an interpretation of possibilistic coding based on distortion measures, and we discuss an application in which possibilities are used to cope with the uncertainty induced by a "vague" linguistic description of the transmission channel.
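To make the monotonicity claim concrete, the following is a minimal sketch (not the paper's exact definitions) assuming the possibilistic entropy at allowed error possibility eps is taken as the Hartley measure, log2 of the cardinality, of the set of source letters whose possibility strictly exceeds eps. The distribution `pi` and the threshold values are illustrative assumptions.

```python
import math

def hartley_measure(n):
    """Hartley measure of a finite set with n elements: log2(n) bits."""
    return math.log2(n) if n > 0 else 0.0

def possibilistic_entropy(poss, eps):
    """Sketch: entropy at allowed error possibility eps, taken as the
    Hartley measure of the strict eps-cut of the possibility
    distribution (letters with possibility > eps) -- an assumption
    made for illustration, not the paper's formal definition."""
    support = [x for x, p in poss.items() if p > eps]
    return hartley_measure(len(support))

# Toy possibility distribution over four source letters (hypothetical)
pi = {"a": 1.0, "b": 0.8, "c": 0.5, "d": 0.2}

# Raising the allowed error possibility shrinks the cut, so the
# entropy (compressed size in bits) decreases.
for eps in (0.0, 0.3, 0.6):
    print(f"eps={eps}: entropy={possibilistic_entropy(pi, eps):.3f} bits")
```

With these toy numbers the entropy drops from 2 bits (all four letters) to log2(3) bits and then to 1 bit as eps grows, mirroring the abstract's statement that a larger allowed error possibility permits compression to smaller sizes.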