We define information measures that pertain to possibility theory and have a coding-theoretic meaning, and we put forward a model for information sources and transmission channels that is possibilistic rather than probabilistic. For source coding without distortion we define a notion of possibilistic entropy, which is connected to Hartley's measure; we also tackle source coding with distortion. For channel coding we define a notion of possibilistic capacity, which is connected to the combinatorial notion of graph capacity. In the probabilistic case, Hartley's measure and graph capacity are relevant only when the allowed decoding-error probability is strictly zero, whereas in the possibilistic case they are relevant for every value of the allowed decoding-error possibility: as the allowed error possibility grows, the possibilistic entropy decreases (one can reliably compress data to smaller sizes), while the possibilistic capacity increases (one can reliably transmit data at a higher rate). We also put forward an interpretation of possibilistic coding based on distortion measures, and we discuss an application in which possibilities are used to cope with the uncertainty induced by a "vague" linguistic description of the transmission channel.
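To make the source-coding claim concrete, here is a minimal sketch in Python, under the assumption (consistent with Sgarro-style possibilistic entropies, though the paper's exact definitions may differ) that the entropy at tolerated error possibility eps is the Hartley measure, i.e. log2 of the cardinality, of the strict eps-cut {x : pi(x) > eps}: only symbols whose possibility exceeds eps need distinguishable codewords, so the entropy can only decrease as eps grows. The distribution pi below is a made-up toy example.

```python
from math import ceil, log2

def possibilistic_entropy(pi, eps):
    """Hartley measure of the strict eps-cut of a possibility
    distribution pi: only symbols with possibility > eps must be
    encoded distinguishably; the rest may be dropped while keeping
    the decoding-error possibility at most eps."""
    cut = [x for x, p in pi.items() if p > eps]
    return log2(len(cut)) if cut else 0.0

# Toy possibility distribution (normalized: the maximum possibility is 1).
pi = {"a": 1.0, "b": 0.8, "c": 0.5, "d": 0.2}

for eps in (0.0, 0.3, 0.6):
    h = possibilistic_entropy(pi, eps)
    print(f"eps={eps}: entropy={h:.3f} bits, "
          f"fixed-length code of {ceil(h)} bits per symbol suffices")
# eps=0.0 -> log2(4) = 2 bits; eps=0.3 -> log2(3); eps=0.6 -> log2(2) = 1:
# raising the tolerated error possibility shrinks the code, as claimed above.
```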
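On the channel side, a hedged sketch of the connection to graph capacity, assuming a hypothetical reconstruction of the model (not necessarily the paper's): a possibilistic channel assigns transition possibilities channel[x][y], two inputs are eps-confusable when some output can follow both with possibility strictly above eps, and a code decodable with error possibility at most eps is an independent set in the resulting confusability graph. Larger eps removes edges, so the achievable single-use rate log2 of the independence number grows, mirroring the claim that possibilistic capacity increases with the allowed error possibility.

```python
from itertools import combinations
from math import log2

def confusability_edges(channel, eps):
    """Edge between inputs x1, x2 if some output y can follow both
    with possibility strictly greater than eps."""
    edges = set()
    for x1, x2 in combinations(channel, 2):
        outputs = set(channel[x1]) | set(channel[x2])
        if any(min(channel[x1].get(y, 0.0), channel[x2].get(y, 0.0)) > eps
               for y in outputs):
            edges.add((x1, x2))
    return edges

def independence_number(vertices, edges):
    """Brute-force maximum independent set size (fine for toy alphabets)."""
    vs = list(vertices)
    best = 0
    for mask in range(1 << len(vs)):
        chosen = [v for i, v in enumerate(vs) if mask >> i & 1]
        if all((a, b) not in edges and (b, a) not in edges
               for a, b in combinations(chosen, 2)):
            best = max(best, len(chosen))
    return best

# Toy possibilistic channel: channel[x][y] = possibility of output y
# given input x (each row normalized so its maximum is 1).
channel = {
    "0": {"0": 1.0, "1": 0.4},
    "1": {"0": 0.4, "1": 1.0, "2": 0.7},
    "2": {"1": 0.7, "2": 1.0},
}

for eps in (0.0, 0.5):
    edges = confusability_edges(channel, eps)
    alpha = independence_number(channel, edges)
    print(f"eps={eps}: {len(edges)} confusable pairs, "
          f"single-use rate = log2({alpha}) = {log2(alpha):.2f} bits")
# eps=0.0 -> complete graph, rate 0; eps=0.5 -> one edge left, rate 1 bit.
```

The brute-force independence number gives only a one-shot lower bound; graph capacity proper is the limit of (1/n) log2 of the independence number of the n-th strong graph power, which is the combinatorial notion the abstract refers to.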