Deciding whether one probability distribution is more informative (in the sense of representing a less indeterminate situation) than another is typically done using well-established information measures such as the Shannon entropy or other dispersion indices. In contrast, the relative specificity of possibility distributions is evaluated by means of fuzzy set inclusion. In this paper, we propose a technique for comparing probability distributions with respect to their relative dispersion without resorting to a numerical index. We introduce a natural partial ordering of probability functions in terms of relative "peakedness", which is closely related to first-order stochastic dominance. There is also a close connection between this ordering on probability distributions and the standard specificity ordering on possibility distributions, obtained by means of a known probability-possibility transformation. The paper gives a direct proof that the (total) preordering defined on probability measures by probabilistic entropy refines the (partial) ordering defined by possibilistic specificity. This result, which also holds for other dispersion indices, is discussed against the background of related work in statistics, mathematics (inequalities on convex functions), and the social sciences. Finally, we propose an application of the possibilistic specificity ordering in machine learning, more specifically to the induction of decision forests.
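As an informal illustration of the comparison described above, the following Python sketch applies the standard probability-possibility transformation (rank the outcomes by decreasing probability mass and take cumulative tail sums) and then compares the resulting possibility distributions pointwise. The helper names are ours, and the example distributions are chosen only to exhibit the agreement between the specificity ordering and the entropy ordering; this is not the paper's proof, just a worked instance.

```python
import math

def to_possibility(p):
    """Transform a probability distribution into a possibility distribution.

    Masses are sorted in decreasing order; the possibility degree of the
    outcome of rank i is the sum of the masses of ranks i, i+1, ..., n
    (so the most probable outcome always gets possibility 1).
    """
    ranked = sorted(p, reverse=True)
    tails, total = [], 0.0
    for mass in reversed(ranked):
        total += mass
        tails.append(total)
    return list(reversed(tails))

def at_least_as_specific(pi1, pi2):
    """Fuzzy-set inclusion: pi1 is at least as specific as pi2
    if pi1 <= pi2 pointwise (over outcomes of matching rank)."""
    return all(a <= b + 1e-12 for a, b in zip(pi1, pi2))

def shannon_entropy(p):
    return -sum(x * math.log(x) for x in p if x > 0)

# A more peaked and a less peaked distribution over three outcomes.
p = [0.6, 0.3, 0.1]
q = [0.4, 0.35, 0.25]

pi_p = to_possibility(p)   # [1.0, 0.4, 0.1]
pi_q = to_possibility(q)   # [1.0, 0.6, 0.25]

# p is more peaked, hence its possibilistic counterpart is more specific ...
print(at_least_as_specific(pi_p, pi_q))        # True
# ... and, consistently, its Shannon entropy is lower.
print(shannon_entropy(p) < shannon_entropy(q)) # True
```

The converse direction does not hold in general: entropy yields a total preordering, so it can rank two distributions whose transformed possibility distributions are incomparable under pointwise inclusion, which is exactly why entropy is said to refine the specificity ordering rather than coincide with it.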