Probabilistic belief contraction has been a much-neglected topic in probabilistic reasoning, largely because of the difficulty of defining a reasonable reversal of the effect of Bayesian conditionalization on a probability distribution. We show that indifferent contraction, the solution Ramer proposed for this problem through a judicious use of the principle of maximum entropy, is a probabilistic version of full meet contraction. We then propose variations of indifferent contraction, based on both the Shannon entropy measure and the Hartley entropy measure, that aim to avoid the excessive loss of beliefs that full meet contraction entails.
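The two entropy measures the abstract contrasts differ in what they are sensitive to: Shannon entropy depends on the full shape of the distribution, while Hartley entropy depends only on how many outcomes receive nonzero probability. A minimal sketch of the standard definitions (the contraction operations themselves are not reconstructed here; the function names and the example distribution are illustrative, not from the paper):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits.
    Terms with p_i = 0 contribute nothing (by convention 0 * log 0 = 0)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def hartley_entropy(p):
    """Hartley entropy H0(p) = log2 |support(p)|.
    Depends only on WHICH outcomes are possible, not on their probabilities."""
    return math.log2(sum(1 for pi in p if pi > 0))

# Illustrative distribution over four worlds, one of which is ruled out.
p = [0.5, 0.25, 0.25, 0.0]
print(shannon_entropy(p))  # 1.5 bits
print(hartley_entropy(p))  # log2(3), about 1.585 bits
```

Note that the uniform distribution over a fixed support maximizes Shannon entropy and makes the two measures coincide; this is the sense in which maximum-entropy constructions such as indifferent contraction spread probability as indifferently as the constraints allow.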