Conditioning is the generally agreed-upon method for updating a probability distribution when one learns that an event is certainly true. But it has been argued that we need other rules, in particular the rule of cross-entropy minimization, to handle updates that involve uncertain information. In this paper we re-examine one such case: van Fraassen's Judy Benjamin problem [1987], which in essence asks how one might update given the value of a conditional probability. We argue that, contrary to suggestions in the literature, it is possible to use simple conditionalization in this case, and thereby obtain answers that agree fully with intuition. This contrasts with proposals such as cross-entropy minimization, which are easier to apply but can give unsatisfactory answers. Based on the lessons from this example, we speculate on some general philosophical issues concerning probability update.
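To make the contrast concrete, the following sketch works through the cross-entropy update in the standard three-cell formulation of the Judy Benjamin problem (Blue territory, Red HQ area, Red Second Company area, with a uniform prior of 1/2, 1/4, 1/4), under the learned constraint P(HQ | Red) = 3/4. The cell names and the closed-form solution step are illustrative choices, not taken from the paper itself:

```python
import math

# Uniform prior over the three cells of the Judy Benjamin problem:
# Blue territory 1/2, Red HQ area 1/4, Red Second Company area 1/4.
prior = {"blue": 0.5, "red_hq": 0.25, "red_2nd": 0.25}

def kl(c):
    """KL divergence from the prior of the posterior with Blue mass c.

    The constraint P(red_hq | red) = 3/4 forces the posterior to split
    the Red mass (1 - c) as 3/4 vs. 1/4, so c is the only free parameter.
    """
    q = {"blue": c, "red_hq": 0.75 * (1 - c), "red_2nd": 0.25 * (1 - c)}
    return sum(q[k] * math.log(q[k] / prior[k]) for k in q)

# Setting d/dc KL(c) = 0 gives ln(2c / (1 - c)) = (3/4) ln 3, hence:
c_star = 3 ** 0.75 / (2 + 3 ** 0.75)

print(round(c_star, 3))  # cross-entropy posterior P(Blue) ~ 0.533
```

The point of the example is that minimizing cross-entropy raises P(Blue) above its prior value of 1/2 even though the learned constraint says nothing about Blue territory, which is exactly the kind of counterintuitive answer the abstract alludes to; the conditioning approach defended in the paper keeps P(Blue) = 1/2.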