Constraints as data: a new perspective on inferring probabilities
IJCAI'01 Proceedings of the 17th international joint conference on Artificial intelligence - Volume 1
We take another look at the general problem of selecting a preferred probability measure among those that comply with given constraints. The dominant role that entropy maximization has attained in this context is questioned by arguing that the minimum-information principle on which it is based could be supplanted by an at least equally plausible "likelihood of evidence" principle. We then review a method for turning given selection functions into representation-independent variants, and discuss the trade-offs involved in this transformation.
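To make the selection problem concrete, here is a minimal illustrative sketch (not the paper's own formalism) of entropy maximization among distributions satisfying a constraint: over the faces of a die, the maximum-entropy solution with a fixed mean has Gibbs form p_i ∝ exp(λ·i), and the multiplier λ can be found by bisection. The target mean of 4.5 is an assumed example value.

```python
import math

def maxent_die(target_mean, lo=-10.0, hi=10.0, iters=100):
    """Maximum-entropy distribution over faces 1..6 with E[X] = target_mean.

    The maxent solution has Gibbs form p_i proportional to exp(lam * i);
    we bisect on lam until the induced mean matches the constraint.
    """
    values = range(1, 7)

    def mean_for(lam):
        weights = [math.exp(lam * v) for v in values]
        z = sum(weights)
        return sum(v * w for v, w in zip(values, weights)) / z

    # mean_for is strictly increasing in lam, so bisection converges.
    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    weights = [math.exp(lam * v) for v in values]
    z = sum(weights)
    return [w / z for w in weights]

probs = maxent_die(4.5)
```

The resulting probabilities sum to one, have mean 4.5, and increase monotonically toward the larger faces; a different selection principle, such as the "likelihood of evidence" principle discussed above, would in general pick a different measure from the same constraint set.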