It is commonly accepted wisdom that more information is better, and that information should never be ignored. Here we argue, using both a Bayesian and a non-Bayesian analysis, that in some situations you are better off ignoring information if your uncertainty is represented by a set of probability measures. These include situations in which the information is relevant to the prediction task at hand. In the non-Bayesian analysis, we show how ignoring information avoids dilation, the phenomenon in which additional information leads to an increase in uncertainty. In the Bayesian analysis, we show that for small sample sizes and certain prediction tasks, the Bayesian posterior based on a noninformative prior yields worse predictions than simply ignoring the given information.
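The abstract does not include a worked example, but dilation is easy to illustrate with the standard construction due to Seidenfeld and Wasserman: let X be a fair coin, let Z be a coin with unknown bias p in [0, 1], and let Y = X XOR Z. Every measure in the resulting credal set assigns P(Y=1) = 1/2, yet after observing X the conditional probability of Y=1 ranges over the whole interval [0, 1]. The sketch below (an illustration, not code from the paper) verifies this by sweeping p over a grid with exact rational arithmetic:

```python
from fractions import Fraction

# Credal set indexed by the unknown bias p of coin Z.
# X is fair; Y = X XOR Z.

def p_y1(p):
    # Marginally: P(Y=1) = P(X=1)P(Z=0) + P(X=0)P(Z=1)
    #           = (1/2)(1-p) + (1/2)p = 1/2 for every p.
    half = Fraction(1, 2)
    return half * (1 - p) + half * p

def p_y1_given_x1(p):
    # Conditional on X=1, Y=1 iff Z=0, so P(Y=1 | X=1) = 1 - p.
    return 1 - p

grid = [Fraction(i, 100) for i in range(101)]   # p = 0, 0.01, ..., 1

prior = {p_y1(p) for p in grid}                 # single point: {1/2}
posterior = [p_y1_given_x1(p) for p in grid]    # spans all of [0, 1]

print(prior)                                    # {Fraction(1, 2)}
print(min(posterior), max(posterior))           # 0 1
```

Before observing X, uncertainty about Y is the single point 1/2; after observing X, it dilates to the full interval [0, 1]. Ignoring the observation of X keeps the sharper answer, which is the phenomenon the abstract refers to.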