Artificial Intelligence
Given a knowledge base KB containing first-order and statistical facts, we consider a principled method, called the random-worlds method, for computing a degree of belief that some formula Φ holds given KB. If we are reasoning about a world or system consisting of N individuals, then we can consider all possible worlds, or first-order models, with domain {1,..., N} that satisfy KB, and compute the fraction of them in which Φ is true. We define the degree of belief to be the asymptotic value of this fraction as N grows large. We show that when the vocabulary underlying Φ and KB uses constants and unary predicates only, we can naturally associate an entropy with each world. As N grows larger, there are many more worlds with higher entropy. Therefore, we can use a maximum-entropy computation to compute the degree of belief. This result is in a similar spirit to previous work in physics and artificial intelligence, but is far more general. Of equal interest to the result itself are the limitations on its scope. Most importantly, the restriction to unary predicates seems necessary. Although the random-worlds method makes sense in general, the connection to maximum entropy seems to disappear in the non-unary case. These observations suggest unexpected limitations to the applicability of maximum-entropy methods.
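The counting procedure described above can be sketched for a tiny, hypothetical vocabulary; the predicate `P`, constant `c`, and the particular KB below are illustrative choices, not taken from the paper. Here KB says only that at least one individual satisfies `P`, and the query Φ is `P(c)`:

```python
from itertools import product

def degree_of_belief(N):
    """Random-worlds degree of belief for a toy vocabulary: one unary
    predicate P and one constant c, interpreted as individual 1.
    KB (illustrative): at least one individual satisfies P.
    Query Phi: P(c)."""
    # Enumerate all first-order models with domain {1, ..., N}: each world
    # assigns True/False to P for every individual.
    all_worlds = product([False, True], repeat=N)
    # Keep only the worlds that satisfy KB.
    worlds = [w for w in all_worlds if any(w)]
    # Count the worlds in which Phi = P(c) is true.
    satisfying = sum(1 for w in worlds if w[0])
    return satisfying / len(worlds)

# The fraction is 2^(N-1) / (2^N - 1), which tends to 1/2 as N grows;
# that asymptotic value is what the method takes as the degree of belief.
for N in (2, 5, 10, 15):
    print(N, degree_of_belief(N))
```

The limit 1/2 also illustrates the entropy connection in this unary example: worlds are concentrated where the proportion of individuals satisfying `P` is 1/2, since the binomial coefficient (and hence the entropy of that proportion) peaks there.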