Probabilistic reasoning suffers from NP-hard implementations. In particular, the amount of probabilistic information needed for the computations is often overwhelming; for example, the size of conditional probability tables in Bayesian networks has long been a limiting factor in the general use of these networks. We present a new approach for manipulating the given probabilistic information. This approach avoids the blow-up by compressing the information with approximation functions called linear potential functions. We can potentially reduce the information from a combinatorial amount to one roughly linear in the number of random variable assignments. Furthermore, we can compute these functions through closed-form equations. As it turns out, our approximation method is quite general and may be applied to other data compression problems.
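The storage saving described above can be illustrated with a minimal sketch. The paper's actual construction of linear potential functions is not reproduced here; this only assumes, for illustration, a linear fit of a CPT over binary parents obtained in closed form via least squares, showing the reduction from 2^n table entries to n + 1 weights.

```python
# Hypothetical sketch: approximating a conditional probability table (CPT)
# with a linear potential function fitted by least squares. Not the paper's
# actual algorithm; it only illustrates the combinatorial-to-linear saving.
import itertools
import numpy as np

n = 10  # number of binary parent variables
rng = np.random.default_rng(0)

# Full CPT: one probability per joint parent assignment -> 2**n entries.
assignments = np.array(list(itertools.product([0, 1], repeat=n)), dtype=float)
cpt = rng.random(2 ** n)

# Linear potential function: p(x) ~ w0 + w1*x1 + ... + wn*xn,
# fitted in closed form by least squares (normal equations).
X = np.hstack([np.ones((2 ** n, 1)), assignments])
w, *_ = np.linalg.lstsq(X, cpt, rcond=None)

approx = X @ w
print("CPT entries stored:", cpt.size)  # combinatorial: 2**n = 1024
print("weights stored:", w.size)        # linear: n + 1 = 11
print("mean absolute error:", float(np.mean(np.abs(approx - cpt))))
```

On a random CPT the fit is of course loose; the point is only that the approximation is computed in one closed-form step and stores n + 1 numbers instead of 2^n.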