Many real-world problems can be modeled using a combination of hard and soft constraints. Markov Logic is a highly expressive language that represents the underlying constraints by attaching real-valued weights to formulas in first-order logic. The weight of a formula represents the strength of the corresponding constraint; hard constraints are represented as formulas with infinite weight. The theory is compiled into a ground Markov network over which probabilistic inference is performed. For many problems, hard constraints pose a significant challenge to the probabilistic inference engine. However, solving the hard constraints (partially or fully) beforehand, outside of the probabilistic engine, can greatly simplify the ground Markov network and speed up probabilistic inference. In this work, we propose a generalized arc consistency algorithm that prunes the domains of predicates by propagating hard constraints. Our algorithm effectively performs unit propagation at a lifted level, avoiding the need to explicitly ground the hard constraints during the pre-processing phase and yielding potentially exponential savings in space and time. Our approach results in much simplified domains, making inference significantly more efficient in both time and memory. Experimental evaluation on one artificial and two real-world datasets shows the benefit of our approach.
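To make the pre-processing step concrete, the sketch below shows plain ground-level unit propagation over hard clauses, which is the operation the proposed algorithm performs at the lifted level without enumerating groundings. It is a minimal illustration under simplifying assumptions, not the paper's implementation: literals are strings with a leading "!" for negation, and the predicate and constant names (Smokes, Cancer, A) are hypothetical.

```python
# Minimal sketch: ground-level unit propagation over hard clauses.
# Atoms are strings such as "Smokes(A)"; a leading "!" marks negation.
# All predicate/constant names are illustrative, not from the paper.

def neg(lit):
    """Return the complement of a literal."""
    return lit[1:] if lit.startswith("!") else "!" + lit

def unit_propagate(hard_clauses):
    """Fix atoms forced by unit hard clauses and simplify the clause set.

    Returns (assignment, remaining_clauses): `assignment` maps each forced
    atom to True/False; `remaining_clauses` is what the probabilistic
    engine still has to handle. Raises ValueError if the hard theory is
    unsatisfiable (an empty clause is derived).
    """
    clauses = [set(c) for c in hard_clauses]
    assignment = {}
    while True:
        # Pick any unit clause; its single literal is forced to hold.
        unit = next((c for c in clauses if len(c) == 1), None)
        if unit is None:
            return assignment, clauses
        lit = next(iter(unit))
        assignment[lit.lstrip("!")] = not lit.startswith("!")
        simplified = []
        for c in clauses:
            if lit in c:
                continue           # clause satisfied: drop it
            c = c - {neg(lit)}     # falsified literal: remove it
            if not c:
                raise ValueError("hard constraints are unsatisfiable")
            simplified.append(c)
        clauses = simplified

# Hard clauses: Smokes(A), and !Smokes(x) v Cancer(x) grounded over the
# single constant A. Propagation fixes both atoms, so neither reaches the
# ground Markov network handed to the probabilistic inference engine.
assignment, rest = unit_propagate([
    {"Smokes(A)"},
    {"!Smokes(A)", "Cancer(A)"},
])
print(assignment)  # {'Smokes(A)': True, 'Cancer(A)': True}
print(rest)        # []
```

The lifted version reaches the same fixings while reasoning over whole sets of groundings at once; for example, a unit hard clause with a free variable fixes every grounding of its predicate in a single step, which is the source of the potentially exponential savings claimed above.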