A parameterized local consistency for redundant modeling in weighted CSPs
AI'07 Proceedings of the 20th Australian joint conference on Advances in artificial intelligence
In classical constraint satisfaction, combining mutually redundant models with channeling constraints is effective in increasing constraint propagation and reducing the search space for many problems. In this paper, we investigate how to obtain the same benefits for weighted constraint satisfaction problems (WCSPs), a common soft constraint framework for modeling optimization and over-constrained problems. First, we show how to generate a redundant WCSP model from an existing WCSP using generalized model induction. We then uncover why naively combining two WCSPs by posting channeling constraints as hard constraints and relying on the standard NC* and AC* propagation algorithms does not work well. Based on these observations, we propose m-NC*c and m-AC*c and their associated algorithms for effectively enforcing node and arc consistency on a combined model with m sub-models. The two notions are strictly stronger than NC* and AC* respectively. Experimental results confirm that applying the 2-NC*c and 2-AC*c algorithms to combined models reduces more search space and runtime than applying the state-of-the-art AC*, FDAC*, and EDAC* algorithms to single models.