We show that various duals that occur in optimization and constraint satisfaction can be classified as inference duals, relaxation duals, or both. We discuss linear programming, surrogate, Lagrangean, superadditive, and constraint duals, as well as duals defined by resolution and filtering algorithms. Inference duals give rise to nogood-based search methods and sensitivity analysis, while relaxation duals provide bounds. This analysis shows that some duals are more closely related than they appear; surrogate and Lagrangean duals are one example. It also reveals common structure among solution methods, such as Benders decomposition and Davis-Putnam-Logemann-Loveland (DPLL) methods with clause learning. Finally, it provides a framework for devising new duals and solution methods, such as generalizations of mini-bucket elimination.
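To make the claim that "relaxation duals provide bounds" concrete, here is a minimal sketch, not taken from the paper: a Lagrangean relaxation of a tiny made-up 0-1 knapsack instance. Dualizing the capacity constraint with a multiplier lam >= 0 yields a separable problem whose value L(lam) upper-bounds the true maximum for every choice of lam, which is exactly the relaxation-dual bounding behavior described above.

```python
# Illustrative sketch (instance data is hypothetical, chosen for clarity):
# a Lagrangean relaxation acting as a relaxation dual for a 0-1 knapsack.
from itertools import product

values = [5, 4, 3]    # objective coefficients c_i
weights = [2, 3, 1]   # constraint coefficients a_i
capacity = 3          # right-hand side b

def brute_force_opt():
    """Exact optimum of max c.x subject to a.x <= b, x in {0,1}^n."""
    best = 0
    for x in product([0, 1], repeat=len(values)):
        if sum(w * xi for w, xi in zip(weights, x)) <= capacity:
            best = max(best, sum(v * xi for v, xi in zip(values, x)))
    return best

def lagrangean_bound(lam):
    """Relaxation-dual value L(lam) = max_x c.x - lam * (a.x - b).

    For any lam >= 0 this upper-bounds the true optimum, since every
    feasible x has a.x - b <= 0, so the penalty term is nonnegative.
    The relaxed problem separates by variable: set x_i = 1 exactly
    when its reduced value c_i - lam * a_i is positive.
    """
    reduced = sum(max(0.0, v - lam * w) for v, w in zip(values, weights))
    return reduced + lam * capacity

opt = brute_force_opt()
print(opt)  # exact optimum of the instance
# The dual bound holds for every multiplier; minimizing over lam
# tightens it (here lam = 2.0 happens to close the gap entirely).
print(min(lagrangean_bound(l) for l in [0.0, 0.5, 1.0, 2.0, 3.0]))
```

Solving the Lagrangean dual means searching over lam for the tightest such bound, e.g. with the subgradient methods the duality literature develops for nondifferentiable optimization.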