Understanding the role of noise in stochastic local search: Analysis and experiments
Artificial Intelligence
Heuristic search and uncertain information constitute two central themes in artificial intelligence. This article assumes that the uncertain information is represented in Bayesian networks, and introduces and investigates a novel stochastic heuristic local search algorithm, Stochastic Greedy Search, which searches for a most probable explanation in a Bayesian network. Compared to existing local search algorithms, the innovative aspects of Stochastic Greedy Search are some of its measures of gain, its noise operators, its initialization operators, and an operator-based variant. Stochastic Greedy Search uses noisy steps that allow local search to escape local optima. We introduce different measures of gain (or gradient) and an operator-based approach, giving several ways to search locally. We also introduce two novel dynamic-programming-based initialization algorithms, denoted forward and backward dynamic programming. These initialization algorithms start the local search at points closer to local optima than an explanation generated uniformly at random. Comparisons with the state-of-the-art inference algorithm Hugin show that Stochastic Greedy Search performs significantly better on Bayesian networks from applications as well as on synthetically generated networks. On synthetic networks, Stochastic Greedy Search speeds up computation by up to three orders of magnitude compared to Hugin. On application networks, our initialization algorithms, which compute the most probable explanation in bounding cases, prove to be very valuable.
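The core idea described above, greedy hill climbing over explanations interleaved with noisy steps that allow escape from local optima, can be illustrated with a minimal sketch. This is not the authors' implementation: the two-node network A → B, its conditional probability tables, and the parameter values are illustrative assumptions chosen so that the greedy search alone can get trapped in a local optimum.

```python
import random

# Toy Bayesian network A -> B over binary variables (illustrative CPTs).
p_a = {0: 0.3, 1: 0.7}
p_b_given_a = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8}

def joint(x):
    """Joint probability of the full assignment x = (a, b)."""
    a, b = x
    return p_a[a] * p_b_given_a[(a, b)]

def sls_mpe(steps=200, noise=0.2, seed=0):
    """Stochastic local search for a most probable explanation (sketch).

    With probability `noise`, take a random flip (the noisy step that
    escapes local optima); otherwise take the greedy flip with the
    largest gain in joint probability.
    """
    rng = random.Random(seed)
    x = [rng.randint(0, 1), rng.randint(0, 1)]  # random initial explanation
    best = list(x)
    for _ in range(steps):
        if rng.random() < noise:
            # Noisy step: flip a uniformly chosen variable.
            i = rng.randrange(len(x))
            x[i] = 1 - x[i]
        else:
            # Greedy step: flip the variable with the largest gain.
            gains = []
            for i in range(len(x)):
                y = list(x)
                y[i] = 1 - y[i]
                gains.append((joint(tuple(y)) - joint(tuple(x)), i))
            gain, i = max(gains)
            if gain > 0:
                x[i] = 1 - x[i]
        if joint(tuple(x)) > joint(tuple(best)):
            best = list(x)
    return tuple(best), joint(tuple(best))
```

Note that in this toy network the assignment (0, 0) is a local optimum (both single flips lower the joint probability), so a purely greedy search started there stalls; the noisy steps are what let the search reach the global optimum (1, 1).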