MAP complexity results and approximation methods

  • Authors:
  • James D. Park

  • Affiliations:
  • Computer Science Department, University of California, Los Angeles, CA

  • Venue:
  • UAI'02: Proceedings of the Eighteenth Conference on Uncertainty in Artificial Intelligence
  • Year:
  • 2002

Abstract

MAP is the problem of finding a most probable instantiation of a set of variables in a Bayesian network, given some evidence. MAP appears to be a significantly harder problem than the related problems of computing the probability of evidence (Pr) or MPE (a special case of MAP). Because of the complexity of MAP and the lack of viable algorithms to approximate it, MAP computations are generally avoided by practitioners. This paper investigates the complexity of MAP. We show that MAP is complete for NP^PP. We also provide negative complexity results for elimination-based algorithms. It turns out that MAP remains hard even when MPE and Pr are easy. We show that MAP is NP-complete when the networks are restricted to polytrees, and even then cannot be effectively approximated. Because there is no approximation algorithm with guaranteed results, we investigate best-effort approximations. We introduce a generic MAP approximation framework. As one instantiation of it, we implement local search coupled with belief propagation (BP) to approximate MAP. We show how to extract approximate evidence retraction information from belief propagation, which allows us to perform efficient local search. This allows MAP approximation even on networks that are too complex to exactly solve the easier problems of computing Pr or MPE. Experimental results indicate that using BP and local search provides accurate MAP estimates in many cases.
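
The local-search idea in the abstract can be illustrated with a minimal sketch. The three-variable toy network, the brute-force scoring routine, and all function names below are hypothetical stand-ins, not the paper's implementation: the paper scores candidate MAP instantiations with belief propagation and uses approximate evidence retraction to evaluate neighbors efficiently, whereas this sketch sums out the hidden variables exactly and hill-climbs by flipping one MAP variable at a time.

```python
import itertools

# Hypothetical toy Bayesian network over binary variables: A -> B, A -> C.
P_A = {0: 0.6, 1: 0.4}
P_B_given_A = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.8}
P_C_given_A = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.5, (1, 1): 0.5}

def joint(a, b, c):
    """Joint probability P(A=a, B=b, C=c) by the chain rule of the network."""
    return P_A[a] * P_B_given_A[(b, a)] * P_C_given_A[(c, a)]

def score(map_assignment, evidence):
    """P(MAP variables = map_assignment, evidence), summing out the rest.
    Stand-in for the paper's BP-based approximate scoring subroutine."""
    fixed = {**map_assignment, **evidence}
    hidden = [v for v in "ABC" if v not in fixed]
    total = 0.0
    for values in itertools.product([0, 1], repeat=len(hidden)):
        full = {**fixed, **dict(zip(hidden, values))}
        total += joint(full["A"], full["B"], full["C"])
    return total

def local_search_map(map_vars, evidence, init=None):
    """Hill-climbing over instantiations of the MAP variables."""
    current = init or {v: 0 for v in map_vars}
    best = score(current, evidence)
    improved = True
    while improved:
        improved = False
        for v in map_vars:
            neighbor = {**current, v: 1 - current[v]}  # flip one MAP variable
            s = score(neighbor, evidence)
            if s > best:
                current, best, improved = neighbor, s, True
    return current, best

if __name__ == "__main__":
    # MAP over {A, B} given evidence C = 0.
    assignment, prob = local_search_map(["A", "B"], {"C": 0})
    print(assignment, prob)
```

Running the script climbs from the all-zeros instantiation of {A, B} to the best neighboring instantiation it can reach; in the paper's framework, the same loop would query belief propagation for each neighbor's score rather than enumerate the hidden variables exactly.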