A Fast Hill-Climbing Algorithm for Bayesian Networks Structure Learning

  • Authors:
  • José A. Gámez, Juan L. Mateo, José M. Puerta

  • Affiliations:
  • Computing System Department and SIMD-i3A, University of Castilla-La Mancha, 02071 Albacete, Spain (all authors)

  • Venue:
  • ECSQARU '07: Proceedings of the 9th European Conference on Symbolic and Quantitative Approaches to Reasoning with Uncertainty
  • Year:
  • 2007

Abstract

In the score-plus-search approach to Bayesian network structure learning, the most widely used method is hill climbing (HC), because it offers a good trade-off between CPU requirements, accuracy of the obtained model, and ease of implementation. Because of these features, and because HC with the classical operators is guaranteed to return a minimal I-map, this approach is well suited to high-dimensional domains. In this paper we revisit a previously developed HC algorithm (termed constrained HC, or CHC for short) that exploits properties of some scoring metrics to restrict, during the search, the candidate parent set of each node. The main drawback of CHC is that it does not guarantee a minimal I-map, so the algorithm includes a second stage in which an unconstrained HC is launched, taking as its initial solution the network returned by the constrained search stage. In this paper we modify CHC so that its output is guaranteed to be a minimal I-map, making the second stage unnecessary. This saves a considerable amount of CPU time and makes the algorithm better suited to high-dimensional datasets. We provide a proof of the minimal I-map condition of the returned network, and we report computational experiments showing the gain in CPU requirements.
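
To make the setting concrete, the following is a minimal sketch (not the authors' implementation) of score-based hill climbing with the classical operators: arc addition, deletion, and reversal. The `local_score` callback stands for any decomposable scoring metric (e.g., BIC or BDeu) supplied by the caller, and the optional `candidates` map is a hypothetical stand-in for the CHC-style restriction on which nodes may become parents of each node during the search.

```python
# Hedged sketch of score-based hill climbing for BN structure learning.
# `local_score(node, parent_set)` is assumed to be a decomposable metric;
# `candidates[y]` (optional) restricts the allowed parents of node y,
# loosely mimicking the constrained search of CHC.
from itertools import permutations


def creates_cycle(parents, child, new_parent):
    # True if adding new_parent -> child would introduce a directed cycle,
    # i.e., child is already an ancestor of new_parent.
    stack, seen = [new_parent], set()
    while stack:
        node = stack.pop()
        if node == child:
            return True
        if node not in seen:
            seen.add(node)
            stack.extend(parents[node])
    return False


def hill_climb(nodes, local_score, candidates=None):
    # parents[x] is the current parent set of x; start from the empty graph.
    parents = {x: set() for x in nodes}
    score = {x: local_score(x, frozenset(parents[x])) for x in nodes}
    improved = True
    while improved:
        improved = False
        best_delta, best_move = 0.0, None
        for x, y in permutations(nodes, 2):
            if x in parents[y]:
                # Arc deletion x -> y.
                delta = local_score(y, frozenset(parents[y] - {x})) - score[y]
                if delta > best_delta:
                    best_delta, best_move = delta, ("del", x, y)
                # Arc reversal: x -> y becomes y -> x (check acyclicity
                # on the graph with x -> y temporarily removed).
                parents[y].discard(x)
                acyclic = not creates_cycle(parents, x, y)
                parents[y].add(x)
                if acyclic and (candidates is None or y in candidates.get(x, set())):
                    delta = (local_score(y, frozenset(parents[y] - {x})) - score[y]
                             + local_score(x, frozenset(parents[x] | {y})) - score[x])
                    if delta > best_delta:
                        best_delta, best_move = delta, ("rev", x, y)
            else:
                # Arc addition x -> y, respecting the candidate-parent restriction.
                if (candidates is None or x in candidates.get(y, set())) \
                        and not creates_cycle(parents, y, x):
                    delta = local_score(y, frozenset(parents[y] | {x})) - score[y]
                    if delta > best_delta:
                        best_delta, best_move = delta, ("add", x, y)
        if best_move:
            op, x, y = best_move
            if op == "add":
                parents[y].add(x)
            elif op == "del":
                parents[y].discard(x)
            else:  # reversal
                parents[y].discard(x)
                parents[x].add(y)
            # Recompute local scores (only x and y actually change).
            score[y] = local_score(y, frozenset(parents[y]))
            score[x] = local_score(x, frozenset(parents[x]))
            improved = True
    return parents
```

With `candidates=None` this reduces to the unconstrained HC discussed in the abstract; passing a restricted candidate map illustrates, under the stated assumptions, how CHC prunes the neighbourhood explored at each step and thereby saves score evaluations.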