Learning first-order Bayesian networks

  • Authors:
  • Ratthachat Chatpatanasiri; Boonserm Kijsirikul

  • Affiliations:
  • Department of Computer Engineering, Chulalongkorn University, Pathumwan, Bangkok, Thailand (both authors)

  • Venue:
  • AI'03: Proceedings of the 16th Canadian Society for Computational Studies of Intelligence Conference on Advances in Artificial Intelligence
  • Year:
  • 2003

Abstract

A first-order Bayesian network (FOBN) extends first-order logic to cope with uncertainty, so learning an FOBN is an attractive way to build an effective classifier. However, because of the complexity of the FOBN, learning it directly from relational data is difficult. This paper proposes an alternative way to learn FOBN classifiers: we adapt Inductive Logic Programming (ILP) and a Bayesian network learner to construct the FOBN. To do this, we propose a feature extraction algorithm that generates the significant parts (features) of ILP rules and uses these features as the main structure of the induced FOBN. Then, so that a standard Bayesian network learner can learn the remaining parts of the FOBN structure and its conditional probability tables, we also propose an efficient propositionalisation algorithm that translates the original relational data into a single-table format. We provide a preliminary evaluation on the mutagenesis problem, a standard dataset for relational learning. The results are compared with a state-of-the-art ILP learner, the PROGOL system.
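The propositionalisation step mentioned in the abstract can be illustrated with a minimal sketch. Here each ILP-derived feature is treated as a boolean test over one example's relational facts, and the output is a single table whose columns are those tests plus the class label. The molecule data, feature names, and predicates below are hypothetical illustrations, not the paper's actual features or dataset.

```python
# Hypothetical sketch of propositionalisation: each ILP-derived feature
# (a first-order condition) becomes one boolean column in a single table.

# Toy relational data: molecules with atom facts (atom id, element, charge)
# and an activity label. These examples are invented for illustration.
molecules = {
    "m1": {"atoms": [("a1", "c", 0.2), ("a2", "o", -0.4)], "active": True},
    "m2": {"atoms": [("a1", "h", 0.1)], "active": False},
}

# Hypothetical features extracted from ILP rules: each is a predicate
# evaluated against one example's relational facts.
features = {
    "has_carbon": lambda m: any(elem == "c" for _, elem, _ in m["atoms"]),
    "has_neg_charge": lambda m: any(q < 0 for _, _, q in m["atoms"]),
}

def propositionalise(examples, feats):
    """Translate relational examples into rows of a single boolean table."""
    table = []
    for name, ex in sorted(examples.items()):
        row = {"id": name}
        for fname, test in feats.items():
            row[fname] = test(ex)  # one boolean column per ILP feature
        row["class"] = ex["active"]
        table.append(row)
    return table

table = propositionalise(molecules, features)
# table[0] → {'id': 'm1', 'has_carbon': True, 'has_neg_charge': True, 'class': True}
```

The resulting table can be handed directly to a standard (propositional) Bayesian network learner, which is the role this step plays in the pipeline the abstract describes.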