We recall the basic idea of an algebraic approach to learning Bayesian network (BN) structures: every BN structure is represented by a certain uniquely determined vector, called the standard imset. The main result of the paper is that the set of standard imsets forms the set of vertices (extreme points) of a certain polytope. Motivated by this geometric view, we introduce the concept of the geometric neighborhood for standard imsets and, consequently, for BN structures. We then show that it always includes the inclusion neighborhood, which was introduced earlier in connection with the greedy equivalence search (GES) algorithm. The third result is that the global optimum of an affine function over the polytope coincides with the local optimum relative to the geometric neighborhood. To illustrate the new concept, we describe the geometric neighborhood in the case of three variables and show that it differs from the inclusion neighborhood. This leads to a simple example of the failure of the GES algorithm when the data are not generated from a perfectly Markovian distribution; the failure can be avoided if the search technique is based on the geometric neighborhood instead. We have also determined the geometric neighborhood in the case of four and five variables.
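To make the representation concrete, the standard imset of a DAG G over a variable set N is the integer vector u_G = δ_N − δ_∅ + Σ_{i∈N} (δ_{pa_G(i)} − δ_{{i}∪pa_G(i)}), indexed by subsets of N, where δ_A is the indicator of the subset A. The sketch below (an illustration, not code from the paper; the function name and the dict-based encoding of the vector are our own choices) computes u_G from a parent map and shows that Markov-equivalent DAGs share the same standard imset:

```python
from collections import defaultdict

def standard_imset(parents):
    """Standard imset u_G = d_N - d_{} + sum_i (d_{pa(i)} - d_{{i} u pa(i)}).

    `parents` maps each node of the DAG G to its set of parents.
    The vector over subsets of N is encoded sparsely as a dict
    {frozenset: nonzero integer coordinate}.
    """
    N = frozenset(parents)
    u = defaultdict(int)
    u[N] += 1                 # + delta_N
    u[frozenset()] -= 1       # - delta_{emptyset}
    for i, pa in parents.items():
        u[frozenset(pa)] += 1           # + delta_{pa(i)}
        u[frozenset(pa) | {i}] -= 1     # - delta_{{i} union pa(i)}
    return {s: v for s, v in u.items() if v != 0}

# Two Markov-equivalent DAGs over {a,b,c}: a -> b -> c and a <- b -> c
u_chain = standard_imset({'a': set(), 'b': {'a'}, 'c': {'b'}})
u_fork  = standard_imset({'a': {'b'}, 'b': set(), 'c': {'b'}})
print(u_chain == u_fork)  # same BN structure, hence the same standard imset
```

Because the map from BN structures to standard imsets is one-to-one, a score-and-search method can work directly with these vectors; the paper's polytope is their convex hull.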