In data mining, association and correlation rules are inferred from data in order to highlight statistical dependencies among attributes. The metrics defined for evaluating these rules can also be used to score relationships between attributes when learning Bayesian networks. In this paper, we propose two novel methods for learning Bayesian networks from data that are based on the K2 learning algorithm and improve it by exploiting parameters normally defined for association and correlation rules. In particular, we propose the algorithms K2-Lift and K2-$X^{2}$, which exploit the lift metric and the $X^{2}$ metric, respectively. We compare K2-Lift and K2-$X^{2}$ with K2 on artificial data and on three test Bayesian networks. The experiments show that both algorithms improve on K2 with respect to the quality of the learned network. Moreover, a comparison of K2-Lift and K2-$X^{2}$ with a genetic algorithm approach on two benchmark networks shows superior results on one network and comparable results on the other.
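To make the two rule metrics concrete, the sketch below (an illustration, not the paper's implementation) computes lift and Pearson's $X^2$ for a candidate dependency between two binary attributes A and B, given their co-occurrence counts; the function names and the example counts are invented for this example.

```python
# Illustrative sketch: the lift and X^2 (chi-square) metrics for a rule
# A -> B, computed from co-occurrence counts over n transactions.
# n_ab = #(A and B), n_a = #(A), n_b = #(B). Names are hypothetical.

def lift(n_ab, n_a, n_b, n):
    """lift(A -> B) = P(A, B) / (P(A) * P(B)); > 1 suggests positive dependence."""
    return (n_ab / n) / ((n_a / n) * (n_b / n))

def chi_square(n_ab, n_a, n_b, n):
    """Pearson's X^2 statistic over the 2x2 contingency table for A and B."""
    total = 0.0
    for a_present in (True, False):
        for b_present in (True, False):
            # Observed count for this cell of the 2x2 table.
            if a_present and b_present:
                observed = n_ab
            elif a_present:
                observed = n_a - n_ab
            elif b_present:
                observed = n_b - n_ab
            else:
                observed = n - n_a - n_b + n_ab
            # Expected count under independence of A and B.
            pa = n_a / n if a_present else 1 - n_a / n
            pb = n_b / n if b_present else 1 - n_b / n
            expected = n * pa * pb
            total += (observed - expected) ** 2 / expected
    return total

# Hypothetical example: 100 transactions, A in 40, B in 50, both in 30.
print(round(lift(30, 40, 50, 100), 2))        # 1.5
print(round(chi_square(30, 40, 50, 100), 2))  # 16.67
```

A K2-style search could use such scores to pre-filter or rank candidate parent attributes before the Bayesian scoring step, which is the role the lift and $X^2$ metrics play in K2-Lift and K2-$X^{2}$.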