We describe two techniques that significantly improve the running time of several standard machine-learning algorithms when data is sparse. The first technique is an algorithm that efficiently extracts one-way and two-way counts, either real or expected, from discrete data. Extracting such counts is a fundamental step in learning algorithms for constructing a variety of models, including decision trees, decision graphs, Bayesian networks, and naive-Bayes clustering models. The second technique is an algorithm that efficiently performs the E-step of the EM algorithm (i.e., inference) when applied to a naive-Bayes clustering model. Using real-world data sets, we demonstrate a dramatic decrease in running time for algorithms that incorporate these techniques.
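The core idea behind the first technique can be sketched as follows. This is a minimal illustration under an assumed data layout (each record is a dict listing only its non-default variable values, with 0 as the default), not the paper's actual algorithm or data structures: one pass accumulates counts over the non-default entries only, and counts involving the default value are derived afterwards by subtraction, so the cost scales with the number of non-default entries rather than with the full record size.

```python
from collections import Counter
from itertools import combinations

def build_sparse_counts(records):
    """One pass over the non-default entries of each record.

    Assumed layout: each record is a dict {variable_index: value}
    listing only variables that differ from the default value 0.
    """
    one_way = Counter()   # (i, v)         -> count, with v != 0
    two_way = Counter()   # (i, j, vi, vj) -> count, with i < j and vi, vj != 0
    for record in records:
        items = sorted(record.items())
        for i, v in items:
            one_way[(i, v)] += 1
        for (i, vi), (j, vj) in combinations(items, 2):
            two_way[(i, j, vi, vj)] += 1
    return one_way, two_way

def count1(one_way, num_records, i, v):
    """Count of records with X_i = v; the default count comes by subtraction."""
    if v != 0:
        return one_way[(i, v)]
    return num_records - sum(c for (a, w), c in one_way.items() if a == i)

def count2(one_way, two_way, num_records, i, j, vi, vj):
    """Count of records with X_i = vi and X_j = vj (requires i < j)."""
    if vi != 0 and vj != 0:
        return two_way[(i, j, vi, vj)]
    if vi != 0:  # vj == 0: subtract the non-default vj counts for this vi
        return one_way[(i, vi)] - sum(
            c for (a, b, wa, wb), c in two_way.items()
            if a == i and b == j and wa == vi)
    if vj != 0:  # vi == 0: symmetric case
        return one_way[(j, vj)] - sum(
            c for (a, b, wa, wb), c in two_way.items()
            if a == i and b == j and wb == vj)
    # both defaults: inclusion-exclusion over the non-default counts
    n_i = sum(c for (a, w), c in one_way.items() if a == i)
    n_j = sum(c for (a, w), c in one_way.items() if a == j)
    n_ij = sum(c for (a, b, wa, wb), c in two_way.items() if a == i and b == j)
    return num_records - n_i - n_j + n_ij
```

For example, over the records `[{0: 1, 1: 2}, {0: 1}, {}]` the one pass touches only three entries, yet `count2` can recover any of the nine pairwise counts for the two variables, including those where one or both take the default value.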