Traditional model selection techniques involve training every candidate model in order to select the one that best balances training performance and expected generalization to new cases. When the number of candidate models is very large, however, training all of them is prohibitive. We present a method to automatically explore a large space of models of varying complexity, organized according to the structure of the example space. In our approach, one model is trained by minimizing a minimum description length (MDL) objective function, and derivatives of the objective with respect to the model parameters, computed over distinct classes of the training data, are analyzed to suggest which model specializations and generalizations are likely to improve performance. This directs a search through the space of candidates that can find a high-performance model while evaluating only a small fraction of the total number of models. We apply our approach to a complex fantasy (American) football prediction domain and demonstrate that it finds high-quality model structures, tailored to the amount of training data available.
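The core idea of the abstract — fit one model, then read per-class derivatives of the objective to decide where the structure should be refined — can be illustrated with a toy sketch. This is not the authors' code: the squared-error objective, the single shared parameter, and the position names are all assumptions made for illustration. The principle it demonstrates is that at the shared optimum the total gradient is near zero, yet a class the shared parameter serves poorly retains a large residual gradient, flagging it as a candidate for a specialized parameter.

```python
def fit_shared(data_by_class):
    """Minimize total squared error with one shared parameter;
    the optimum is simply the mean of all training targets."""
    xs = [x for vals in data_by_class.values() for x in vals]
    return sum(xs) / len(xs)

def class_gradients(theta, data_by_class):
    """Gradient of the squared-error objective w.r.t. the shared
    parameter, evaluated separately on each class's examples.
    These gradients sum to ~0 at the shared optimum, but a class
    the model fits badly keeps a large per-class gradient."""
    return {cls: sum(2 * (theta - x) for x in vals)
            for cls, vals in data_by_class.items()}

def suggest_specialization(data_by_class):
    """Propose giving its own parameter to the class with the
    largest residual gradient -- one step of the derivative-
    directed structure search the abstract describes."""
    theta = fit_shared(data_by_class)
    grads = class_gradients(theta, data_by_class)
    return max(grads, key=lambda c: abs(grads[c]))

# Hypothetical fantasy-point targets per player position (illustrative).
data = {
    "QB": [10.0, 12.0, 11.0],
    "RB": [10.5, 11.5, 10.0],
    "K":  [3.0, 2.5, 3.5],
}
print(suggest_specialization(data))  # -> K
```

Kickers score on a very different scale from quarterbacks and running backs, so their class pulls hardest against the shared parameter and is the first suggested split. A full implementation would also charge each new parameter a description-length penalty, accepting a split only when the fit improvement outweighs the added model complexity.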