Markov Chain Monte Carlo using Tree-Based Priors on Model Structure
UAI '01 Proceedings of the 17th Conference in Uncertainty in Artificial Intelligence
A general method for defining informative priors on statistical models is presented and applied to the space of classification and regression trees. We take a Bayesian approach to learning such models from data, using the Metropolis-Hastings algorithm to sample approximately from the posterior. Because the proposal distributions are closely tied to the prior, acceptance probabilities reduce to marginal likelihood ratios and are easily computable, whatever prior is used. The approach is tested empirically by varying (i) the data, (ii) the prior, and (iii) the proposal distribution, and is compared with related work.
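To illustrate the key simplification described in the abstract, here is a minimal sketch (not the paper's implementation) of an independence Metropolis-Hastings sampler whose proposal distribution *is* the prior. In that case the Hastings ratio p(D|M')p(M')q(M) / (p(D|M)p(M)q(M')) collapses to the marginal likelihood ratio p(D|M')/p(D|M), since the prior and proposal terms cancel. The three-element model space, its prior, and the marginal likelihoods below are hypothetical toy values chosen only for demonstration.

```python
import random

# Hypothetical toy model space with an illustrative prior and
# assumed marginal likelihoods p(D | M); not from the paper.
PRIOR = {"small_tree": 0.5, "medium_tree": 0.3, "large_tree": 0.2}
MARGINAL_LIK = {"small_tree": 0.1, "medium_tree": 0.4, "large_tree": 0.2}


def sample_prior(rng):
    """Draw a model from the prior; here the proposal equals the prior."""
    r = rng.random()
    acc = 0.0
    for m, p in PRIOR.items():
        acc += p
        if r < acc:
            return m
    return m  # guard against floating-point round-off


def mh_from_prior(n_iter=50000, seed=0):
    """Independence MH sampler proposing from the prior.

    With q(M') = p(M'), the acceptance probability is simply
        alpha = min(1, p(D | M') / p(D | M)),
    a marginal likelihood ratio, as described in the abstract.
    """
    rng = random.Random(seed)
    current = sample_prior(rng)
    counts = {m: 0 for m in PRIOR}
    for _ in range(n_iter):
        proposal = sample_prior(rng)
        alpha = min(1.0, MARGINAL_LIK[proposal] / MARGINAL_LIK[current])
        if rng.random() < alpha:
            current = proposal
        counts[current] += 1
    # Empirical posterior frequencies over the model space.
    return {m: c / n_iter for m, c in counts.items()}
```

Since the posterior is proportional to prior times marginal likelihood, the chain should visit `medium_tree` most often (0.3 x 0.4 = 0.12, the largest product), even though `small_tree` has the highest prior mass.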