A novel class of interacting Markov chain Monte Carlo (MCMC) algorithms, referred to here as the Parallel Hierarchical Sampler (PHS), is developed and its mixing properties are assessed. PHS algorithms are modular MCMC samplers designed to produce reliable estimates for multi-modal and heavy-tailed posterior distributions. As such, PHS aims to benefit statisticians who, working on a wide spectrum of applications, are more focused on defining and refining models than on constructing sophisticated sampling strategies. Convergence of a vanilla PHS algorithm is proved for the case of Metropolis-Hastings within-chain updates. The accuracy of this PHS kernel is compared with that of optimized single-chain and multiple-chain MCMC algorithms for multi-modal mixtures of multivariate Gaussian densities and for 'banana-shaped' heavy-tailed multivariate distributions. These examples show that PHS can yield a dramatic improvement in the precision of MCMC estimators over standard samplers. PHS is then applied to two realistically complex Bayesian model uncertainty scenarios. First, PHS is used to select a small number of meaningful predictors for a Gaussian linear regression model in the presence of high collinearity. Second, the posterior probability of survival trees approximated by PHS indicates that the number and size of liver metastases at the time of diagnosis are predictive of substantial differences in the survival distributions of colorectal cancer patients.
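To illustrate the general idea of interacting chains with Metropolis-Hastings within-chain updates, the sketch below implements parallel tempering, a related (but simpler) interacting-MCMC scheme, on a well-separated bimodal Gaussian mixture. This is a generic stand-in under stated assumptions, not the authors' PHS kernel: the temperature ladder, proposal scales, and exchange move are illustrative choices, and the hierarchical interaction structure specific to PHS is not reproduced here.

```python
import math
import random

def log_target(x):
    # Log density of a bimodal mixture: 0.5*N(-4, 1) + 0.5*N(4, 1).
    # Computed with a max-shift (log-sum-exp) to avoid underflow.
    a = -0.5 * (x + 4.0) ** 2
    b = -0.5 * (x - 4.0) ** 2
    m = max(a, b)
    return m + math.log(0.5 * math.exp(a - m) + 0.5 * math.exp(b - m))

def parallel_tempering(n_iter=50000, temps=(1.0, 2.0, 4.0, 8.0), seed=7):
    # temps[0] = 1 is the "cold" chain targeting the posterior itself;
    # hotter chains target flattened versions and cross between modes easily.
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in temps]
    cold_samples = []
    for _ in range(n_iter):
        # Within-chain random-walk Metropolis-Hastings update at each temperature.
        for k, t in enumerate(temps):
            prop = xs[k] + rng.gauss(0.0, math.sqrt(t))
            if math.log(rng.random()) < (log_target(prop) - log_target(xs[k])) / t:
                xs[k] = prop
        # Interaction: propose exchanging the states of a random adjacent pair
        # of chains, accepted with the usual tempering swap probability.
        k = rng.randrange(len(temps) - 1)
        log_alpha = (1.0 / temps[k] - 1.0 / temps[k + 1]) * (
            log_target(xs[k + 1]) - log_target(xs[k])
        )
        if math.log(rng.random()) < log_alpha:
            xs[k], xs[k + 1] = xs[k + 1], xs[k]
        cold_samples.append(xs[0])
    return cold_samples
```

A single random-walk chain with a unit proposal scale rarely crosses the barrier between the modes at -4 and 4, whereas the cold chain here inherits mode switches through the exchange moves; the same failure mode of standard samplers motivates the multi-modal Gaussian-mixture comparisons described in the abstract.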