Adaptive independence samplers

  • Authors:
  • Jonathan M. Keith; Dirk P. Kroese; George Y. Sofronov

  • Affiliations:
  • School of Mathematical Sciences, Queensland University of Technology, Brisbane, Australia 4001; Department of Mathematics, The University of Queensland, Brisbane, Australia 4072; School of Mathematics and Applied Statistics, University of Wollongong, Wollongong, Australia 2522

  • Venue:
  • Statistics and Computing
  • Year:
  • 2008

Abstract

Markov chain Monte Carlo (MCMC) is an important computational technique for generating samples from non-standard probability distributions. A major challenge in the design of practical MCMC samplers is to achieve efficient convergence and mixing properties. One way to accelerate convergence and mixing is to adapt the proposal distribution in light of previously sampled points, thus increasing the probability of acceptance. In this paper, we propose two new adaptive MCMC algorithms based on the independence Metropolis-Hastings algorithm. In the first, we adjust the proposal to minimize an estimate of the cross-entropy between the target and proposal distributions, using the experience of pre-runs. This approach provides a general technique for deriving natural adaptive formulae. The second approach uses multiple parallel chains, and involves updating chains individually, then updating a proposal density by fitting a Bayesian model to the population. An important feature of this approach is that adapting the proposal does not change the limiting distributions of the chains. Consequently, the adaptive phase of the sampler can be continued indefinitely. We include results of numerical experiments indicating that the new algorithms compete well with traditional Metropolis-Hastings algorithms. We also demonstrate the method on a realistic problem arising in comparative genomics.
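
The first of the two algorithms can be illustrated with a short, self-contained sketch. The code below is not the authors' implementation: the bimodal target density, the single-parameter Gaussian proposal family, and the pre-run schedule are illustrative assumptions. For a Gaussian proposal family, minimizing the estimated cross-entropy between the target and the proposal reduces to matching the mean and standard deviation of the pre-run samples, which is the adaptive step shown here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1D target (unnormalised, bimodal); chosen for illustration only.
def target(x):
    return np.exp(-0.5 * (x - 2.0) ** 2) + 0.5 * np.exp(-(x + 2.0) ** 2)

def independence_mh(n, mu, sigma, x0=0.0):
    """Independence Metropolis-Hastings with a fixed N(mu, sigma^2) proposal."""
    def q(z):  # proposal density, up to a normalising constant
        return np.exp(-0.5 * ((z - mu) / sigma) ** 2) / sigma

    x, chain = x0, np.empty(n)
    for i in range(n):
        y = rng.normal(mu, sigma)            # proposal ignores the current state
        # Acceptance ratio for an independence sampler: pi(y) q(x) / (pi(x) q(y))
        if rng.random() < min(1.0, target(y) * q(x) / (target(x) * q(y))):
            x = y
        chain[i] = x
    return chain

# Adaptive phase via pre-runs: for a Gaussian family, the cross-entropy
# minimisation amounts to moment matching on the pre-run output.
mu, sigma = 0.0, 5.0
for _ in range(3):
    pre = independence_mh(2000, mu, sigma)
    mu, sigma = pre.mean(), pre.std(ddof=1)

chain = independence_mh(20000, mu, sigma)    # long run with the adapted proposal
print(f"adapted proposal: mu={mu:.2f}, sigma={sigma:.2f}")
```

In this sketch the adaptation stops after a fixed number of pre-runs, so the final chain uses a fixed proposal and standard convergence results apply; the paper's second, population-based algorithm is designed so that adaptation can instead continue indefinitely without altering the limiting distribution.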