On MCMC sampling in hierarchical longitudinal models

  • Authors:
  • Siddhartha Chib; Bradley P. Carlin

  • Affiliations:
  • John M. Olin School of Business, Washington University, One Brookings Drive, St. Louis, Missouri 63130, USA; Division of Biostatistics, School of Public Health, University of Minnesota, Box 303 Mayo Building, Minneapolis, Minnesota 55455, USA

  • Venue:
  • Statistics and Computing
  • Year:
  • 1999

Abstract

Markov chain Monte Carlo (MCMC) algorithms have revolutionized Bayesian practice. In their simplest form (i.e., when parameters are updated one at a time) they are, however, often slow to converge when applied to high-dimensional statistical models. A remedy for this problem is to block the parameters into groups, which are then updated simultaneously using either a Gibbs or Metropolis-Hastings step. In this paper we construct several (partially and fully blocked) MCMC algorithms for minimizing the autocorrelation in MCMC samples arising from important classes of longitudinal data models. We exploit an identity used by Chib (1995) in the context of Bayes factor computation to show how the parameters in a general linear mixed model may be updated in a single block, improving convergence and producing essentially independent draws from the posterior of the parameters of interest. We also investigate the value of blocking in non-Gaussian mixed models, as well as in a class of longitudinal models for binary response data. We illustrate the approaches in detail with three real-data examples.
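
To make the blocking idea concrete, below is a minimal sketch (not code from the paper) of the kind of single-block update the abstract describes for a Gaussian linear mixed model y_i = X_i β + Z_i b_i + e_i with e_i ~ N(0, σ²I) and b_i ~ N(0, D). It uses the factorization p(β, b | y) = p(β | y) p(b | β, y): β is drawn from its posterior with the random effects integrated out analytically, and each b_i is then drawn conditionally, so (β, b) is a joint draw. The function name blocked_update, the prior N(beta_mean0, beta_prec0⁻¹) for β, and the treatment of σ² and D as fixed inputs are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def blocked_update(y, X, Z, sigma2, D, beta_mean0, beta_prec0):
    """One blocked draw of (beta, {b_i}) in a Gaussian linear mixed model.

    beta is sampled from its posterior after integrating out the random
    effects, then each b_i is sampled given that beta, yielding a joint
    draw from p(beta, b | y, sigma2, D). Sketch only; sigma2 and D are
    assumed known here (their updates would be separate Gibbs steps).
    """
    # Marginal model: y_i | beta ~ N(X_i beta, V_i), V_i = sigma2*I + Z_i D Z_i'
    prec = beta_prec0.copy()
    lin = beta_prec0 @ beta_mean0
    for yi, Xi, Zi in zip(y, X, Z):
        Vi_inv = np.linalg.inv(sigma2 * np.eye(len(yi)) + Zi @ D @ Zi.T)
        prec += Xi.T @ Vi_inv @ Xi
        lin += Xi.T @ Vi_inv @ yi
    cov = np.linalg.inv(prec)
    beta = rng.multivariate_normal(cov @ lin, cov)

    # Random effects given beta: standard Gaussian conditional moments
    b = []
    for yi, Xi, Zi in zip(y, X, Z):
        Bi = np.linalg.inv(Zi.T @ Zi / sigma2 + np.linalg.inv(D))
        mi = Bi @ Zi.T @ (yi - Xi @ beta) / sigma2
        b.append(rng.multivariate_normal(mi, Bi))
    return beta, b
```

Within a full sampler this joint draw would alternate with conditional updates for σ² and D. Because β no longer conditions on the b_i, successive β draws are nearly uncorrelated, which is the convergence gain the abstract claims for the fully blocked scheme.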