Bayesian modeling of several covariance matrices and some results on propriety of the posterior for linear regression with correlated and/or heterogeneous errors

  • Authors:
  • Michael J. Daniels

  • Affiliations:
  • Department of Statistics, University of Florida, 207 Griffin-Floyd Hall, Gainesville, FL 32611-8545, USA

  • Venue:
  • Journal of Multivariate Analysis
  • Year:
  • 2006

Abstract

We explore simultaneous modeling of several covariance matrices across groups using the spectral (eigenvalue) decomposition and the modified Cholesky decomposition. We introduce several models for covariance matrices under different assumptions about the mean structure. The 'dependence' matrices, which tend to have many parameters, are modeled either as constant across groups and/or parsimoniously via a regression formulation. For the 'variances', we consider both models that are unrestricted across groups and more parsimonious log-linear models. In all these models, we explore the propriety of the posterior when improper priors are used on the mean and 'variance' parameters (and, in some cases, on components of the 'dependence' matrices). The models examined include, as special cases, several common Bayesian regression models whose posterior propriety has not previously been explored. We propose a simple approach to weaken the assumption of constant dependence matrices in an automated fashion and describe how to compute Bayes factors to test the hypothesis of constant 'dependence' across groups. The models are applied to data from two longitudinal clinical studies.
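
To make the two decompositions named in the abstract concrete, the sketch below (in Python with NumPy; the function names, example data, and verification steps are illustrative and not taken from the paper) computes the modified Cholesky decomposition, which separates a covariance matrix into a unit lower-triangular 'dependence' matrix of generalized autoregressive parameters and a diagonal matrix of innovation 'variances', and the spectral decomposition, whose 'dependence' component is the orthogonal eigenvector matrix and whose 'variances' are the eigenvalues.

```python
import numpy as np

def modified_cholesky(sigma):
    """Modified Cholesky decomposition: T @ sigma @ T.T = D.

    T is unit lower triangular; its below-diagonal entries are the negatives
    of the generalized autoregressive parameters (the 'dependence' matrix),
    and D is diagonal and holds the innovation variances (the 'variances').
    """
    L = np.linalg.cholesky(sigma)    # sigma = L @ L.T, L lower triangular
    C = L / np.diag(L)               # rescale columns -> unit lower triangular
    T = np.linalg.inv(C)             # then T @ sigma @ T.T is diagonal
    D = np.diag(np.diag(L) ** 2)     # innovation variances
    return T, D

def spectral(sigma):
    """Spectral (eigenvalue) decomposition: sigma = P @ Lam @ P.T.

    P (orthogonal eigenvectors) plays the role of the 'dependence' matrix,
    and the eigenvalues in Lam play the role of the 'variances'.
    """
    eigvals, P = np.linalg.eigh(sigma)
    return P, np.diag(eigvals)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 4))
    sigma = np.cov(X, rowvar=False)  # a sample covariance matrix

    T, D = modified_cholesky(sigma)
    P, Lam = spectral(sigma)
    print(np.allclose(T @ sigma @ T.T, D))    # True
    print(np.allclose(P @ Lam @ P.T, sigma))  # True
```

In these terms, the constant-'dependence' assumption discussed in the abstract amounts to a common T (or P) shared across groups, with the diagonal 'variance' components (D or the eigenvalues) allowed to differ by group or modeled through log-linear regressions.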