We estimate the parameters of a discrete-time hidden Markov model with two latent states and two observed states through a Bayesian approach. We provide a Gibbs sampling algorithm for longitudinal data that ensures parameter identifiability, and we examine two ways of initializing the algorithm. The first generates the initial latent data from transition probability estimates computed under the false assumption of perfect classification. The second requires an initial guess of the classification probabilities and derives bias-adjusted approximate estimators of the latent transition probabilities from the observed data; these probabilities are then used to generate the initial latent data set from the observed data set. Both approaches are illustrated on medical data, and the performance of the estimates is examined through simulation studies. The bias-adjusted approach is the better of the two, since it generates a plausible initial latent data set. Our setting is particularly relevant to diagnostic testing, where specifying a range of plausible classification rates may be more feasible than specifying initial values for the transition probabilities.
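To make the model concrete, the following is a minimal toy Gibbs sampler for a two-state hidden Markov chain observed through a misclassification layer. It is an illustrative sketch under stated assumptions, not the authors' algorithm: flat Beta(1,1) priors on every probability, a naive "trust the observations" start (the perfect-classification starting strategy mentioned above), and all names (`gibbs_hmm_2state`, `trans`, `emit`) are hypothetical.

```python
import random

def gibbs_hmm_2state(y, n_iter=500, burn=100, seed=0):
    """Toy Gibbs sampler for a 2-state hidden Markov chain whose states
    are observed with misclassification. Hypothetical Beta(1,1) priors
    on transition and classification probabilities (an assumption, not
    the paper's specification)."""
    rng = random.Random(seed)
    T = len(y)
    z = list(y)                        # naive start: take observations as latent truth
    trans = [[0.5, 0.5], [0.5, 0.5]]   # trans[s][s'] = P(Z_{t+1}=s' | Z_t=s)
    emit = [[0.8, 0.2], [0.2, 0.8]]    # emit[s][o]  = P(Y_t=o | Z_t=s)
    keep = []
    for it in range(n_iter):
        # (a) Sample each latent state given its neighbours and its datum.
        for t in range(T):
            w = []
            for s in (0, 1):
                p = emit[s][y[t]]
                if t > 0:
                    p *= trans[z[t - 1]][s]
                if t < T - 1:
                    p *= trans[s][z[t + 1]]
                w.append(p)
            z[t] = 1 if rng.random() * (w[0] + w[1]) < w[1] else 0
        # (b) Conjugate Beta updates for each transition-matrix row.
        n = [[0, 0], [0, 0]]
        for t in range(1, T):
            n[z[t - 1]][z[t]] += 1
        for s in (0, 1):
            trans[s][1] = rng.betavariate(1 + n[s][1], 1 + n[s][0])
            trans[s][0] = 1 - trans[s][1]
        # (c) Conjugate Beta updates for the classification probabilities.
        m = [[0, 0], [0, 0]]
        for t in range(T):
            m[z[t]][y[t]] += 1
        for s in (0, 1):
            emit[s][1] = rng.betavariate(1 + m[s][1], 1 + m[s][0])
            emit[s][0] = 1 - emit[s][1]
        if it >= burn:
            keep.append(trans[0][1])
    return sum(keep) / len(keep)       # posterior mean of P(Z: 0 -> 1)
```

Note that with fully flat priors this toy sampler is vulnerable to label switching; the paper's identifiability constraints (not reproduced here) and an informed initial latent data set are what keep a real analysis anchored to one labelling.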