Fitting of non-Gaussian hierarchical random effects models by approximate maximum likelihood can be made automatic to the same extent that Bayesian model fitting is automated by the program BUGS. Here "automatic" means that the technical details of the computation are transparent to the user. This is achieved by combining a technique from computer science known as "automatic differentiation" with the Laplace approximation for calculating the marginal likelihood. Automatic differentiation, which should not be confused with symbolic differentiation, is largely unknown to statisticians, so its basic ideas and results are reviewed. The computational performance of the approach is compared with that of existing mixed-model software on a suite of datasets drawn from the mixed-model literature.
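Since automatic differentiation may be unfamiliar, a minimal illustration of its forward mode may help. The sketch below (not the paper's implementation, which relies on dedicated AD software) uses a hypothetical `Dual` class: each number carries a value and a derivative, and overloaded arithmetic propagates both simultaneously, yielding derivatives that are exact to machine precision rather than finite-difference approximations.

```python
import math

class Dual:
    """Dual number a + b*eps with eps**2 == 0: val is the value,
    dot is the derivative carried alongside it."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        # Product rule: (uv)' = u'v + uv'
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

    def exp(self):
        # Chain rule: (exp u)' = exp(u) * u'
        e = math.exp(self.val)
        return Dual(e, e * self.dot)

def deriv(f, x):
    """Exact derivative of f at x from a single forward-mode pass."""
    return f(Dual(x, 1.0)).dot
```

For example, `deriv(lambda x: x * x, 3.0)` returns 6.0 exactly: the derivative is computed by rule-based propagation through the program, not by symbolic manipulation of an expression or by differencing.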