Transformation and weighting in regression
Sampling from the posterior distribution in generalized linear mixed models — Statistics and Computing
Nonparametric regression using linear combinations of basis functions — Statistics and Computing
Predictive performance of Dirichlet process shrinkage methods in linear regression — Computational Statistics & Data Analysis
Automatic Bayesian quantile regression curve fitting — Statistics and Computing
On conditional variance estimation in nonparametric regression — Statistics and Computing
Our article presents a general treatment of the linear regression model in which the error distribution is modelled nonparametrically and the error variances may be heteroscedastic, eliminating the need to transform the dependent variable in many data sets. The mean and variance components of the model may each be parametric or nonparametric, with parsimony achieved through variable selection and model averaging. Inference is Bayesian, with data-based priors so that estimation can be carried out automatically with minimal input from the user. A Dirichlet process mixture prior models the error distribution nonparametrically; when there are no regressors in the model, the method reduces to Bayesian density estimation, and we show that in this case the estimator compares favourably with a well-regarded plug-in density estimator. We also describe a method for checking the fit of the full model. The methodology is applied to a number of simulated and real examples and is shown to work well.
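To make the no-regressor case concrete — where the method reduces to Bayesian density estimation with a Dirichlet process mixture prior — the following is a minimal sketch of fitting a truncated stick-breaking Dirichlet process mixture of normals to a one-dimensional sample by Gibbs sampling. This is a generic illustration of the DP mixture idea, not the paper's actual sampler: the truncation level `K`, the concentration `alpha`, the normal-gamma hyperparameters, and the function names are all hypothetical choices, and for simplicity only the final Gibbs draw is returned rather than an average over posterior draws.

```python
import numpy as np

def dp_mixture_density(y, n_iter=200, K=20, alpha=1.0, seed=0):
    """Fit a truncated stick-breaking Dirichlet process mixture of normals
    to a 1-D sample y by Gibbs sampling. A generic DP-mixture sketch with
    hypothetical hyperparameters, not the authors' exact algorithm."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y, dtype=float)
    n = y.size
    # Conjugate normal-gamma base measure, centred on the data.
    m0, k0, a0, b0 = y.mean(), 0.01, 2.0, 2.0 * y.var()
    mu = rng.normal(m0, y.std(), K)      # component means
    tau = np.ones(K)                     # component precisions
    v = rng.beta(1.0, alpha, K)          # stick-breaking fractions
    for _ in range(n_iter):
        # Mixture weights: w_k = v_k * prod_{j<k} (1 - v_j).
        w = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
        # Sample cluster labels from their conditional distribution
        # (Gumbel-max trick on the unnormalised log-probabilities).
        logp = (np.log(w + 1e-300) + 0.5 * np.log(tau)
                - 0.5 * tau * (y[:, None] - mu) ** 2)
        z = np.argmax(logp + rng.gumbel(size=(n, K)), axis=1)
        # Conjugate normal-gamma update for each component.
        for k in range(K):
            yk = y[z == k]
            nk = yk.size
            if nk > 0:
                ybar = yk.mean()
                kn = k0 + nk
                mn = (k0 * m0 + nk * ybar) / kn
                an = a0 + 0.5 * nk
                bn = (b0 + 0.5 * ((yk - ybar) ** 2).sum()
                      + 0.5 * k0 * nk * (ybar - m0) ** 2 / kn)
            else:  # empty component: draw afresh from the base measure
                kn, mn, an, bn = k0, m0, a0, b0
            tau[k] = rng.gamma(an, 1.0 / bn)
            mu[k] = rng.normal(mn, 1.0 / np.sqrt(kn * tau[k]))
        # Update stick-breaking fractions given the cluster counts.
        counts = np.bincount(z, minlength=K)
        tail = np.concatenate((np.cumsum(counts[::-1])[::-1][1:], [0]))
        v = rng.beta(1.0 + counts, alpha + tail)
    w = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return w, mu, tau

def density(x, w, mu, tau):
    """Evaluate the fitted normal-mixture density at the points in x."""
    x = np.asarray(x, dtype=float)
    return (w * np.sqrt(tau / (2.0 * np.pi))
            * np.exp(-0.5 * tau * (x[:, None] - mu) ** 2)).sum(axis=1)
```

On a bimodal sample the fitted mixture recovers both modes without any bandwidth choice by the user, which is the sense in which such an estimator can be compared against a plug-in kernel density estimator; the truncation level `K` only needs to exceed the number of clusters the data support, since surplus components receive negligible stick-breaking weight.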