Model uncertainty and variable selection in Bayesian lasso regression
Statistics and Computing
We describe an efficient, exact Bayesian algorithm applicable to both variable selection and model averaging problems. A fully Bayesian approach provides a more complete characterization of the posterior ensemble of possible sub-models, but presents a computational challenge as the number of candidate variables increases. While several approximation techniques have been developed for problems with a large number of candidate variables, including BMA, IBMA, MCMC, and Gibbs sampling approaches, here we focus on improving the time complexity of exact inference using a recursive algorithm (Exact Bayesian Inference in Regression, or EBIR) that uses components of one sub-model to rapidly generate another, and we prove that its time complexity is O(m^2), where m is the number of candidate variables. Tests on simulated data show that EBIR significantly reduces compute time without sacrificing accuracy, while comparisons with the results obtained by MCMC approaches on the Crime and Punishment data set show that model averaging yields better predictive performance than two model selection approaches. In addition, we show that finite mixtures of centroid solutions characterize the shape of multimodal posterior spaces better than any individual model. Finally, we describe how the BIC approximations employed in the BMA and IBMA algorithms can be replaced by an EBIR calculation of equal time complexity, and we illustrate how the BIC approximation departs from the exact Bayesian inference of EBIR.
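To make the computational challenge concrete, the sketch below enumerates all 2^m variable subsets and forms posterior model weights and posterior inclusion probabilities. This is a hypothetical brute-force baseline, not the EBIR recursion, and it uses the BIC approximation to the marginal likelihood (the very approximation the abstract says EBIR can replace with an exact calculation); the function name and structure are illustrative assumptions.

```python
import itertools
import numpy as np

def bma_enumeration(X, y):
    """Brute-force Bayesian model averaging over all 2^m variable subsets.

    Illustrative only: uses BIC-based approximate model weights,
    whereas EBIR performs exact inference recursively in O(m^2).
    """
    n, m = X.shape
    models, log_w = [], []
    for k in range(m + 1):
        for subset in itertools.combinations(range(m), k):
            # design matrix for this sub-model: intercept + chosen columns
            Xs = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
            beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
            rss = np.sum((y - Xs @ beta) ** 2)
            # BIC = n*log(RSS/n) + p*log(n); model weight ~ exp(-BIC/2)
            bic = n * np.log(rss / n) + Xs.shape[1] * np.log(n)
            models.append(subset)
            log_w.append(-0.5 * bic)
    log_w = np.array(log_w)
    w = np.exp(log_w - log_w.max())   # stabilize before normalizing
    w /= w.sum()
    # posterior inclusion probability of each candidate variable
    incl = np.zeros(m)
    for subset, wi in zip(models, w):
        for j in subset:
            incl[j] += wi
    return models, w, incl
```

The loop visits all 2^m sub-models, so the cost grows exponentially in m; EBIR's contribution is precisely to avoid this blow-up by reusing components of one sub-model's computation when evaluating another.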