Efficient parallel algorithms for computing all possible subset regression models are proposed. The algorithms are based on the dropping-columns method, which generates a regression tree. The properties of the tree are exploited to provide efficient load balancing that requires no inter-processor communication. Theoretical complexity measures suggest linear speedup. The parallel algorithms are extended to the general linear and seemingly unrelated regression models, and the case where new variables are added to the regression model is also considered. Experimental results on a shared-memory machine are presented and analyzed.
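The dropping-columns idea can be illustrated with a minimal sketch: a tree whose root is the full model and whose children delete one column each, with the restriction that a child may only go on deleting columns to the left of the one just dropped, so every non-empty subset is visited exactly once. The sketch below is a serial, pure-Python illustration of that enumeration (the function names `all_subsets` and `rss`, the small Gaussian-elimination solver, and the particular tree indexing are our assumptions for exposition, not necessarily the authors' exact algorithm, which uses QR-based updates):

```python
def solve(A, b):
    # Gaussian elimination with partial pivoting for small dense systems.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def rss(X, y, cols):
    # Residual sum of squares of the least-squares fit on the given columns,
    # via the normal equations (the paper's method works on a QR instead).
    n, k = len(y), len(cols)
    Xs = [[row[c] for c in cols] for row in X]
    XtX = [[sum(Xs[i][a] * Xs[i][b] for i in range(n)) for b in range(k)]
           for a in range(k)]
    Xty = [sum(Xs[i][a] * y[i] for i in range(n)) for a in range(k)]
    beta = solve(XtX, Xty)
    return sum((y[i] - sum(Xs[i][a] * beta[a] for a in range(k))) ** 2
               for i in range(n))

def all_subsets(X, y):
    # Regression tree of the dropping-columns method: each node evaluates its
    # column set, then spawns one child per droppable position i < d; the
    # child inherits d' = i, so each non-empty subset appears exactly once.
    p = len(X[0])
    results = {}
    stack = [(tuple(range(p)), p)]
    while stack:
        cols, d = stack.pop()
        results[cols] = rss(X, y, cols)
        for i in range(d):
            child = cols[:i] + cols[i + 1:]
            if child:
                stack.append((child, i))
    return results
```

With `p` variables the tree has `2**p - 1` nodes, one per non-empty subset; the parallel algorithms in the paper partition the subtrees across processors so that, by the tree's structure, each processor's share of work is balanced without communication.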