The basic structure of algorithms for the numerical computation of optimal approximate linear regression designs is briefly summarized. First order methods are contrasted with second order methods. A first order method, also called a vertex direction method, uses a local linear approximation of the optimality criterion at the current point. A second order method is a Newton or quasi-Newton method, employing a local quadratic approximation. A specific application is given to a multiple first order regression model on a cube with heteroscedasticity caused by random coefficients with known dispersion matrix. For a general (positive definite) dispersion matrix the algorithms work for cubes of moderate dimension. If the dispersion matrix is diagonal, the restriction to invariant designs is justified by the equivariance of the model, and the algorithms then also work in high dimensions.
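To illustrate the first order (vertex direction) idea, the following is a minimal sketch, not the paper's algorithm: it assumes a homoscedastic model, the D-criterion, and a finite candidate grid on the cube, whereas the paper treats a heteroscedastic random-coefficient setting. The function name `d_optimal_vertex_direction`, the step-size rule, and all numerical settings are illustrative assumptions.

```python
import numpy as np

def d_optimal_vertex_direction(X, n_iter=500, tol=1e-6):
    """First order (vertex direction) method for a D-optimal approximate design.

    X : (N, p) array of candidate regression vectors f(x_i).
    Returns design weights w (length N) summing to one.
    Sketch only: homoscedastic model, D-criterion, finite candidate set.
    """
    N, p = X.shape
    w = np.full(N, 1.0 / N)            # start from the uniform design
    for _ in range(n_iter):
        M = X.T @ (w[:, None] * X)     # information matrix M(w) = sum_i w_i f_i f_i^T
        Minv = np.linalg.inv(M)
        d = np.einsum('ij,jk,ik->i', X, Minv, X)  # variance function d(x_i, w)
        j = np.argmax(d)               # candidate point of steepest ascent
        if d[j] <= p * (1.0 + tol):    # equivalence theorem: optimal iff max_i d_i <= p
            break
        # move toward the one-point design at x_j; optimal step for the D-criterion
        alpha = (d[j] - p) / (p * (d[j] - 1.0))
        w = (1.0 - alpha) * w
        w[j] += alpha
    return w

if __name__ == "__main__":
    # Hypothetical usage: first order model f(x) = (1, x1, ..., xk) on a
    # grid {-1, 0, 1}^k of the cube (k = 3 here); weight accumulates on the vertices.
    k = 3
    pts = np.array(np.meshgrid(*([[-1.0, 0.0, 1.0]] * k))).reshape(k, -1).T
    X = np.hstack([np.ones((pts.shape[0], 1)), pts])
    print(np.round(d_optimal_vertex_direction(X), 3))
```

As the paper's contrast with second order methods suggests, such vertex direction steps are cheap per iteration but converge only sublinearly; a Newton or quasi-Newton method would instead update all weights at once using a local quadratic approximation of the criterion.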