A routine for converting regression algorithms into corresponding orthogonal regression algorithms

  • Authors:
  • Larry Ammann; John Van Ness

  • Affiliations:
  • Univ. of Texas at Dallas, Richardson (both authors)

  • Venue:
  • ACM Transactions on Mathematical Software (TOMS)
  • Year:
  • 1988


Abstract

The routine converts any standard regression algorithm (that calculates both the coefficients and residuals) into a corresponding orthogonal regression algorithm. Thus, a standard, or robust, or L1 regression algorithm is converted into the corresponding standard, or robust, or L1 orthogonal algorithm. Such orthogonal procedures are important for three basic reasons. First, they solve the classical errors-in-variables (EV) regression problem. Standard L2 orthogonal regression, obtained by converting ordinary least squares regression, is the maximum likelihood solution of the EV problem under Gaussian assumptions. However, this L2 solution is known to be unstable under even slight deviations from the model. Thus this routine's ability to create robust orthogonal regression algorithms from robust ordinary regression algorithms will also be very useful in practice. Second, orthogonal regression is intimately related to principal components procedures. Therefore, this routine can also be used to create corresponding L1, robust, etc., principal components algorithms. And third, orthogonal regression treats the x and y variables symmetrically. This is very important in many science and engineering modeling problems. Monte Carlo studies, which test the effectiveness of the routine under a variety of types of data, are given.
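As the abstract notes, orthogonal regression is intimately related to principal components: in the L2 case, the fitted line is the first principal axis of the centered data, and the sum of squared orthogonal distances is minimized along the last principal axis. The sketch below illustrates that relationship for a simple line fit. It is not the authors' conversion routine (which wraps an arbitrary regression algorithm); it is the closed-form L2 special case computed via SVD, with the function name chosen here for illustration.

```python
import numpy as np

def orthogonal_regression_l2(x, y):
    """Standard L2 orthogonal regression line y = a + b*x,
    computed via principal components (SVD) of the centered data.
    Illustrative sketch only; not the Ammann-Van Ness routine."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # The L2 orthogonal regression line passes through the centroid.
    mx, my = x.mean(), y.mean()
    data = np.column_stack((x - mx, y - my))
    # Right singular vectors are the principal axes; the first one is the
    # direction minimizing the sum of squared orthogonal distances.
    _, _, vt = np.linalg.svd(data, full_matrices=False)
    dx, dy = vt[0]
    slope = dy / dx            # assumes the line is not vertical
    intercept = my - slope * mx
    return intercept, slope

# Points lying exactly on y = 1 + 2x recover intercept 1 and slope 2.
a, b = orthogonal_regression_l2([0, 1, 2, 3], [1, 3, 5, 7])
```

Unlike ordinary least squares, which minimizes vertical distances and so treats y as the response, this fit minimizes perpendicular distances and treats x and y symmetrically, which is the third motivation given in the abstract.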