A System of Subroutines for Iteratively Reweighted Least Squares Computations. ACM Transactions on Mathematical Software (TOMS).
S: An Interactive Environment for Data Analysis and Graphics.
Online Algorithm Based on Support Vectors for Orthogonal Regression. Pattern Recognition Letters.
The routine converts any standard regression algorithm (one that computes both the coefficients and the residuals) into a corresponding orthogonal regression algorithm. Thus a standard, robust, or L1 regression algorithm is converted into the corresponding standard, robust, or L1 orthogonal algorithm. Such orthogonal procedures are important for three basic reasons. First, they solve the classical errors-in-variables (EV) regression problem. Standard L2 orthogonal regression, obtained by converting ordinary least squares regression, is the maximum likelihood solution of the EV problem under Gaussian assumptions. However, this L2 solution is known to be unstable under even slight deviations from the model, so the routine's ability to create robust orthogonal regression algorithms from robust ordinary regression algorithms will also be very useful in practice. Second, orthogonal regression is intimately related to principal components procedures; the routine can therefore also be used to create corresponding L1, robust, etc., principal components algorithms. Third, orthogonal regression treats the x and y variables symmetrically, which is important in many science and engineering modeling problems. Monte Carlo studies testing the effectiveness of the routine on a variety of data types are given.
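To make the baseline concrete, the following sketch computes the classical L2 orthogonal (total least squares) fit that the abstract identifies as the maximum-likelihood EV solution under Gaussian assumptions. This is not the paper's conversion routine itself; it is a standard SVD-based solution of the simple-regression case, assuming NumPy and a non-vertical fitted line.

```python
import numpy as np

def orthogonal_regression(x, y):
    """L2 orthogonal (total least squares) fit of y = a + b*x.

    Minimizes the sum of squared perpendicular distances from the
    points to the line, rather than vertical distances as in OLS.
    Sketch only -- not the conversion routine described in the paper.
    """
    xm, ym = x.mean(), y.mean()
    X = np.column_stack([x - xm, y - ym])
    # The right singular vector for the smallest singular value is
    # normal to the best-fitting line through the centroid.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    nx, ny = Vt[-1]
    b = -nx / ny           # slope (assumes the line is not vertical)
    a = ym - b * xm        # intercept; the line passes through the centroid
    return a, b
```

For points lying exactly on y = 2x + 1, the fit recovers intercept 1 and slope 2; unlike OLS, swapping the roles of x and y yields the same fitted line, reflecting the symmetric treatment of the variables noted in the abstract.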