Two-parameter ridge regression and its convergence to the eventual pairwise model
Mathematical and Computer Modelling: An International Journal
With a simple transformation, the ordinary least squares objective yields a family of modified ridge regressions that outperform the regular ridge model. These models have more stable coefficients and a better quality of fit as the profile parameter grows. With an additional adjustment based on minimizing the residual variance, all the characteristics improve further: the coefficients of these regressions do not shrink to zero as the ridge parameter increases, the coefficient of multiple determination stays high, and both bias and generalized cross-validation error remain low. In contrast to regular ridge regression, the modified ridge models yield solutions that are robust across a range of ridge-parameter values, with interpretable coefficients and good quality-of-fit characteristics.
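The article's specific transformation is not reproduced in this abstract, but the behavior it contrasts against, coefficients of regular ridge regression shrinking toward zero as the ridge parameter grows, can be sketched directly from the closed-form ridge estimate. The data and parameter values below are hypothetical, for illustration only:

```python
import numpy as np

# Hypothetical regression data; the article's actual model is not shown here.
rng = np.random.default_rng(0)
n, p = 100, 5
X = rng.normal(size=(n, p))
beta_true = np.array([3.0, -2.0, 1.5, 0.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

def ridge(X, y, k):
    """Closed-form ridge estimate: (X'X + k*I)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

# As the ridge parameter k grows, the coefficient norm shrinks toward zero --
# the instability of fit that the modified models are said to avoid.
for k in (0.0, 1.0, 10.0, 100.0):
    print(k, np.linalg.norm(ridge(X, y, k)))
```

Running this shows a strictly decreasing coefficient norm as k increases, which is the standard shrinkage behavior of regular ridge regression.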