The gradient method with retards (GMR) is a nonmonotone iterative method recently developed to solve large, sparse, symmetric positive definite linear systems of equations. Its performance depends on the retard parameter m: the larger m is, the faster the convergence, but also the greater the loss of precision observed in the intermediate computations of the algorithm. This loss of precision is mainly caused by the nonmonotone behavior of the norm of the gradient, which also grows with m. In this work, we first use a recently developed inexpensive technique to smooth the nonmonotone behavior of the method. We then show that m can be chosen adaptively during the process to avoid loss of precision. Our adaptive choice of m can be viewed as a compromise between numerical stability and speed of convergence. Numerical results on some classical test problems are presented to illustrate these good numerical properties.
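To make the two ingredients concrete, the following is a minimal sketch of a GMR-type iteration combined with minimal-residual smoothing. It is an illustration under stated assumptions, not the authors' exact algorithm: the retard is realized here by randomly reusing one of the last m+1 Cauchy step lengths, and the function name `gmr_smoothed` and its parameters are hypothetical.

```python
import numpy as np

def gmr_smoothed(A, b, x0, m=3, tol=1e-10, maxit=1000):
    """Sketch: gradient method with retards (GMR) plus
    minimal-residual smoothing, for A symmetric positive definite.
    Hypothetical illustration; parameter names are assumptions."""
    x = x0.copy()
    g = A @ x - b                    # gradient of the quadratic = residual (up to sign)
    steps = []                       # history of Cauchy step lengths
    y, s = x.copy(), g.copy()        # smoothed iterate y and its residual s = A@y - b
    rng = np.random.default_rng(0)   # fixed seed so the retard choice is reproducible
    for k in range(maxit):
        if np.linalg.norm(g) == 0.0:
            break
        Ag = A @ g
        steps.append((g @ g) / (g @ Ag))   # classical Cauchy step length
        # retard: reuse a step length from up to m iterations back
        j = rng.integers(max(0, len(steps) - 1 - m), len(steps))
        x = x - steps[j] * g
        g = A @ x - b
        # minimal-residual smoothing: choose eta to minimize ||s + eta*(g - s)||
        d = g - s
        dd = d @ d
        eta = -(s @ d) / dd if dd > 0 else 0.0
        y = y + eta * (x - y)        # same affine combination keeps s = A@y - b
        s = s + eta * d
        if np.linalg.norm(s) <= tol * np.linalg.norm(b):
            break
    return y
```

Because the smoothed pair (y, s) is updated by the same affine combination, s remains the true residual of y, and the smoothing step guarantees a monotonically nonincreasing smoothed residual norm even when the underlying gradient norms oscillate.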