Nonlinear preconditioned conjugate gradient and least-squares finite elements
Computer Methods in Applied Mechanics and Engineering
Least squares methods are effective for solving systems of partial differential equations. In the case of nonlinear systems, the equations are usually linearized by a Newton iteration or a successive substitution method and then treated as a linear least squares problem. We show that it is often advantageous to form the sum of squared residuals first and then compute a zero of the gradient with a Newton-like method. We present an effective method, based on Sobolev gradients, for treating the nonlinear least squares problem directly. The method solves trust-region subproblems, defined by a Sobolev norm, with a preconditioned conjugate gradient iteration whose preconditioner arises naturally from the Sobolev space setting. The trust-region method is shown to be equivalent to a Levenberg-Marquardt method that blends a Newton or Gauss-Newton iteration with gradient descent, but uses a Sobolev gradient in place of the Euclidean gradient. We also provide an introduction to the Sobolev gradient method and discuss its relationship to operator preconditioning with equivalent operators.
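To make the central idea concrete, the sketch below applies a Levenberg-Marquardt step in a discrete H^1-like metric to a semilinear model problem, -u'' + u^3 = f on (0,1) with homogeneous Dirichlet conditions. This is an illustration of the general technique the abstract describes, not the paper's implementation: the model problem, the matrix M = I + A standing in for the Sobolev inner product, the damping schedule, and all parameter values are assumptions. The key line is the damped normal-equations solve, where M replaces the identity of classical Levenberg-Marquardt, so the gradient-descent limit of the blend is a Sobolev gradient step rather than a Euclidean one.

```python
import numpy as np

n = 49                                  # interior grid points (illustrative size)
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)

# Discrete Dirichlet Laplacian: (A u)_i approximates -u''(x_i)
main = np.full(n, 2.0 / h**2)
off = np.full(n - 1, -1.0 / h**2)
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

u_true = np.sin(np.pi * x)              # manufactured solution
f = A @ u_true + u_true**3              # right-hand side so that r(u_true) = 0

def residual(u):
    """Discrete residual of -u'' + u^3 - f."""
    return A @ u + u**3 - f

def jacobian(u):
    """Jacobian of the residual: A + 3 diag(u^2)."""
    return A + np.diag(3.0 * u**2)

# Matrix of the discrete H^1-like (Sobolev) inner product
M = np.eye(n) + A

u = np.zeros(n)
mu = 1.0                                # LM damping parameter (crude schedule below)
for _ in range(30):
    r = residual(u)
    J = jacobian(u)
    # Levenberg-Marquardt step in the Sobolev metric:
    #   (J^T J + mu * M) delta = -J^T r
    # For large mu this tends to a Sobolev gradient descent step
    # M^{-1} J^T r; for small mu it tends to a Gauss-Newton step.
    delta = np.linalg.solve(J.T @ J + mu * M, -J.T @ r)
    u = u + delta
    mu = max(0.5 * mu, 1e-12)           # simple damping decrease, no trust region

print("final residual norm:", np.linalg.norm(residual(u)))
```

In the paper's setting the step is instead controlled by a trust-region radius in the Sobolev norm and the subproblems are solved by preconditioned conjugate gradients; the dense solve above is only viable because the illustrative grid is tiny.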