Comparison of advanced large-scale minimization algorithms for the solution of inverse ill-posed problems

  • Authors:
  • A. K. Alekseev; I. M. Navon; J. L. Steward

  • Affiliations:
  • Department of Aerodynamics and Heat Transfer, RSC Energia, Korolev (Kaliningrad), Moscow Region, Russian Federation; Department of Scientific Computing, Florida State University, Tallahassee, FL, USA; Department of Scientific Computing, Florida State University, Tallahassee, FL, USA

  • Venue:
  • Optimization Methods & Software
  • Year:
  • 2009


Abstract

We compare the performance of several robust large-scale minimization algorithms for the unconstrained minimization of an ill-posed inverse problem. The parabolized Navier-Stokes equations served as the model for adjoint parameter estimation. The methods compared consist of three versions of the nonlinear conjugate-gradient (CG) method, the quasi-Newton Broyden-Fletcher-Goldfarb-Shanno (BFGS) method, the limited-memory quasi-Newton method (L-BFGS) [D.C. Liu and J. Nocedal, On the limited memory BFGS method for large scale minimization, Math. Program. 45 (1989), pp. 503-528], the truncated Newton (T-N) method [S.G. Nash, Preconditioning of truncated Newton methods, SIAM J. Sci. Stat. Comput. 6 (1985), pp. 599-616; S.G. Nash, Newton-type minimization via the Lanczos method, SIAM J. Numer. Anal. 21 (1984), pp. 770-788], and a new hybrid algorithm proposed by Morales and Nocedal [J.L. Morales and J. Nocedal, Enriched methods for large-scale unconstrained optimization, Comput. Optim. Appl. 21 (2002), pp. 143-154]. For all the methods employed and tested, the gradient of the cost function is obtained via an adjoint method. A detailed description of the algorithmic form of each minimization method used in the comparison is provided. For the inviscid case, the CG-descent method of Hager and Zhang [W.W. Hager and H. Zhang, A new conjugate gradient method with guaranteed descent and efficient line search, SIAM J. Optim. 16 (1) (2005), pp. 170-192] performed best, followed closely by the hybrid method [J.L. Morales and J. Nocedal, Enriched methods for large-scale unconstrained optimization, Comput. Optim. Appl. 21 (2002), pp. 143-154], while in the viscous case the hybrid method emerged as the best performer, followed by CG [D.F. Shanno and K.H. Phua, Remark on algorithm 500. Minimization of unconstrained multivariate functions, ACM Trans. Math. Softw. 6 (1980), pp. 618-622] and CG-descent [W.W. Hager and H. Zhang, A new conjugate gradient method with guaranteed descent and efficient line search, SIAM J. Optim. 16 (1) (2005), pp. 170-192]. Achieving this performance required an adequate choice of parameters in the CG-descent method, as well as controlling the number of L-BFGS and T-N iterations interlaced in the hybrid method.
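
The interlacing of L-BFGS and T-N iterations at the core of the hybrid method can be illustrated with a minimal sketch. The code below alternates short L-BFGS and Newton-CG (truncated Newton) phases via SciPy on the Rosenbrock test function; the test problem, phase lengths, and cycle count are illustrative assumptions, and the sketch deliberately omits a key ingredient of the Morales-Nocedal enriched method (reusing L-BFGS curvature pairs to precondition the inner CG iteration), so it should be read as a schematic of the interlacing idea rather than the authors' algorithm.

```python
# Sketch of interlacing L-BFGS and truncated-Newton (Newton-CG) phases,
# in the spirit of the Morales-Nocedal hybrid. The Rosenbrock objective,
# phase lengths, and cycle count are illustrative assumptions; the real
# enriched method also shares L-BFGS curvature information with the
# truncated-Newton inner CG loop as a preconditioner, which is omitted here.
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

def hybrid_minimize(fun, grad, x0, cycles=5, lbfgs_iters=10, tn_iters=5):
    """Alternate a short L-BFGS phase with a short truncated-Newton phase."""
    x = np.asarray(x0, dtype=float)
    for _ in range(cycles):
        # L-BFGS phase: inexpensive quasi-Newton iterations.
        res = minimize(fun, x, jac=grad, method='L-BFGS-B',
                       options={'maxiter': lbfgs_iters})
        x = res.x
        # Truncated-Newton phase: Newton-CG with a capped iteration budget.
        res = minimize(fun, x, jac=grad, method='Newton-CG',
                       options={'maxiter': tn_iters})
        x = res.x
    return x, fun(x)

x_star, f_star = hybrid_minimize(rosen, rosen_der, x0=np.full(100, -1.0))
print(f"f(x*) = {f_star:.3e}")
```

In the paper's setting, the `grad` callback would be supplied by the adjoint solver rather than an analytic derivative, and the number of iterations allotted to each phase is the tuning knob the abstract refers to when it mentions controlling the interlacing of L-BFGS and T-N iterations.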