The development of software for minimization problems is often based on a line search method. We consider line search methods that satisfy sufficient decrease and curvature conditions, and formulate the problem of determining a point that satisfies these two conditions as that of finding a point in a set T(μ). We describe a search algorithm for this problem that produces a sequence of iterates converging to a point in T(μ) and that, except in pathological cases, terminates in a finite number of steps. Numerical results for an implementation of the search algorithm on a set of test functions show that it terminates within a small number of iterations.
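To make the two conditions concrete, the sketch below (an illustrative bracketing/bisection scheme, not the paper's algorithm) searches for a step length satisfying the sufficient-decrease (Armijo) and curvature conditions for a one-dimensional function φ(α) = f(x + αd) along a descent direction. The function names, the bisection strategy, and the parameter values c1 = 1e-4, c2 = 0.9 are assumptions for illustration only.

```python
def satisfies_wolfe(phi, dphi, alpha, c1=1e-4, c2=0.9):
    """Check the sufficient-decrease and (strong) curvature conditions
    for phi(alpha) = f(x + alpha*d), where dphi is phi's derivative."""
    sufficient_decrease = phi(alpha) <= phi(0.0) + c1 * alpha * dphi(0.0)
    curvature = abs(dphi(alpha)) <= c2 * abs(dphi(0.0))
    return sufficient_decrease and curvature

def wolfe_line_search(phi, dphi, alpha0=1.0, c1=1e-4, c2=0.9, max_iter=50):
    """Toy line search: expand the trial step until it brackets an
    acceptable point, then bisect.  Assumes dphi(0) < 0 (descent)."""
    lo, hi = 0.0, alpha0
    # Expansion: grow hi until it is "too long" -- i.e. it violates
    # sufficient decrease or the slope has become non-negative.
    for _ in range(max_iter):
        if phi(hi) > phi(0.0) + c1 * hi * dphi(0.0) or dphi(hi) >= 0:
            break
        lo, hi = hi, 2.0 * hi
    # Bisection within the bracket [lo, hi].
    for _ in range(max_iter):
        alpha = 0.5 * (lo + hi)
        if phi(alpha) > phi(0.0) + c1 * alpha * dphi(0.0):
            hi = alpha          # step too long: shrink from above
        elif dphi(alpha) < c2 * dphi(0.0):
            lo = alpha          # slope still too negative: move up
        else:
            return alpha        # both conditions satisfied
    return 0.5 * (lo + hi)

# Example: phi(a) = (a - 1)^2 has a descent direction at a = 0.
alpha = wolfe_line_search(lambda a: (a - 1.0) ** 2,
                          lambda a: 2.0 * (a - 1.0))
```

For the quadratic above the search returns an acceptable step after a single bisection, which mirrors the paper's observation that, in practice, acceptable points are found within a small number of iterations.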