In this paper, we investigate how the Gauss-Newton Hessian matrix affects the basin of convergence in Newton-type methods. Although the Newton algorithm is theoretically superior to the Gauss-Newton algorithm and the Levenberg-Marquardt (LM) method in terms of asymptotic convergence rate, the LM method is often preferred for nonlinear least squares problems in practice. This paper presents a theoretical analysis of the advantage of the Gauss-Newton Hessian matrix. It is proved that the Gauss-Newton approximation is the only nonnegative convex quadratic approximation that retains a critical property of the original objective function: attaining its minimal value of zero on an $(n-1)$-dimensional manifold (or affine subspace). Because of this property, the Gauss-Newton approximation preserves the zero-on-$(n-1)$-D structure of the original problem, which explains why the Gauss-Newton Hessian matrix is preferred for nonlinear least squares problems, especially when the initial point is far from the solution.
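To make the Gauss-Newton step concrete, here is a minimal NumPy sketch (not the paper's code; the test problem and function names are illustrative). It minimizes $\tfrac{1}{2}\|r(x)\|^2$ by replacing the full Hessian with the Gauss-Newton matrix $J^\top J$, which amounts to solving the linear least-squares problem $\min_{\Delta x}\|J\,\Delta x + r\|$ at each iteration. The toy problem fits $y = e^{at}$ to noiseless data, starting far from the true parameter, where the abstract argues the Gauss-Newton Hessian helps most.

```python
import numpy as np

def gauss_newton(residual, jac, x0, iters=50, tol=1e-12):
    """Minimize 0.5*||r(x)||^2 using the Gauss-Newton Hessian J^T J."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = residual(x)
        J = jac(x)
        # Gauss-Newton step: solve (J^T J) dx = -J^T r, computed stably
        # as the linear least-squares problem min ||J dx + r||.
        dx, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# Toy problem: fit y = exp(a*t) to noiseless data generated with a = 0.5.
t = np.linspace(0.0, 2.0, 20)
y = np.exp(0.5 * t)

residual = lambda p: np.exp(p[0] * t) - y             # r(p); zero at p = [0.5]
jac = lambda p: (t * np.exp(p[0] * t))[:, None]       # dr/dp, shape (20, 1)

p_hat = gauss_newton(residual, jac, x0=[2.0])         # start far from 0.5
```

Because the data are noiseless, the residual vanishes at the solution, so near convergence the Gauss-Newton step coincides with the Newton step; the interesting behavior is the stable progress it makes from the distant starting point `x0=[2.0]`.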