We discuss the problem of fitting a curve or surface to given measurement data. In many situations, the usual least-squares approach (minimizing the sum of squared norms of the residual vectors) is not suitable, as it implicitly assumes a Gaussian distribution of the measurement errors. In those cases, it is more appropriate to minimize other functions (which we will call norm-like functions) of the residual vectors. This is well understood in the case of scalar residuals, where the technique of iteratively re-weighted least squares, which originated in statistics (Huber, Robust Statistics, 1981), is known to be a Gauss–Newton-type method for minimizing a sum of norm-like functions of the residuals. We extend this result to the case of vector-valued residuals. It is shown that simply treating the norms of the vector-valued residuals as scalar ones does not work. To illustrate the difference, we provide a geometric interpretation of the iterative minimization procedures as evolution processes.
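For the scalar-residual case mentioned above, the following is a minimal sketch of iteratively re-weighted least squares, using Huber's norm-like function as the example loss; the function names (`huber_weight`, `irls`) and the choice of a linear model are illustrative assumptions, not part of the original text.

```python
import numpy as np

def huber_weight(r, delta=1.0):
    # IRLS weight derived from Huber's norm-like function:
    # w(r) = 1 for |r| <= delta, and delta / |r| otherwise,
    # so large residuals are progressively down-weighted.
    a = np.abs(r)
    return np.where(a <= delta, 1.0, delta / np.maximum(a, 1e-12))

def irls(X, y, delta=1.0, iters=50, tol=1e-10):
    # Each iteration solves a weighted least-squares problem whose
    # weights come from the current residuals -- this is the
    # Gauss-Newton-type step referred to in the abstract.
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # plain LS start
    for _ in range(iters):
        r = y - X @ beta
        w = huber_weight(r, delta)
        sw = np.sqrt(w)                      # weighted LS via scaling
        beta_new = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
        if np.linalg.norm(beta_new - beta) < tol:
            return beta_new
        beta = beta_new
    return beta
```

With a single gross outlier in otherwise clean data, the re-weighting step drives that point's weight toward zero, so the fit stays close to the inlier trend, whereas an ordinary least-squares fit would be pulled toward the outlier.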