Blind Deconvolution: Errors, Errors Everywhere. Computing in Science and Engineering.
Blind Deconvolution: A Matter of Norm. Computing in Science and Engineering.
Smoothing splines estimators in functional linear regression with errors-in-variables. Computational Statistics & Data Analysis.
Overview of total least-squares methods. Signal Processing.
Sylvester Tikhonov-regularization methods in image restoration. Journal of Computational and Applied Mathematics.
On a quadratic eigenproblem occurring in regularized total least squares. Computational Statistics & Data Analysis.
Range Flow Estimation based on Photonic Mixing Device Data. International Journal of Intelligent Systems Technologies and Applications.
Robust constrained receding-horizon predictive control via bounded data uncertainties. Mathematics and Computers in Simulation.
Journal of Computational Physics.
Regularized Total Least Squares: Computational Aspects and Error Bounds. SIAM Journal on Matrix Analysis and Applications.
A box constrained gradient projection algorithm for compressed sensing. Signal Processing.
Journal of Signal Processing Systems.
Metabolic pathway inference from time series data: a non-iterative approach. PRIB'11 Proceedings of the 6th IAPR International Conference on Pattern Recognition in Bioinformatics.
Structured Total Maximum Likelihood: An Alternative to Structured Total Least Squares. SIAM Journal on Matrix Analysis and Applications.
Parametric Level Set Methods for Inverse Problems. SIAM Journal on Imaging Sciences.
Revisiting the brightness constraint: probabilistic formulation and algorithms. ECCV'06 Proceedings of the 9th European Conference on Computer Vision, Part III.
Efficient determination of the hyperparameter in regularized total least squares problems. Applied Numerical Mathematics.
Large-scale Tikhonov regularization of total least squares. Journal of Computational and Applied Mathematics.
Learning a context-aware dictionary for sparse representation. ACCV'12 Proceedings of the 11th Asian Conference on Computer Vision, Part II.
A note on sparse least-squares regression. Information Processing Letters.
Discretizations of inverse problems lead to systems of linear equations with a highly ill-conditioned coefficient matrix, and in order to compute stable solutions to these systems it is necessary to apply regularization methods. We show how Tikhonov's regularization method, which in its original formulation involves a least squares problem, can be recast in a total least squares formulation suited for problems in which both the coefficient matrix and the right-hand side are known only approximately. We analyze the regularizing properties of this method and demonstrate by a numerical example that, in certain cases with large perturbations, the new method is superior to standard regularization methods.
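The two ingredients the abstract combines can be sketched numerically as follows. This is a minimal illustration, not the paper's regularized-TLS algorithm: the Hilbert-matrix test problem, the noise levels, and the regularization parameter lam are hypothetical choices, and the TLS step shown is the classical unregularized SVD solution of the augmented system [A | b].

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ill-conditioned test problem: a small Hilbert matrix,
# with noise added to both the coefficient matrix and the right-hand side.
n = 8
A_true = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
x_true = np.ones(n)
b_true = A_true @ x_true

A = A_true + 1e-4 * rng.standard_normal(A_true.shape)  # perturbed matrix
b = b_true + 1e-4 * rng.standard_normal(n)             # perturbed right-hand side

# Ordinary least squares: unstable because A is highly ill-conditioned.
x_ls = np.linalg.lstsq(A, b, rcond=None)[0]

# Tikhonov regularization: minimize ||A x - b||^2 + lam^2 ||x||^2,
# solved here via the regularized normal equations.
lam = 1e-3
x_tik = np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)

# Classical total least squares: perturb both A and b, reading the solution
# off the right singular vector of [A | b] for its smallest singular value.
_, _, Vt = np.linalg.svd(np.column_stack([A, b]))
v = Vt[-1]
x_tls = -v[:n] / v[n]

print("error, Tikhonov:", np.linalg.norm(x_tik - x_true))
print("error, plain LS:", np.linalg.norm(x_ls - x_true))
```

The Tikhonov solution stays bounded where the unregularized one blows up; the method the abstract describes combines the two ideas, imposing Tikhonov-style regularization on the total least squares formulation so that errors in A are also accounted for.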