A standard, well-established method for solving a least-squares problem in the presence of structured uncertainty is to formulate and solve an equivalent semidefinite programming (SDP) problem. When the problem dimensions are large, however, solving the structured robust least squares (RLS) problem via SDP becomes computationally expensive. We propose a subgradient-based algorithm that exploits the min-max structure of the problem. The algorithm is justified by Danskin's min-max theorem and inherits the well-known convergence properties of the subgradient method. We analyze the complexity of the new scheme and verify its efficiency through simulations of a robust equalizer design.
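The flavor of the approach can be sketched on the simplest (unstructured, norm-bounded) uncertainty model, where the inner maximization has the well-known closed form f(x) = ||Ax - b||_2 + rho*||x||_2 (El Ghaoui and Lebret). By Danskin's theorem, a subgradient of this pointwise maximum, evaluated at a worst-case perturbation, drives a standard subgradient iteration. This is a minimal illustrative sketch, not the paper's algorithm for the structured case; the function name, step-size rule, and iteration count are assumptions chosen for demonstration:

```python
import numpy as np

def robust_ls_subgradient(A, b, rho, iters=3000):
    """Minimize f(x) = ||Ax - b||_2 + rho*||x||_2 by the subgradient method.

    Illustrative sketch of a Danskin-style subgradient scheme for the
    unstructured norm-bounded robust LS model (not the paper's structured case).
    """
    x = np.zeros(A.shape[1])
    f = lambda x: np.linalg.norm(A @ x - b) + rho * np.linalg.norm(x)
    best_x, best_f = x.copy(), f(x)
    for k in range(1, iters + 1):
        r = A @ x - b
        nr, nx = np.linalg.norm(r), np.linalg.norm(x)
        # Subgradient of the pointwise maximum: gradient of each smooth piece
        # where it is differentiable, zero at the kink points.
        g = (A.T @ r) / nr if nr > 0 else np.zeros_like(x)
        if nx > 0:
            g = g + rho * x / nx
        gn = np.linalg.norm(g)
        if gn == 0:          # exact stationary point reached
            break
        # Normalized direction with a diminishing, non-summable step size,
        # the classical convergence recipe for subgradient methods.
        x = x - (0.3 / np.sqrt(k)) * g / gn
        fx = f(x)
        if fx < best_f:      # f need not decrease monotonically; keep the best iterate
            best_f, best_x = fx, x.copy()
    return best_x, best_f
```

Because the objective is convex but non-differentiable at r = 0 and x = 0, the iteration tracks the best point seen rather than relying on monotone descent, mirroring the convergence guarantees the abstract invokes.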