We show that the celebrated least-mean-squares (LMS) adaptive algorithm is H∞ optimal. The LMS algorithm has long been regarded as an approximate solution to either a stochastic or a deterministic least-squares problem; it essentially amounts to updating the weight-vector estimates along the direction of the instantaneous gradient of a quadratic cost function. Here we show that LMS can be regarded as the exact solution to a minimization problem in its own right. Namely, we establish that it is a minimax filter: it minimizes the maximum energy gain from the disturbances to the predicted errors, whereas the closely related normalized LMS algorithm minimizes the maximum energy gain from the disturbances to the filtered errors. Moreover, since these algorithms are central H∞ filters, they minimize a certain exponential cost function and are thus also risk-sensitive optimal. We discuss the various implications of these results and show how they provide theoretical justification for the widely observed excellent robustness properties of the LMS filter.
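The two updates discussed above can be sketched in a few lines. The following is a minimal NumPy illustration, not code from the paper: the function names, step sizes, and the system-identification framing are illustrative assumptions. LMS steps along the instantaneous gradient; NLMS additionally scales the step by the regressor energy.

```python
import numpy as np

def lms(x, d, n_taps, mu=0.05):
    """Least-mean-squares: step along the instantaneous gradient.

    x: input signal, d: desired signal.
    Returns the final weight vector and the a-priori (predicted) errors.
    """
    w = np.zeros(n_taps)
    e = np.zeros(len(x))
    for i in range(n_taps - 1, len(x)):
        u = x[i - n_taps + 1:i + 1][::-1]  # regressor [x[i], ..., x[i-n_taps+1]]
        e[i] = d[i] - w @ u                # a-priori error (before the update)
        w = w + mu * e[i] * u              # gradient step on the quadratic cost
    return w, e

def nlms(x, d, n_taps, mu=0.5, eps=1e-8):
    """Normalized LMS: step size divided by the regressor energy."""
    w = np.zeros(n_taps)
    e = np.zeros(len(x))
    for i in range(n_taps - 1, len(x)):
        u = x[i - n_taps + 1:i + 1][::-1]
        e[i] = d[i] - w @ u
        w = w + (mu / (eps + u @ u)) * e[i] * u  # energy-normalized step
    return w, e
```

For example, feeding both filters the output of an unknown FIR channel plus small measurement noise drives the weights toward the channel taps; the abstract's minimax result says the energy gain from such disturbances to the errors stays bounded regardless of the noise model.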