This paper establishes a new paradigm for convexly constrained adaptive learning in reproducing kernel Hilbert spaces (RKHS). Although the technique is general, we present it in the context of the beamforming problem. A priori knowledge, such as beampattern specifications and constraints ensuring robustness against steering-vector errors, takes the form of multiple closed convex sets in a high- (possibly infinite-) dimensional RKHS. Each robustness constraint is shown to be equivalent to a min-max optimization task formed by means of the ε-insensitive loss function of robust statistics. This multiplicity of specifications admits a simple expression through the rich framework of fixed-point sets of certain mappings defined in a Hilbert space. Moreover, the cost function that the final solution has to optimize is expressed as an infinite sequence of convex, nondifferentiable loss functions, springing from the sequence of incoming training data. A novel adaptive beamforming design for this constrained nonlinear learning problem, of linear complexity with respect to the number of unknown parameters, is derived by employing a very recently developed version of the adaptive projected subgradient method (APSM). Under mild conditions, the method produces a sequence that converges strongly to a beamformer satisfying all of the imposed constraints while asymptotically minimizing the sequence of loss functions imposed by the training data. Numerical examples demonstrate that the proposed method attains increased resolution in cases where classical linear beamforming solutions collapse. Moreover, unlike unconstrained online kernel regression techniques, it yields solutions that are in agreement with the imposed a priori knowledge.
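The iterative machinery the abstract describes can be illustrated in a much simplified, finite-dimensional setting. The sketch below is an illustrative toy, not the paper's RKHS beamformer: per incoming sample it takes one APSM-style relaxed subgradient step on the ε-insensitive loss, then enforces a single closed convex constraint (here a norm ball standing in for the a priori knowledge) by metric projection. All names, parameters, and the choice of constraint set are assumptions made for the example.

```python
import numpy as np

def eps_insensitive_step(w, x, y, eps, mu):
    """One relaxed subgradient step on Theta(w) = max(|y - x.w| - eps, 0)."""
    r = y - x @ w
    loss = max(abs(r) - eps, 0.0)
    if loss == 0.0:
        return w                      # inside the eps-tube: Theta vanishes, no update
    g = -np.sign(r) * x               # a subgradient of Theta at w
    return w - mu * (loss / (g @ g)) * g   # relaxation mu in (0, 2)

def project_ball(w, radius):
    """Metric projection onto the closed convex set {w : ||w|| <= radius}."""
    nrm = np.linalg.norm(w)
    return w if nrm <= radius else (radius / nrm) * w

rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3, 0.8])   # hypothetical "true" parameter vector
w = np.zeros(3)
for _ in range(2000):                 # stream of training data, one sample at a time
    x = rng.standard_normal(3)
    y = x @ w_true + 0.01 * rng.standard_normal()
    w = project_ball(eps_insensitive_step(w, x, y, eps=0.05, mu=0.5), radius=2.0)
```

The iterate stays inside the constraint set at every step (the Fejér-monotone behavior the strong-convergence results rest on), while the dead zone of the ε-insensitive loss leaves small residuals untouched, which is what lends the scheme its robustness to noise.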