The combination of the famed kernel trick and the least-mean-square (LMS) algorithm provides an interesting sample-by-sample update for an adaptive filter in reproducing kernel Hilbert spaces (RKHS), which is named in this paper the KLMS. Contrary to the accepted view in kernel methods, this paper shows that in the finite-training-data case the KLMS algorithm is well posed in RKHS without the addition of an extra regularization term to penalize solution norms, as was suggested by Kivinen [Kivinen, Smola, and Williamson, "Online Learning With Kernels," IEEE Transactions on Signal Processing, vol. 52, no. 8, pp. 2165-2176, Aug. 2004] and Smale [Smale and Yao, "Online Learning Algorithms," Foundations of Computational Mathematics, vol. 6, no. 2, pp. 145-176, 2006]. This result is the main contribution of the paper and enhances the present understanding of the LMS algorithm from a machine learning perspective. The effect of the KLMS step size is also studied from the viewpoint of regularization. Two experiments are presented to support the conclusion that, with finite data, the KLMS algorithm can readily be used in high-dimensional spaces, and particularly in RKHS, to derive nonlinear, stable algorithms whose performance is comparable to that of batch, regularized solutions.
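The sample-by-sample KLMS update described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Gaussian kernel, its bandwidth `sigma`, and the step size are assumed choices, and the filter is stored as the growing list of input centers with error-scaled coefficients that the kernel trick implies.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=0.5):
    # Gaussian (RBF) kernel between two input vectors (assumed kernel choice)
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

def klms(X, d, step_size=0.5, sigma=0.5):
    """Kernel LMS: online update of a filter in RKHS.

    The weight vector in feature space is never formed explicitly;
    it is represented by the stored centers and their coefficients,
    each coefficient being step_size times the prediction error at
    the moment that center was added.
    """
    centers, coeffs = [], []
    predictions = np.zeros(len(d))
    for n, x in enumerate(X):
        # predict desired signal with the current kernel expansion
        y = sum(a * gaussian_kernel(c, x, sigma)
                for c, a in zip(centers, coeffs))
        e = d[n] - y                   # instantaneous prediction error
        centers.append(x)              # allocate a new kernel unit at x
        coeffs.append(step_size * e)   # coefficient = eta * error
        predictions[n] = y
    return centers, coeffs, predictions
```

A small usage sketch: fitting a nonlinear target such as `d = sin(3x)` online, the squared prediction error typically shrinks as more samples arrive, with no explicit norm penalty, consistent with the well-posedness claim for finite data. Note that the expansion grows by one center per sample; sparsification or quantization (as in the fixed-budget variants cited above) would be needed in practice to bound memory.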