The linear least-mean-squares (LMS) algorithm has recently been extended to a reproducing kernel Hilbert space, yielding an adaptive filter built from a weighted sum of kernel functions evaluated at the incoming data samples. Over time, the size of the filter, and with it the computation and memory requirements, grows without bound. In this paper, we propose an efficient method for constraining the growth of the radial basis function (RBF) network produced by the kernel LMS algorithm without significantly sacrificing performance. The method applies sequential Gaussian elimination steps to the Gram matrix to test whether the feature vector of each new input is linearly dependent on all previous feature vectors; only inputs whose feature vectors are (approximately) independent are added as new kernel centres. This yields an efficient way to continue learning while restricting the number of kernel functions used.
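The idea above can be sketched in code. The following is an illustrative NumPy implementation, not the authors' exact procedure: it uses an approximate-linear-dependence (ALD-style) test, which is mathematically equivalent to checking the residual left by Gaussian elimination on the bordered Gram matrix, to decide whether each new input becomes an RBF centre. The class name `SparseKernelLMS`, the step size `eta`, the dependence threshold `nu`, and the kernel width `sigma` are all hypothetical choices for the sketch.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    return float(np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2)))

class SparseKernelLMS:
    """Kernel LMS with a Gram-matrix linear-dependence growth constraint.

    A new input is appended to the RBF network only if its feature-space
    image is not approximately a linear combination of the stored centres,
    as judged from the (inverse) Gram matrix of the dictionary. This is an
    ALD-style sketch under assumed parameter choices, not the paper's
    verbatim algorithm.
    """

    def __init__(self, eta=0.3, nu=1e-3, sigma=0.5):
        self.eta, self.nu, self.sigma = eta, nu, sigma
        self.centres = []   # stored inputs (RBF centres)
        self.weights = []   # kernel expansion coefficients
        self.K_inv = None   # inverse Gram matrix of the dictionary

    def _kvec(self, x):
        """Kernel evaluations of x against all stored centres."""
        return np.array([gaussian_kernel(c, x, self.sigma) for c in self.centres])

    def predict(self, x):
        if not self.centres:
            return 0.0
        return float(np.dot(self.weights, self._kvec(x)))

    def update(self, x, d):
        """One LMS step on input x with desired output d; returns the error."""
        x = np.atleast_1d(np.asarray(x, dtype=float))
        err = d - self.predict(x)
        kxx = gaussian_kernel(x, x, self.sigma)
        if not self.centres:
            self.centres.append(x)
            self.weights = [self.eta * err]
            self.K_inv = np.array([[1.0 / kxx]])
            return err
        k = self._kvec(x)
        a = self.K_inv @ k           # best coefficients expressing phi(x) in the dictionary
        delta = kxx - k @ a          # squared residual of that projection
        if delta > self.nu:          # approximately independent: grow the network
            n = len(self.centres)
            K_inv_new = np.zeros((n + 1, n + 1))   # rank-one update of the inverse Gram
            K_inv_new[:n, :n] = self.K_inv + np.outer(a, a) / delta
            K_inv_new[:n, n] = -a / delta
            K_inv_new[n, :n] = -a / delta
            K_inv_new[n, n] = 1.0 / delta
            self.K_inv = K_inv_new
            self.centres.append(x)
            self.weights = list(self.weights) + [self.eta * err]
        else:                        # dependent: spread the update over existing weights
            self.weights = list(np.asarray(self.weights) + self.eta * err * a)
        return err
```

In a typical run, feeding a few hundred scalar samples of a smooth target such as `sin(3x)` leaves the dictionary far smaller than the sample count, since most later inputs fail the independence test and only adjust existing weights.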