Kernel least mean square algorithm with constrained growth

  • Authors:
  • Puskal P. Pokharel, Weifeng Liu, Jose C. Principe

  • Affiliation (all authors):
  • Computational NeuroEngineering Laboratory, ECE Department, University of Florida, Gainesville, FL 32611, USA

  • Venue:
  • Signal Processing
  • Year:
  • 2009

Abstract

The linear least mean squares (LMS) algorithm has recently been extended to a reproducing kernel Hilbert space, resulting in an adaptive filter built from a weighted sum of kernel functions evaluated at each incoming data sample. Over time, the size of the filter, and with it the computation and memory requirements, grows. In this paper, we propose a new, efficient methodology for constraining the growth of the radial basis function (RBF) network produced by the kernel LMS algorithm without significantly sacrificing performance. The method applies sequential Gaussian elimination steps to the Gram matrix to test whether the feature vector of each new input is linearly dependent on all the previous feature vectors. This yields an efficient way to continue learning while restricting the number of kernel functions used.
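
The abstract describes growing an RBF network via kernel LMS while admitting a new center only if its feature vector is (approximately) linearly independent of the existing ones. Below is a minimal Python sketch of that idea using an approximate-linear-dependence residual computed from the Gram matrix; the kernel width `sigma`, step size `eta`, and tolerance `delta_tol` are illustrative parameters, and the per-step Gram solve stands in for the paper's more efficient sequential Gaussian elimination, so this is not the authors' exact procedure.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Gaussian (RBF) kernel evaluation.
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

def klms_constrained(X, d, eta=0.5, sigma=1.0, delta_tol=1e-2):
    """Kernel LMS whose dictionary growth is limited by a
    linear-dependence test on the feature vectors (sketch)."""
    centers = [X[0]]          # retained input samples (RBF centers)
    alphas = [eta * d[0]]     # expansion coefficients
    errors = [d[0]]
    for n in range(1, len(X)):
        x, target = X[n], d[n]
        # Kernel evaluations between the new input and all centers.
        k = np.array([gaussian_kernel(x, c, sigma) for c in centers])
        y = float(np.dot(np.array(alphas), k))  # filter output
        e = target - y                          # prediction error
        errors.append(e)
        # Dependence test: residual of projecting phi(x) onto the
        # span of the dictionary's feature vectors. The paper updates
        # this sequentially via Gaussian elimination; here we solve
        # the full Gram system at each step for clarity.
        K = np.array([[gaussian_kernel(ci, cj, sigma) for cj in centers]
                      for ci in centers])
        a = np.linalg.solve(K + 1e-8 * np.eye(len(centers)), k)
        residual = gaussian_kernel(x, x, sigma) - float(np.dot(k, a))
        if residual > delta_tol:
            # Nearly independent: grow the RBF network by one unit.
            centers.append(x)
            alphas.append(eta * e)
        else:
            # Dependent: phi(x) is well approximated by the dictionary,
            # so fold the LMS update into the existing coefficients.
            alphas = list(np.array(alphas) + eta * e * a)
    return centers, alphas, errors
```

A larger `delta_tol` keeps the network smaller at the cost of a coarser approximation; with `delta_tol = 0` every sample is admitted and the sketch reduces to plain kernel LMS, whose filter length grows linearly with the data.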