This paper presents a kernelized version of the extended recursive least squares algorithm (EX-KRLS), which implements for the first time a general linear state model in a reproducing kernel Hilbert space (RKHS), or equivalently a general nonlinear state model in the input space. The centerpiece of this development is a reformulation of the well-known extended recursive least squares (EX-RLS) algorithm in the RKHS that requires only inner-product operations between input vectors, thus enabling the application of the kernel property (commonly known as the kernel trick). The first part of the paper presents a set of theorems that show the generality of the approach. EX-KRLS is preferable to 1) the standard kernel recursive least squares (KRLS) algorithm in applications that require tracking the state vector of a general linear state-space model in the kernel space, and 2) EX-RLS when the application requires nonlinear observation and state models. The second part of the paper evaluates EX-KRLS on nonlinear Rayleigh multipath channel tracking and on a Lorenz system modeling problem. We show that the proposed algorithm outperforms both the standard KRLS and EX-RLS in these simulations.
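To illustrate the idea that a recursive least squares update can be carried out in an RKHS using only kernel evaluations, the following is a minimal sketch of the *standard* KRLS recursion (in the spirit of Engel et al.'s algorithm, without sparsification), not the paper's EX-KRLS. The `rbf` kernel, the regularization parameter `lam`, and the class layout are illustrative assumptions; the block-matrix inversion step grows the inverse regularized kernel matrix one sample at a time, so no explicit feature-space vectors are ever formed.

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    # Gaussian kernel: an inner product in an implicit RKHS (kernel trick).
    return np.exp(-gamma * np.sum((x - y) ** 2))

class KRLS:
    """Minimal regularized KRLS sketch (no sparsification).

    Maintains Qinv = (K + lam*I)^{-1} and weights alpha so that the
    predictor is f(x) = sum_i alpha_i k(x_i, x)."""

    def __init__(self, kernel=rbf, lam=1e-3):
        self.kernel, self.lam = kernel, lam
        self.X, self.alpha, self.Qinv = [], None, None

    def predict(self, x):
        if not self.X:
            return 0.0
        k = np.array([self.kernel(xi, x) for xi in self.X])
        return float(k @ self.alpha)

    def update(self, x, y):
        if not self.X:
            k0 = self.kernel(x, x) + self.lam
            self.Qinv = np.array([[1.0 / k0]])
            self.alpha = np.array([y / k0])
            self.X.append(x)
            return
        k = np.array([self.kernel(xi, x) for xi in self.X])
        # Grow (K + lam*I)^{-1} by one row/column via block inversion.
        z = self.Qinv @ k
        r = self.kernel(x, x) + self.lam - k @ z   # Schur complement (> 0)
        err = y - k @ self.alpha                   # a-priori prediction error
        n = len(self.X)
        Qnew = np.empty((n + 1, n + 1))
        Qnew[:n, :n] = self.Qinv + np.outer(z, z) / r
        Qnew[:n, n] = Qnew[n, :n] = -z / r
        Qnew[n, n] = 1.0 / r
        self.Qinv = Qnew
        self.alpha = np.concatenate([self.alpha - z * err / r, [err / r]])
        self.X.append(x)
```

For example, feeding the model samples of a smooth function online drives the prediction error down; the recursion itself touches the data only through `kernel(xi, x)` calls, which is exactly the property the paper exploits to kernelize EX-RLS.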