Kernel-based algorithms have been a topic of considerable interest in the machine learning community over the last ten years. Their attractiveness resides in their elegant treatment of nonlinear problems, and they have been successfully applied to pattern recognition, regression and density estimation. A common characteristic of kernel-based methods is that they deal with kernel expansions whose number of terms equals the number of input data points, making them unsuitable for online applications. Recently, several solutions have been proposed to circumvent this computational burden in time series prediction problems. Nevertheless, most of them require excessively elaborate and costly operations. In this paper, we investigate a new model reduction criterion that makes computationally demanding sparsification procedures unnecessary. The growth of the model order is controlled by the coherence parameter, a fundamental quantity that characterizes the behavior of dictionaries in sparse approximation problems. We incorporate the coherence criterion into a new kernel-based affine projection algorithm for time series prediction, and derive the kernel-based normalized LMS algorithm as a particular case. Finally, experiments are conducted to compare our approach to existing methods.
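To make the coherence idea concrete, the following is a minimal sketch of a kernel normalized LMS filter with coherence-based dictionary sparsification: a candidate kernel function κ(·, x) is admitted into the dictionary only if its largest kernel value against the current dictionary atoms does not exceed a threshold μ0 (for a unit-norm kernel, this is exactly the coherence test). The function names, the Gaussian kernel choice, and all parameter values (μ0, η, σ) are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=0.5):
    # Unit-norm Gaussian kernel: kappa(x, x) = 1 for any x
    return np.exp(-np.sum((np.asarray(x) - np.asarray(y)) ** 2) / (2 * sigma ** 2))

def knlms_coherence(X, d, mu0=0.5, eta=0.5, eps=1e-3, sigma=0.5):
    """Kernel NLMS with coherence-based sparsification (illustrative sketch).

    X : sequence of input vectors (time-embedded samples)
    d : sequence of desired outputs (one-step-ahead targets)
    Returns the online predictions, the final dictionary, and the weights.
    """
    dictionary = [X[0]]
    alpha = np.zeros(1)  # expansion coefficients over the dictionary
    preds = []
    for x, y in zip(X, d):
        # Kernel values between the new input and the dictionary atoms
        h = np.array([gaussian_kernel(x, c, sigma) for c in dictionary])
        y_hat = h @ alpha
        preds.append(y_hat)
        err = y - y_hat
        # Coherence test: admit kappa(., x) only if the dictionary stays mu0-coherent
        if np.max(np.abs(h)) <= mu0:
            dictionary.append(x)
            h = np.append(h, gaussian_kernel(x, x, sigma))
            alpha = np.append(alpha, 0.0)
        # Normalized LMS update of the expansion coefficients
        alpha = alpha + (eta / (eps + h @ h)) * err * h
    return np.array(preds), dictionary, alpha
```

Because admission requires every pairwise kernel value in the dictionary to stay below μ0, the dictionary size is bounded for a compact input domain, so the per-sample cost stays fixed instead of growing with the number of observed samples.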