As an alternative adaptation criterion, the minimum error entropy (MEE) criterion has received increasing attention owing to its success in nonlinear and non-Gaussian signal processing. In this paper, we apply error entropy minimization to kernel adaptive filtering, a new and promising technique that implements conventional linear adaptive filters in a reproducing kernel Hilbert space (RKHS) and thereby obtains nonlinear adaptive filters in the original input space. We derive the kernel minimum error entropy (KMEE) algorithm, which is essentially a generalized stochastic information gradient (SIG) algorithm in RKHS; its computational complexity is similar to that of the kernel affine projection algorithm (KAPA). We also employ a quantization approach to constrain the growth of the network size, yielding the quantized KMEE (QKMEE) algorithm. Further, we analyze the mean square convergence of KMEE: the energy conservation relation is derived, and a sufficient condition that ensures mean square convergence is obtained. The performance of the new algorithm is demonstrated on nonlinear system identification and short-term chaotic time series prediction.
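The two ingredients described above — a stochastic information gradient step computed over a sliding window of recent errors, and quantization that merges a new input into the nearest existing dictionary center rather than always growing the network — can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact QKMEE formulation: the class name `QKMEESketch`, the Gaussian kernel widths `sigma` and `sigma_e`, the step size `eta`, the window length `L`, and the quantization threshold `eps` are all assumed, illustrative choices.

```python
import numpy as np

def gauss(a, b, sigma=1.0):
    """Gaussian kernel between input vectors a and b."""
    d = np.asarray(a, float) - np.asarray(b, float)
    return float(np.exp(-np.dot(d, d) / (2.0 * sigma ** 2)))

class QKMEESketch:
    """Illustrative quantized kernel filter trained with a stochastic
    information gradient (SIG) over a sliding window of L errors."""

    def __init__(self, eta=0.5, sigma=1.0, sigma_e=1.0, L=5, eps=0.5):
        self.eta, self.sigma, self.sigma_e = eta, sigma, sigma_e
        self.L, self.eps = L, eps
        self.centers, self.alphas = [], []   # quantized kernel dictionary
        self.window = []                     # last L (input, error) pairs

    def predict(self, u):
        return sum(a * gauss(c, u, self.sigma)
                   for c, a in zip(self.centers, self.alphas))

    def _add_or_merge(self, u, coef):
        # Quantization: fold the coefficient into the nearest existing
        # center when the input lies within eps of it; otherwise grow
        # the network by allocating a new center.
        u = np.asarray(u, float)
        if self.centers:
            dists = [np.linalg.norm(u - c) for c in self.centers]
            k = int(np.argmin(dists))
            if dists[k] <= self.eps:
                self.alphas[k] += coef
                return
        self.centers.append(u)
        self.alphas.append(coef)

    def update(self, u, d):
        e = d - self.predict(u)
        if not self.window:
            # No error history yet: fall back to a KLMS-style step.
            self._add_or_merge(u, self.eta * e)
        else:
            # SIG step: gradient ascent on a Parzen estimate of the
            # error information potential, pushing the current error
            # toward the windowed errors (entropy minimization).
            for u_j, e_j in self.window:
                g = (np.exp(-(e - e_j) ** 2 / (2.0 * self.sigma_e ** 2))
                     * (e - e_j) * self.eta / self.sigma_e ** 2)
                self._add_or_merge(u, g)
                self._add_or_merge(u_j, -g)
        self.window = (self.window + [(np.asarray(u, float), e)])[-self.L:]
        return e
```

Because each update touches only the current input and the L windowed samples, the per-step cost stays of the same order as an affine-projection-style update, while the `eps` threshold bounds how fast the dictionary (network size) can grow.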