Correntropy has recently been defined as a localized similarity measure between two random variables that exploits the higher-order moments of the data. This paper uses correntropy as a cost function for minimizing the error between the desired signal and the output of an adaptive filter, in order to train the filter weights. We show that this cost function combines the computational simplicity of the popular LMS algorithm with the robustness obtained by using higher-order moments for error minimization. We apply the technique to system identification and noise cancellation configurations. The results demonstrate the advantages of the proposed cost function over the LMS algorithm and the recently proposed Minimum Error Entropy (MEE) cost function.
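As a rough illustration of the idea, the stochastic-gradient update under a correntropy cost with a Gaussian kernel looks like the LMS update scaled by a kernel evaluated at the instantaneous error, so large (outlier) errors contribute almost nothing to the weight change. The sketch below is a minimal system-identification example under assumed settings; the plant `h`, step size `mu`, and kernel width `sigma` are hypothetical choices for illustration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def mcc_filter(x, d, n_taps=4, mu=0.1, sigma=1.0):
    """Adapt FIR weights by a stochastic gradient on the correntropy
    between the desired signal d and the filter output (sketch)."""
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]   # regressor [x[n], ..., x[n-3]]
        e = d[n] - w @ u                    # a-priori error
        # Gaussian kernel scales the update: impulsive errors yield a
        # near-zero factor, which is the source of the robustness.
        g = np.exp(-e**2 / (2.0 * sigma**2))
        w = w + mu * g * e * u
    return w

# System identification: unknown FIR plant observed with impulsive noise.
h = np.array([0.5, -0.3, 0.2, 0.1])         # hypothetical plant
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)]
impulses = (rng.random(len(x)) < 0.01) * 20.0 * rng.standard_normal(len(x))
w = mcc_filter(x, d + impulses, n_taps=4)
```

With the kernel factor removed (`g = 1`) the loop reduces to plain LMS, whose weight estimates are pulled off target by the occasional large impulses; the kernel-weighted version largely ignores them.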