In this paper, we propose a novel test of independence based on the concept of correntropy. We examine correntropy from a statistical perspective and discuss its properties in the context of independence testing. We then introduce the concept of parametric correntropy and design a test of independence based on it, and we discuss how the proposed test relaxes the assumption of Gaussianity. Finally, we address computational issues related to the proposed method and compare it with state-of-the-art techniques.
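As a rough illustration of the quantities named above, the sketch below estimates sample correntropy, V_sigma(X, Y) = E[k_sigma(X - Y)] with a Gaussian kernel, and a parametric variant of the form V_sigma(a, b) = E[k_sigma(aX + b - Y)]. The kernel normalization, the parametric form, and all variable names here are illustrative assumptions, not the paper's exact formulation or test statistic.

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Empirical correntropy: the sample mean of a Gaussian kernel
    evaluated at the differences x_i - y_i.
    (Unnormalized kernel exp(-u^2 / 2*sigma^2); illustrative choice.)"""
    u = np.asarray(x) - np.asarray(y)
    return np.mean(np.exp(-u**2 / (2.0 * sigma**2)))

def parametric_correntropy(x, y, a, b, sigma=1.0):
    """Assumed parametric form: correntropy of the affinely mapped
    pair (a*X + b, Y), scanned over (a, b) in an independence test."""
    return correntropy(a * np.asarray(x) + b, y, sigma=sigma)

rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
y_dep = 2.0 * x + 1.0               # deterministically dependent on x
y_ind = rng.standard_normal(5000)   # independent of x

# At a=2, b=1 the differences for the dependent pair are exactly zero,
# so the parametric correntropy attains its maximum value of 1.
print(parametric_correntropy(x, y_dep, a=2.0, b=1.0))  # -> 1.0
print(parametric_correntropy(x, y_ind, a=2.0, b=1.0))  # strictly below 1
```

Intuitively, sweeping (a, b) and finding no parameter pair that drives the statistic toward its maximum is evidence against a deterministic affine relation; the paper builds a formal independence test on this idea.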