We introduce two new functionals, the constrained covariance and the kernel mutual information, to measure the degree of independence of random variables. These quantities are both based on the covariance between functions of the random variables in reproducing kernel Hilbert spaces (RKHSs). We prove that when the RKHSs are universal, both functionals are zero if and only if the random variables are pairwise independent. We also show that, near independence, the kernel mutual information is an upper bound on the Parzen window estimate of the mutual information. Analogous results apply for two correlation-based dependence functionals introduced earlier: we show the kernel canonical correlation and the kernel generalised variance to be independence measures for universal kernels, and prove the latter to be an upper bound on the mutual information near independence. The performance of the kernel dependence functionals in measuring independence is verified in the context of independent component analysis.
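To illustrate the kind of statistic the abstract describes, the following is a minimal sketch of an empirical constrained covariance (COCO) between two samples. It uses Gaussian kernels and the spectral-norm formulation of the empirical estimator (largest singular value of the product of centred Gram matrices, scaled by 1/n); the kernel bandwidth, function names, and the toy data are illustrative assumptions, not the paper's exact experimental setup.

```python
import numpy as np

def gaussian_gram(x, sigma=1.0):
    # Gram matrix of a 1-D sample under a Gaussian (RBF) kernel.
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def coco(x, y, sigma=1.0):
    """Empirical constrained covariance between samples x and y.

    Sketch of the estimator: (1/n) * sqrt(||Kc @ Lc||_2), where Kc and Lc
    are the centred Gram matrices of x and y, and ||.||_2 is the spectral
    norm. The statistic is zero (in population) iff x and y are
    independent, for universal kernels.
    """
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n        # centring matrix
    Kc = H @ gaussian_gram(x, sigma) @ H
    Lc = H @ gaussian_gram(y, sigma) @ H
    s_max = np.linalg.svd(Kc @ Lc, compute_uv=False)[0]
    return np.sqrt(s_max) / n

rng = np.random.default_rng(0)
x = rng.standard_normal(200)
y_dep = x + 0.1 * rng.standard_normal(200)     # strongly dependent pair
y_ind = rng.standard_normal(200)               # independent pair
print(coco(x, y_dep), coco(x, y_ind))
```

On such data the dependent pair scores markedly higher than the independent pair, which is the behaviour the paper exploits when using these functionals as contrast functions for independent component analysis.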