Recent approaches to independent component analysis (ICA) have used kernel independence measures to obtain highly accurate solutions, particularly where classical methods experience difficulty (for instance, sources with near-zero kurtosis). FastKICA (fast HSIC-based kernel ICA) is a new optimization method for one such kernel independence measure, the Hilbert-Schmidt Independence Criterion (HSIC). The high computational efficiency of this approach is achieved by combining geometric optimization techniques, specifically an approximate Newton-like method on the orthogonal group, with accurate estimates of the gradient and Hessian based on an incomplete Cholesky decomposition. In contrast to other efficient kernel-based ICA algorithms, FastKICA is applicable to any twice-differentiable kernel function. Experimental results for problems with large numbers of sources and observations indicate that FastKICA provides more accurate solutions at a given cost than gradient descent on HSIC. Compared with other recently published ICA methods, FastKICA is competitive in terms of accuracy, relatively insensitive to local minima when initialized far from independence, and more robust to outliers. An analysis of the local convergence properties of FastKICA is also provided.
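To make the ingredients of the abstract concrete, the following is a minimal sketch (not the authors' implementation) of the biased empirical HSIC estimator between two one-dimensional signals, using Gaussian kernels and a pivoted (incomplete) Cholesky low-rank approximation of the Gram matrices; the function and parameter names (gram, lowrank_factor, hsic_lowrank, sigma, rank) are illustrative assumptions, and the optimization over the orthogonal group is not shown.

import numpy as np

def gram(x, sigma=1.0):
    """Gaussian Gram matrix K_ij = exp(-(x_i - x_j)^2 / (2 sigma^2))."""
    d = x[:, None] - x[None, :]
    return np.exp(-d**2 / (2.0 * sigma**2))

def lowrank_factor(K, rank, tol=1e-6):
    """Pivoted (incomplete) Cholesky: returns G with K approximately G @ G.T."""
    n = K.shape[0]
    G = np.zeros((n, rank))
    diag = np.diag(K).copy()
    for j in range(rank):
        i = int(np.argmax(diag))          # pivot on the largest residual diagonal
        if diag[i] < tol:                 # remaining error is negligible; stop early
            return G[:, :j]
        G[:, j] = (K[:, i] - G @ G[i, :]) / np.sqrt(diag[i])
        diag -= G[:, j]**2
    return G

def hsic_lowrank(x, y, sigma=1.0, rank=30):
    """Biased HSIC estimate (1/n^2) tr(K H L H) computed from low-rank factors."""
    n = len(x)
    A = lowrank_factor(gram(x, sigma), rank)   # K ~ A A^T
    B = lowrank_factor(gram(y, sigma), rank)   # L ~ B B^T
    A -= A.mean(axis=0)                        # centring, i.e. H A
    B -= B.mean(axis=0)                        # centring, i.e. H B
    return np.sum((A.T @ B)**2) / n**2         # ||A^T H B||_F^2 / n^2

# Independent signals should give a value near zero; dependent ones a larger value.
rng = np.random.default_rng(0)
s = rng.standard_normal(500)
print(hsic_lowrank(s, rng.standard_normal(500)))   # close to zero (independent)
print(hsic_lowrank(s, s**2))                       # noticeably larger (dependent)

An ICA contrast in this spirit sums such pairwise HSIC values over all pairs of estimated sources and is minimized over orthogonal demixing matrices after prewhitening; the low-rank factors are what keep each evaluation (and, in FastKICA, the gradient and Hessian estimates) inexpensive.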