Fast kernel-based independent component analysis

  • Authors:
  • Hao Shen; Stefanie Jegelka; Arthur Gretton

  • Affiliations:
  • Hao Shen: Institute for Data Processing, Technische Universität München, München, Germany; Department of Information Engineering, The Australian National University, Australia; and NICTA, A ...
  • Stefanie Jegelka: Department of Empirical Inference for Machine Learning and Perception, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
  • Arthur Gretton: Department of Empirical Inference for Machine Learning and Perception, Max Planck Institute for Biological Cybernetics, Tübingen, Germany

  • Venue:
  • IEEE Transactions on Signal Processing
  • Year:
  • 2009


Abstract

Recent approaches to independent component analysis (ICA) have used kernel independence measures to obtain highly accurate solutions, particularly where classical methods experience difficulty (for instance, sources with near-zero kurtosis). FastKICA (fast HSIC-based kernel ICA) is a new optimization method for one such kernel independence measure, the Hilbert-Schmidt Independence Criterion (HSIC). The high computational efficiency of this approach is achieved by combining geometric optimization techniques, specifically an approximate Newton-like method on the orthogonal group, with accurate estimates of the gradient and Hessian based on an incomplete Cholesky decomposition. In contrast to other efficient kernel-based ICA algorithms, FastKICA is applicable to any twice-differentiable kernel function. Experimental results for problems with large numbers of sources and observations indicate that FastKICA provides more accurate solutions at a given cost than gradient descent on HSIC. Compared with other recently published ICA methods, FastKICA is competitive in accuracy, relatively insensitive to local minima when initialized far from independence, and more robust to outliers. An analysis of the local convergence properties of FastKICA is provided.
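To make the independence measure concrete: the following is a minimal NumPy sketch of the biased empirical HSIC estimate between two one-dimensional signals, using Gaussian kernels. This illustrates only the criterion that FastKICA minimizes, not the paper's algorithm itself (the Newton-like updates on the orthogonal group and the incomplete Cholesky approximation are omitted); the function names and the fixed bandwidth `sigma` are choices made here for illustration.

```python
import numpy as np

def gaussian_gram(x, sigma=1.0):
    # Gaussian (RBF) Gram matrix from pairwise squared distances.
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic_biased(x, y, sigma=1.0):
    # Biased empirical HSIC: trace(K H L H) / m^2, where H centers
    # the Gram matrices K and L in feature space.
    m = len(x)
    K = gaussian_gram(x, sigma)
    L = gaussian_gram(y, sigma)
    H = np.eye(m) - np.ones((m, m)) / m
    return np.trace(K @ H @ L @ H) / m ** 2

rng = np.random.default_rng(0)
s = rng.standard_normal(500)
t = rng.standard_normal(500)           # independent of s
print(hsic_biased(s, t))               # near zero for independent signals
print(hsic_biased(s, s + 0.1 * t))     # noticeably larger for dependent signals
```

In a kernel ICA setting, an estimate like this would be summed over pairs of recovered sources and driven toward zero over the orthogonal group of demixing matrices; the paper's contribution is doing that optimization efficiently via approximate Newton steps and low-rank (incomplete Cholesky) Gram-matrix approximations, which this dense O(m²) sketch does not attempt.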