Independent component analysis, a new concept? Signal Processing, special issue on higher-order statistics.
Adaptive blind separation of independent sources: a deflation approach. Signal Processing.
The geometry of algorithms with orthogonality constraints. SIAM Journal on Matrix Analysis and Applications.
One-bit-matching conjecture for independent component analysis. Neural Computation.
A constrained EM algorithm for independent component analysis. Neural Computation.
One-bit-matching ICA theorem, convex-concave programming, and combinatorial optimization. ISNN'05: Proceedings of the Second International Conference on Advances in Neural Networks, Part I.
Independent component analysis for time-dependent processes using an AR source model. Neural Processing Letters.
Gaussian moments for noisy unifying model. Neurocomputing.
An EM algorithm for independent component analysis using an AR-GGD source model. AI'07: Proceedings of the 20th Australian Joint Conference on Advances in Artificial Intelligence.
Machine learning problems from optimization perspective. Journal of Global Optimization.
According to the proof by Liu, Chiu, and Xu (2004) of the so-called one-bit-matching conjecture (Xu, Cheung, and Amari, 1998a), all the sources can be separated as long as there is a one-to-one same-sign correspondence between the kurtosis signs of all source probability density functions (pdf's) and the kurtosis signs of all model pdf's, a claim that is widely believed and implicitly supported by many empirical studies. However, that proof holds only in a weak sense: the conjecture is true when the global optimum of an independent component analysis criterion is reached. It therefore cannot explain the success of many existing iterative algorithms, which usually converge to one of the local optima. This article presents a new mathematical proof in a strong sense: the conjecture also holds when any local optimum is reached. The proof is obtained by investigating convex-concave programming on a polyhedral set. Theorems are also provided on the partial separation of sources when the kurtosis signs only partially match, and on a duality between maximization and minimization in source separation. Corollaries follow from this duality: supergaussian sources are separated by maximization, and subgaussian sources by minimization. A further corollary confirms the symmetric orthogonalization implementation of the kurtosis-extreme approach for separating multiple sources in parallel, which works empirically but previously lacked a mathematical proof. Furthermore, a link is established to combinatorial optimization from both a distribution-approximation perspective and a Stiefel-manifold perspective, with algorithms that guarantee convergence as well as satisfaction of constraints.
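The kurtosis-extreme approach with symmetric orthogonalization that the abstract refers to can be illustrated with a minimal NumPy sketch. This is not the paper's own implementation, only a toy demonstration under standard assumptions: two sources with opposite kurtosis signs (a Laplacian source is hypothetical supergaussian data, a uniform source subgaussian), a random mixing matrix, whitening, and a kurtosis-based fixed-point update where all demixing rows are updated in parallel and then re-orthogonalized via W ← (W Wᵀ)^(−1/2) W.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# Two zero-mean, unit-variance sources with opposite kurtosis signs
# (toy data, not from the paper): Laplacian (supergaussian, positive
# kurtosis) and uniform (subgaussian, negative kurtosis).
s1 = rng.laplace(size=n) / np.sqrt(2.0)            # Laplace(b=1) has variance 2
s2 = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), n)   # uniform with variance 1
S = np.vstack([s1, s2])

A = rng.normal(size=(2, 2))   # random mixing matrix
X = A @ S                     # observed mixtures

# Whiten the mixtures so that demixing reduces to an orthogonal rotation.
X -= X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
X = E @ np.diag(d ** -0.5) @ E.T @ X

# Kurtosis-based fixed-point iteration with symmetric orthogonalization:
# every row of W is updated in parallel, then the whole matrix is
# re-orthogonalized, W <- (W W^T)^(-1/2) W, computed here via the SVD.
W = rng.normal(size=(2, 2))
for _ in range(100):
    Y = W @ X
    W_new = (Y ** 3) @ X.T / n - 3.0 * W   # kurtosis fixed-point update
    U, _, Vt = np.linalg.svd(W_new)
    W = U @ Vt                              # symmetric orthogonalization

Y = W @ X   # recovered sources, up to permutation and sign
```

With both a supergaussian and a subgaussian source present, the parallel update with symmetric orthogonalization recovers both components at once, which is the setting the confirming corollary addresses; sources are identified only up to permutation and sign, as usual in ICA.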