Independent component analysis, a new concept? (Signal Processing, special issue on higher-order statistics)
Adaptive blind separation of independent sources: a deflation approach (Signal Processing)
The geometry of algorithms with orthogonality constraints (SIAM Journal on Matrix Analysis and Applications)
One-bit-matching conjecture for independent component analysis (Neural Computation)
A constrained EM algorithm for independent component analysis (Neural Computation)
Machine learning problems from an optimization perspective (Journal of Global Optimization)
Local stability analysis of maximum nongaussianity estimation in independent component analysis (Proceedings of ISNN'06, Third International Conference on Advances in Neural Networks, Part I)
Recently, a mathematical proof was obtained in (Liu, Chiu, Xu, 2004) of the so-called one-bit-matching conjecture, which states that all sources can be separated as long as there is a one-to-one same-sign correspondence between the kurtosis signs of all source probability density functions (pdf's) and the kurtosis signs of all model pdf's (Xu, Cheung, Amari, 1998a); the conjecture is widely believed and implicitly supported by many empirical studies. However, that proof holds only in a weak sense: the conjecture is true when the global optimum of an ICA criterion is reached. It therefore cannot account for the successes of many existing iterative algorithms, which usually converge to a local optimum. In this paper, a new mathematical proof is obtained in a strong sense: the conjecture also holds when any local optimum is reached. The proof proceeds by investigating convex-concave programming on a polyhedral set. Theorems are also proved not only on partial separation of sources when there is only a partial matching between the kurtosis signs, but also on an interesting duality between maximization and minimization in source separation. Moreover, corollaries derived from the theorems state that seeking a one-to-one same-sign correspondence can be replaced by use of this duality, i.e., super-gaussian sources can be separated via maximization and sub-gaussian sources via minimization. A further corollary confirms the symmetric-orthogonalization implementation of the kurtosis-extreme approach for separating multiple sources in parallel, which works empirically but previously lacked a mathematical proof. Furthermore, a link is established to combinatorial optimization from a Stiefel-manifold perspective, with algorithms that guarantee convergence and satisfaction of the constraints.
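To make the kurtosis-extreme approach with symmetric orthogonalization concrete, the following is a minimal NumPy sketch in the familiar FastICA fixed-point form, which extremizes kurtosis over orthonormal demixing rows and re-projects onto the Stiefel manifold after every update. It is an illustration under stated assumptions, not the paper's exact algorithm: the mixing matrix, the choice of two uniform (sub-gaussian, negative-kurtosis) sources, and the iteration count are all hypothetical. Since both sources here are sub-gaussian, the duality in the abstract says minimization would separate them; the fixed-point update used below extremizes the absolute kurtosis, so it covers either sign.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# Two sub-gaussian sources: uniform pdfs have negative kurtosis.
S = rng.uniform(-1.0, 1.0, size=(2, n))
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])            # hypothetical mixing matrix
X = A @ S                             # observed mixtures

# Whiten the mixtures: Z = E diag(d^{-1/2}) E^T (X - mean).
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
Z = (E * d ** -0.5) @ E.T @ X

def sym_orth(W):
    """W (W^T W)^{-1/2}: symmetric orthogonalization onto the Stiefel manifold."""
    U, _, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ Vt

# Kurtosis-extreme fixed-point updates on all rows in parallel,
# followed by symmetric orthogonalization at every step.
W = sym_orth(rng.standard_normal((2, 2)))
for _ in range(100):
    Y = W @ Z
    # Fixed-point step for the kurtosis contrast: E{z (w^T z)^3} - 3 w.
    W = sym_orth((Y ** 3) @ Z.T / n - 3.0 * W)

Y = W @ Z
# Permutation- and sign-invariant separation quality:
# |correlation| between each estimate and each true source.
C = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
```

Each row of `C` should contain one entry close to 1 (the matched source) and one close to 0, up to the usual permutation and sign indeterminacies of ICA.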