The independent component analysis (ICA) problem is often posed as the maximization or minimization of an objective (cost) function under a unitary constraint, which presumes prewhitening of the observed mixtures. The parallel adaptive algorithms corresponding to this optimization setting, where all separators are trained jointly, are typically implemented as a gradient-based update of the separation matrix followed by the so-called symmetrical orthogonalization procedure to impose the unitary constraint. This article addresses the convergence analysis of such algorithms, which has been considered a difficult task owing to the complication introduced by the minimum-distance (Frobenius norm or induced 2-norm) mapping step. We first provide a general characterization of the stationary points of these algorithms. We then show that fixed-point algorithms employing symmetrical orthogonalization are monotonically convergent for convex objective functions, and we generalize this convergence result to nonconvex objective functions. In the last part of the article, we concentrate on the kurtosis objective function as a special case: we provide a new set of critical points based on Householder reflections, together with an analysis classifying these critical points as minima, maxima, or saddle points.
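The update scheme described above can be sketched in NumPy. The sketch below is illustrative, not the paper's exact algorithm: it uses the classical kurtosis-based fixed-point update for prewhitened mixtures, and implements symmetrical orthogonalization as the minimum-Frobenius-distance projection onto the unitary (here, real orthogonal) matrices, computed via the SVD as W ← (W Wᵀ)^(−1/2) W = U Vᵀ. The function names and the choice of contrast are assumptions for the example.

```python
import numpy as np

def symmetric_orthogonalize(W):
    """Project W onto the nearest orthogonal matrix in Frobenius norm.

    For W = U S V^T (SVD), the minimum-distance mapping
    (W W^T)^(-1/2) W simplifies to U V^T.
    """
    U, _, Vt = np.linalg.svd(W)
    return U @ Vt

def kurtosis_fixed_point_step(W, X):
    """One parallel fixed-point step for the kurtosis contrast.

    X holds prewhitened mixtures (n_sources x n_samples). All rows of
    the separation matrix W are updated jointly, then the unitary
    constraint is restored by symmetrical orthogonalization.
    """
    Y = W @ X
    # Classical kurtosis update: E[y^3 x^T] - 3 W (valid for whitened data).
    W_new = (Y ** 3) @ X.T / X.shape[1] - 3.0 * W
    return symmetric_orthogonalize(W_new)
```

A typical usage pattern is to prewhiten the observed mixtures, initialize W with any orthogonal matrix, and iterate `kurtosis_fixed_point_step` until the update stalls; the orthogonalization step guarantees every iterate satisfies the unitary constraint exactly.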