In this article, we consider high-dimensional data that contains a low-dimensional non-Gaussian structure contaminated with Gaussian noise. Motivated by joint diagonalization algorithms, we propose a linear dimension reduction procedure called joint low-dimensional approximation (JLA) to identify the non-Gaussian subspace. The method uses matrices whose non-zero eigenspaces coincide with the non-Gaussian subspace. We also prove its global consistency: the true mapping to the non-Gaussian subspace is achieved by maximizing the contrast function defined by such matrices. As examples, we present two implementations of JLA, one based on fourth-order cumulant tensors and the other on the Hessian of the characteristic function. A numerical study demonstrates the validity of our method; in particular, the second algorithm works more robustly and efficiently in most cases.
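The core idea, a matrix whose non-zero eigenspace coincides with the non-Gaussian subspace, can be illustrated with a minimal sketch. This is not the authors' JLA procedure (which jointly uses several such matrices and maximizes a contrast function); it uses a single fourth-order cumulant (kurtosis) matrix, whose eigenvalues for whitened data with independent coordinates equal the per-coordinate excess kurtoses, so the Gaussian noise directions have eigenvalue zero. All names and parameter values below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 50000, 10, 2  # samples, ambient dimension, non-Gaussian dimension

# Synthetic data: k uniform (sub-Gaussian, unit-variance) sources plus
# (d - k) Gaussian noise coordinates, mixed by a random orthogonal matrix Q.
s = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=(n, k))
g = rng.standard_normal((n, d - k))
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))
x = np.hstack([s, g]) @ Q

# Symmetric whitening (the population covariance is already the identity
# here, so this only removes finite-sample fluctuations).
x = x - x.mean(axis=0)
w_vals, w_vecs = np.linalg.eigh(x.T @ x / n)
y = x @ (w_vecs @ np.diag(w_vals ** -0.5) @ w_vecs.T)

# Kurtosis matrix C = E[||y||^2 y y^T] - (d + 2) I.  For whitened
# independent coordinates, its eigenvalue in direction j is the excess
# kurtosis of coordinate j: about -1.2 for the uniform sources, 0 for the
# Gaussian noise.  The non-Gaussian subspace is thus the span of the
# eigenvectors whose eigenvalues are bounded away from zero.
C = (y * (y ** 2).sum(axis=1, keepdims=True)).T @ y / n - (d + 2) * np.eye(d)
c_vals, c_vecs = np.linalg.eigh(C)
order = np.argsort(-np.abs(c_vals))
U = c_vecs[:, order[:k]]            # estimated basis of the subspace

# Compare with the ground truth via projection matrices: since each row of
# the data is z @ Q, the true non-Gaussian directions are the first k
# columns of Q^T.
B = Q.T[:, :k]
err = np.linalg.norm(U @ U.T - B @ B.T)
print(f"projection error: {err:.3f}")
```

The projection error shrinks as the sample size grows; with the settings above it is well below the value 2 that two orthogonal 2-dimensional subspaces would give.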