Identifying relevant signals within high-dimensional observations is an important preprocessing step for efficient data analysis. However, many classical dimension reduction techniques such as principal component analysis do not take the often rich statistics of real-world data into account, and thereby fail if, for example, the signal space is of low power but meaningful in terms of some other statistics. With "colored subspace analysis," we propose a method for linear dimension reduction that evaluates the time structure of the multivariate observations. We differentiate the signal subspace from noise by searching for a subspace of nontrivially autocorrelated data. We prove that the resulting signal subspace is uniquely determined by the data, given that all white components have been removed. Algorithmically, we propose three efficient methods to perform this search: one based on joint diagonalization, one using a component-clustering scheme, and one via joint low-rank approximation. In contrast to temporal mixture approaches from blind signal processing, we do not need a generative model; that is, we do not require the existence of sources, so the model is applicable to any wide-sense stationary time series without restrictions. Moreover, since the method is based on second-order time structure, it can be efficiently implemented and applied even in large dimensions. Numerical examples together with an application to dimension reduction of functional MRI recordings demonstrate the usefulness of the proposed method. The implementation is publicly available as a Matlab package at http://cmb.helmholtzmuenchen.de/CSA.
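To illustrate the core idea of separating an autocorrelated subspace from white noise using only second-order time structure, here is a minimal NumPy sketch. It is not the paper's algorithm: instead of full approximate joint diagonalization, it whitens the data and then ranks directions by the eigenvalues of the sum of squared symmetrized lagged autocovariance matrices, a common simplification of the same principle. All function names and the choice of lags are illustrative assumptions.

```python
import numpy as np

def lagged_autocov(Z, tau):
    """Symmetrized lag-tau autocovariance of a (T, d) time series."""
    T = Z.shape[0]
    Zc = Z - Z.mean(axis=0)
    C = Zc[:T - tau].T @ Zc[tau:] / (T - tau)
    return 0.5 * (C + C.T)

def colored_subspace(X, lags=(1, 2, 3), dim=None):
    """Estimate a colored (autocorrelated) subspace of X (shape (T, d)).

    Simplified stand-in for joint diagonalization: whiten, then take
    the top eigenvectors of the sum of squared lagged autocovariances,
    so directions with strong time structure come first.
    """
    # Whitening removes instantaneous second-order structure, so any
    # remaining lagged covariance reflects genuine time structure.
    C0 = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(C0)
    W = evecs / np.sqrt(evals)              # whitening matrix (d, d)
    Z = (X - X.mean(axis=0)) @ W

    # White components contribute only O(1/sqrt(T)) to each lag matrix,
    # while colored components contribute O(1) entries.
    M = sum(lagged_autocov(Z, tau) @ lagged_autocov(Z, tau)
            for tau in lags)
    vals, vecs = np.linalg.eigh(M)
    order = np.argsort(vals)[::-1]          # strongest time structure first
    if dim is None:
        dim = X.shape[1]
    # Columns of W @ vecs map original data into the colored subspace.
    return (W @ vecs)[:, order[:dim]], vals[order]
```

On synthetic data with one AR(1) component mixed into white noise, the leading eigenvalue of `M` dominates the rest, which is how a signal-subspace dimension could be read off in practice.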