ICA with Sparse Connections: Revisited
ICA '09 Proceedings of the 8th International Conference on Independent Component Analysis and Signal Separation
ICANN '09 Proceedings of the 19th International Conference on Artificial Neural Networks: Part II
When applying independent component analysis (ICA), it is sometimes desirable for the connections between the observed mixtures and the recovered independent components (or the original sources) to be sparse, to make the interpretation easier or to reduce the model complexity. In this paper we propose natural gradient algorithms for ICA with a sparse separation matrix, as well as for ICA with a sparse mixing matrix. The sparsity of the matrix is achieved by applying certain penalty functions to its entries. The properties of the penalty functions are investigated. Experimental results on both artificial data and causal discovery in financial stocks show the usefulness of the proposed methods.
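To make the idea concrete, below is a minimal Python sketch of natural-gradient ICA with a sparsity penalty on the separation matrix. It is not the paper's exact algorithm: it assumes a tanh score function (suitable for super-Gaussian sources) and a smoothed-L1 penalty sqrt(w^2 + eps) as the penalty function on the entries of W; the function name sparse_ica and all parameter values are illustrative.

```python
import numpy as np

def sparse_ica(X, lr=0.01, lam=0.1, eps=1e-4, n_iter=2000, seed=0):
    """Sketch of natural-gradient ICA with a sparse separation matrix W.

    Assumptions (not from the paper): y = W x with score function
    phi(y) = tanh(y), and a smoothed-L1 penalty sqrt(w^2 + eps) whose
    derivative w / sqrt(w^2 + eps) shrinks entries of W toward zero.
    """
    rng = np.random.default_rng(seed)
    n, T = X.shape
    W = np.eye(n) + 0.1 * rng.standard_normal((n, n))
    for _ in range(n_iter):
        Y = W @ X
        phi = np.tanh(Y)
        # Natural gradient of the log-likelihood term: (I - E[phi(y) y^T]) W
        grad_ll = (np.eye(n) - (phi @ Y.T) / T) @ W
        # Natural gradient of the penalty term: p'(W) W^T W
        grad_pen = (W / np.sqrt(W**2 + eps)) @ W.T @ W
        W += lr * (grad_ll - lam * grad_pen)
    return W

# Toy usage: two super-Gaussian (Laplacian) sources, sparse mixing matrix.
rng = np.random.default_rng(1)
S = rng.laplace(size=(2, 5000))
A = np.array([[1.0, 0.0], [0.5, 1.0]])
W = sparse_ica(A @ S)
print(np.round(W @ A, 2))  # roughly a scaled permutation, with small entries shrunk
```

The key step is that, on the group of invertible matrices, the natural gradient of any term f(W) is its ordinary gradient post-multiplied by W^T W; applying this to both the likelihood and the penalty keeps the two updates on the same geometric footing.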