Adaptive Differential Decorrelation: A Natural Gradient Algorithm
ICANN '02 Proceedings of the International Conference on Artificial Neural Networks
Decorrelation and its higher-order generalization, independent component analysis (ICA), are fundamental tasks in unsupervised learning that have been studied mainly in the framework of Hebbian learning. In this paper we present a variation of natural gradient ICA, called differential ICA, in which learning relies on the concurrent change of the output variables rather than on their instantaneous values. We interpret this differential learning as maximum likelihood estimation of parameters in a latent variable model whose hidden variables follow a random walk. Within this framework we derive the differential ICA algorithm and also present a differential decorrelation algorithm, which emerges as a special instance of differential ICA. The algorithm derivation and a local stability analysis are given, together with numerical experimental results.
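The idea described above — a natural-gradient ICA update driven by the first difference (concurrent change) of the outputs, with decorrelation recovered by dropping the nonlinearity — can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's exact algorithm: the function names, the batch-style update, the tanh score function, and the step size are all illustrative choices.

```python
import numpy as np

def differential_ica(x, n_iter=500, eta=0.01, phi=np.tanh):
    """Sketch of a differential ICA update.

    x : (n_sources, n_samples) mixed observations.
    The learning signal is the concurrent change (first difference)
    of the output variables, dy = W dx, plugged into a natural
    gradient update of the form W <- W + eta (I - phi(dy) dy^T) W.
    This batch formulation is an illustrative assumption.
    """
    n, _ = x.shape
    W = np.eye(n)
    dx = np.diff(x, axis=1)              # concurrent change of the observations
    for _ in range(n_iter):
        dy = W @ dx                      # concurrent change of the outputs
        # natural-gradient direction, averaged over the batch
        G = np.eye(n) - (phi(dy) @ dy.T) / dy.shape[1]
        W = W + eta * G @ W
    return W

def differential_decorrelation(x, **kw):
    # Special instance: with the identity score function the same
    # update only decorrelates the output differences.
    return differential_ica(x, phi=lambda u: u, **kw)

# Toy nonstationary sources (time-varying amplitudes), linearly mixed.
t = np.linspace(0.0, 1.0, 2000)
s = np.vstack([np.sin(2 * np.pi * 7 * t) * t,
               np.sign(np.sin(2 * np.pi * 3 * t)) * (1.0 - t)])
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])               # hypothetical mixing matrix
W = differential_ica(A @ s)
```

Because the update uses only differences of the outputs, any component that is constant over time contributes nothing to the learning signal, which is what makes the scheme natural for nonstationary sources.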