This paper presents a class of algorithms for principal component analysis (PCA) obtained by modifying a class of algorithms for principal subspace analysis (PSA) known as Plumbley's General Stochastic Approximation. The modification is based on the Time-Oriented Hierarchical Method, which uses two distinct time scales. On the faster time scale, the PSA algorithm governs the "behaviour" of all output neurons. On the slower time scale, the output neurons compete to fulfil their "own interests": the basis vectors spanning the principal subspace are slowly rotated toward the principal eigenvectors.
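The abstract does not reproduce Plumbley's General Stochastic Approximation update itself, so the sketch below only illustrates the two-time-scale idea with stand-in rules: Oja's symmetric subspace rule plays the role of the fast PSA dynamics (all output neurons behave alike), while a small GHA-style lower-triangular correction, applied with a much smaller gain, plays the role of the slow hierarchical competition that rotates the subspace basis toward the individual eigenvectors. All learning rates, data dimensions, and the annealing schedule are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic zero-mean data whose covariance is diagonal with variances
# 5, 2, 0.3, 0.1 -- so the true principal eigenvectors are the axes e0, e1.
n, d, k = 20000, 4, 2
variances = np.array([5.0, 2.0, 0.3, 0.1])
X = rng.normal(size=(n, d)) * np.sqrt(variances)

W = 0.1 * rng.normal(size=(d, k))   # columns: estimated basis vectors
eta_fast, eta_slow = 0.01, 0.003    # two distinct time scales (assumed values)

for t, x in enumerate(X):
    decay = 1.0 / (1.0 + t / 5000)  # anneal both rates to damp late noise
    y = W.T @ x                     # output-neuron responses
    C = np.outer(y, y)

    # Fast time scale: symmetric subspace rule (stand-in for the PSA step);
    # it drives W toward an orthonormal basis of the principal subspace
    # but treats all neurons identically.
    W += eta_fast * decay * (np.outer(x, y) - W @ C)

    # Slow time scale: strictly-lower-triangular correction (the difference
    # between the hierarchical GHA rule and the symmetric rule); it breaks
    # the symmetry between neurons and rotates the basis, within the
    # subspace, toward the individual principal eigenvectors.
    W += eta_slow * decay * (W @ np.tril(C, -1))

# After training, column 0 should align with e0 (variance 5) and
# column 1 with e1 (variance 2), up to sign.
```

Because the slow correction vanishes exactly when the neuron outputs are uncorrelated, its fixed points within the subspace are the eigenvector bases, which is the sense in which the slower scale lets each neuron pursue its "own interest".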