A class of learning algorithms for principal component analysis and minor component analysis
IEEE Transactions on Neural Networks
We review a recently proposed family of learning functions for finding the principal and minor components of a data set. We extend the family so that it finds the Principal Subspace of the data set, using a method similar to the Bigradient algorithm. We then amend the method in a way previously shown to convert a Principal Component Analysis (PCA) rule into a rule for performing Factor Analysis (FA), and demonstrate its power on a standard problem. In both cases we find that, whereas the rules in the single-Principal-Component family all have similar convergence and stability properties, the multiple-output networks for PCA and FA differ in theirs.
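As a generic illustration of the kind of single-unit Hebbian PCA learning rule this family generalises, the sketch below implements Oja's rule, which drives a weight vector toward the first principal direction of zero-mean data. This is an assumed, textbook-style example for orientation only; the paper's own family of functions, its Bigradient-style subspace extension, and the FA amendment are defined in the paper itself, and the function name `oja_pca` and all parameters here are illustrative.

```python
import numpy as np

def oja_pca(X, eta=0.01, epochs=50, seed=0):
    """Estimate the first principal direction of zero-mean data X (n x d)
    with Oja's single-unit Hebbian rule: w <- w + eta * y * (x - y * w)."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in X:
            y = w @ x                    # linear unit output
            w += eta * y * (x - y * w)   # Hebbian term with implicit normalisation
    return w / np.linalg.norm(w)

# Synthetic data with a known dominant direction.
rng = np.random.default_rng(1)
d = np.array([3.0, 1.0]) / np.sqrt(10.0)          # true principal direction
X = rng.standard_normal((500, 2)) * 0.1            # isotropic noise
X += np.outer(rng.standard_normal(500) * 2.0, d)   # dominant variance along d
X -= X.mean(axis=0)                                # zero-mean, as the rule assumes

w = oja_pca(X)
print(abs(w @ d))  # near 1.0: w aligns with the principal direction (up to sign)
```

The `y * w` decay term keeps `||w||` bounded near 1 without an explicit normalisation step; convergence and stability of rules of this single-component type are the properties the abstract contrasts with those of the multiple-output PCA and FA networks.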