Principal component neural networks: theory and applications
SVD Algorithms: APEX-like versus Subspace Methods, Neural Processing Letters
Projection approximation subspace tracking, IEEE Transactions on Signal Processing
Learning in linear neural networks: a survey, IEEE Transactions on Neural Networks
IDEAL'07: Proceedings of the 8th International Conference on Intelligent Data Engineering and Automated Learning
We present a numerical and structural comparison of three neural PCA techniques: the Generalized Hebbian Algorithm (GHA) by Sanger, the APEX algorithm by Kung and Diamantaras, and the ψ-APEX class first proposed by the present author. Computer simulations illustrate the performance of the algorithms in terms of convergence speed and minimal attainable error; we then evaluate and discuss the computational cost of each algorithm. A close examination of the results shows that the members of the new class improve on the numerical performance of the existing algorithms considered, and are also easier to implement.
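To fix ideas, the GHA mentioned in the abstract is Sanger's well-known Hebbian learning rule for extracting principal components one sample at a time. The following NumPy sketch is illustrative only, under assumed variable names and synthetic data; it is not code from the paper:

```python
import numpy as np

def gha_update(W, x, lr):
    """One step of Sanger's Generalized Hebbian Algorithm (GHA).

    W  : (m, n) weight matrix, one row per extracted component
    x  : (n,) input sample
    lr : learning rate
    Update rule: dW = lr * (y x^T - LT[y y^T] W), where y = W x and
    LT[.] keeps the lower-triangular part (this Gram-Schmidt-like term
    decorrelates each neuron from the ones above it).
    """
    y = W @ x                           # outputs of the m linear neurons
    LT = np.tril(np.outer(y, y))        # lower-triangular part of y y^T
    return W + lr * (np.outer(y, x) - LT @ W)

rng = np.random.default_rng(0)
# Anisotropic 2-D data: variance 9 along axis 0, variance 1 along axis 1,
# so the dominant principal direction is the first coordinate axis.
X = rng.standard_normal((5000, 2)) * np.array([3.0, 1.0])

W = 0.1 * rng.standard_normal((2, 2))   # small random initial weights
for x in X:
    W = gha_update(W, x, lr=0.001)
```

After training, the first row of `W` should approach a unit vector along the dominant eigendirection (here, ±[1, 0]); the later rows converge more slowly to the remaining components.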