In this paper, the dynamical behavior of the basic node used to construct Hebbian artificial neural networks (NNs) is analyzed. Hebbian NNs are employed in communications and signal processing applications, among others. They have traditionally been studied via a continuous-time formulation whose validity is justified by analytical procedures that presume, among other hypotheses, a specific asymptotic behavior of the learning gain. The main contribution of this paper is the study of a deterministic discrete-time (DDT) formulation that characterizes the average evolution of the node, preserving the discrete-time form of the original network and capturing a more realistic behavior of the learning gain. The new DDT model yields instability results (critical in the case of signals with large, similar variances) that differ drastically from those known for the continuous-time formulation. Simulation examples support the presented results, illustrating the practical limitations of the basic Hebbian model.
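To make the DDT idea concrete, the following is a minimal sketch (not the paper's exact model) of the averaged evolution of a plain Hebbian node with a constant learning gain: the stochastic update w(k+1) = w(k) + eta*y(k)*x(k), with y = w^T x, averages to w(k+1) = (I + eta*C) w(k), where C = E[x x^T] is the input correlation matrix. The matrix C, the gain eta, and the initial weight below are illustrative assumptions; the sketch merely shows the kind of divergence that a constant gain produces.

```python
import numpy as np

# Illustrative input correlation matrix C = E[x x^T] for a 2-D input
# with similar variances along both axes (an assumed example, not from
# the paper).
C = np.array([[1.0, 0.3],
              [0.3, 1.1]])

eta = 0.05                    # constant learning gain (assumed value)
w = np.array([1.0, 0.5])      # assumed initial weight vector

norms = []
for k in range(200):
    # DDT update: average of the plain Hebbian rule, w <- w + eta*C*w
    w = w + eta * C @ w
    norms.append(np.linalg.norm(w))

# C is positive definite, so every eigenvalue of (I + eta*C) exceeds 1
# and the weight norm grows geometrically instead of converging.
print(norms[0], norms[-1])
```

Because the plain Hebbian rule has no normalization, the dominant eigendirection of C is amplified at every step; this is the kind of instability, absent from the usual continuous-time analysis with a vanishing gain, that the DDT formulation exposes.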