The Hebbian paradigm is perhaps the best-known unsupervised learning theory in connectionism. It has inspired wide research activity in the artificial neural network field because it embodies attractive properties such as locality and applicability to the basic weight-and-sum structure of neuron models. The plain Hebbian principle, however, also exhibits inherent theoretical limitations that make it impractical in most cases. Modifications of the basic Hebbian learning paradigm have therefore been proposed over the past 20 years in order to design useful signal and data processing algorithms. Such modifications led to the principal component analysis (PCA) class of learning rules, along with their nonlinear extensions. The aim of this review is twofold: to present part of the existing, fragmented material in the field of principal component learning within a unified view, and to motivate and present extensions of previous work on Hebbian learning to complex-weighted linear neural networks. This work benefits from previous studies on linear signal decomposition by artificial neural networks, nonquadratic component optimization and reconstruction-error definition, adaptation of neural parameters by constrained optimization of learning criteria with complex-valued arguments, and the expression of orthonormality either through the insertion of topological elements in the networks or through modification of the network learning criterion. In particular, the learning principles considered here, and their analysis, concern complex-valued principal/minor component/subspace linear/nonlinear rules for complex-weighted neural structures, both feedforward and laterally connected.
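The divergence of the plain Hebbian rule and its repair by a PCA-type modification can be illustrated with a minimal sketch (not taken from the reviewed works; data, step size, and sample count are illustrative assumptions). The classical normalized variant known as Oja's rule adds a weight-decay term that bounds the weight norm and steers the weight vector toward the principal eigenvector of the input correlation matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated 2-D data whose principal direction is known analytically.
C = np.array([[3.0, 1.0], [1.0, 2.0]])        # target correlation matrix
L = np.linalg.cholesky(C)
X = rng.standard_normal((20000, 2)) @ L.T     # samples with E[x x^T] ~ C

w = rng.standard_normal(2)
w /= np.linalg.norm(w)
eta = 0.005                                   # illustrative learning rate

for x in X:
    y = w @ x                                 # neuron output y = w^T x
    # Plain Hebbian update would be  w += eta * y * x  (norm diverges).
    # Oja's rule adds the decay -eta * y^2 * w, which keeps ||w|| near 1
    # and drives w toward the dominant eigenvector of C.
    w += eta * y * (x - y * w)

# Compare the learned direction with the dominant eigenvector of C.
eigvals, eigvecs = np.linalg.eigh(C)
v = eigvecs[:, -1]
alignment = abs(w @ v) / np.linalg.norm(w)
print(round(float(alignment), 3))
```

The single decay term is the whole modification: it converts an unbounded correlation detector into a principal component extractor, which is the prototype for the linear and nonlinear (and, in the review, complex-weighted) PCA-type rules discussed above.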