We proposed a new self-organizing net based on the principle of Least Mean Square Error Reconstruction (LMSER) of an input pattern. From this principle, a local learning rule, called the LMSER rule, is obtained naturally for training nets consisting of either one or several layers. We proved that for one layer with n₁ linear units, the LMSER rule makes the weights converge to rotations of the data's first n₁ principal components. These converged points are stable and correspond to the global minimum of the Mean Square Error (MSE) landscape, which has many saddle points but no local minima. These results indirectly provide a picture of LMSER's global convergence; the picture also applies to the Oja rule, since we proved that the evolution direction of the Oja rule has a positive projection on that of the LMSER rule. We also revealed the interesting fact that slight modifications of the LMSER rule (and likewise the Oja rule) can perform true Principal Component Analysis (PCA), without the externally designed asymmetrical circuits required by previous studies.
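The convergence behavior described above can be illustrated with a minimal NumPy sketch of the closely related Oja subspace rule, ΔW = η(yxᵀ − yyᵀW) with y = Wx. Consistent with the abstract's result, the rows of W converge to an orthonormal basis spanning a rotation of the top principal subspace, not to the individual principal components themselves. All names, the synthetic data, and the step size here are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 5-D data whose covariance has a dominant 2-D subspace:
# rotate axis-aligned Gaussians (stddevs 3, 2, 0.3, 0.2, 0.1) by a
# random orthogonal matrix, so the true top-2 subspace is known.
d, k, n = 5, 2, 20000
basis = np.linalg.qr(rng.normal(size=(d, d)))[0]
scales = np.array([3.0, 2.0, 0.3, 0.2, 0.1])
X = (rng.normal(size=(n, d)) * scales) @ basis.T

# Online Oja subspace rule: each row of W is one linear unit's weights.
W = 0.1 * rng.normal(size=(k, d))
eta = 0.005
for x in X:
    y = W @ x                                   # unit outputs
    W += eta * (np.outer(y, x) - np.outer(y, y) @ W)

# W's rows become (approximately) orthonormal and span the same
# subspace as the top-2 eigenvectors -- a rotation of the PCs.
P_learned = W.T @ np.linalg.pinv(W @ W.T) @ W   # projector onto span(W)
P_true = basis[:, :2] @ basis[:, :2].T          # projector onto true subspace
print(np.linalg.norm(P_learned - P_true))       # small if converged
```

Note that the projectors match even though individual rows of W need not equal individual eigenvectors; recovering the components themselves requires the asymmetric modifications the abstract alludes to (or a deflation scheme).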