This paper proposes a local PCA-SOM algorithm. The new competition measure is computationally efficient and implicitly incorporates both the Mahalanobis distance and the reconstruction error. Unlike previous models, no matrix inversion or PCA decomposition is needed for each data input. Moreover, the local data distribution is stored entirely in the covariance matrix rather than in a predefined number of principal components, so no prior information about the optimal principal subspace is required. Experiments on both synthetic data and a pattern-learning task demonstrate the performance of the proposed method.
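The general idea of a competition over local Gaussian units can be sketched as follows. This is an illustrative sketch only, not the authors' algorithm: each unit stores a mean and a full covariance matrix, the winner is chosen by Mahalanobis distance (computed here with a linear solve, whereas the paper's competition measure avoids explicit inversion per input), and the class name, learning rate, and online update rules are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

class LocalPCASOM:
    """Sketch of a SOM with local Gaussian units (illustrative, not the paper's method)."""

    def __init__(self, n_units, dim, lr=0.05):
        self.means = rng.normal(size=(n_units, dim))
        self.covs = np.stack([np.eye(dim) for _ in range(n_units)])
        self.lr = lr

    def winner(self, x):
        # Mahalanobis distance of x to each unit's local Gaussian,
        # via a linear solve rather than an explicit matrix inverse.
        d = np.empty(len(self.means))
        for k, (m, C) in enumerate(zip(self.means, self.covs)):
            diff = x - m
            d[k] = diff @ np.linalg.solve(C, diff)
        return int(np.argmin(d))

    def update(self, x):
        k = self.winner(x)
        # Move the winning mean toward the input.
        self.means[k] += self.lr * (x - self.means[k])
        # Online covariance update: blend in the outer product of the
        # re-centred residual, so the unit tracks the local distribution.
        diff = x - self.means[k]
        self.covs[k] += self.lr * (np.outer(diff, diff) - self.covs[k])
        return k

# Toy usage: two well-separated Gaussian clusters.
som = LocalPCASOM(n_units=2, dim=2)
data = np.concatenate([rng.normal(-3.0, 0.5, size=(100, 2)),
                       rng.normal(3.0, 0.5, size=(100, 2))])
rng.shuffle(data)
for x in data:
    som.update(x)
```

Because the covariance update mixes in symmetric outer products, each unit's covariance stays symmetric, and the full matrix captures the local principal subspace without fixing its dimensionality in advance.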