It is well known that constrained Hebbian self-organization on multiple linear neural units converges to the k-dimensional subspace spanned by the first k principal components. Not only has the batch PCA algorithm been widely applied in various fields since the 1930s, but a variety of adaptive algorithms have also been proposed over the past two decades. However, most studies assume a known subspace dimension k or determine it heuristically, even though a number of model selection criteria exist in the statistics literature. Recently, criteria have also been derived under the framework of Bayesian Ying-Yang (BYY) harmony learning. This paper further investigates the BYY criteria in comparison with typical existing criteria: Akaike's information criterion (AIC), the consistent Akaike's information criterion (CAIC), the Bayesian inference criterion (BIC), and the cross-validation (CV) criterion. The comparison is made via experiments not only on simulated data sets with different sample sizes, noise variances, data space dimensions, and subspace dimensions, but also on two real data sets, one from an air pollution problem and one of sports track records. The experiments show that BIC outperforms AIC, CAIC, and CV, while the BYY criteria are comparable with or better than BIC. BYY harmony learning is therefore the preferred tool for subspace dimension determination, the more so because the appropriate subspace dimension k can be determined automatically while BYY harmony learning fits the principal subspace, whereas selection by BIC, AIC, CAIC, or CV requires a two-stage procedure: first learning a set of candidate subspaces with different dimensions, then choosing among them by the criterion.
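To make the two-stage, criterion-based route concrete, the sketch below scores candidate subspace dimensions k with BIC using the closed-form probabilistic-PCA maximum likelihood. This is a minimal illustration, not the paper's implementation: the synthetic data generator, the parameter count, and the function names are assumptions for the example, and AIC or CAIC would be obtained by swapping the penalty term.

```python
import numpy as np

def ppca_log_likelihood(eigvals, k, n):
    """Maximized log-likelihood of probabilistic PCA with subspace dim k
    (Tipping-Bishop closed form), given the sample-covariance eigenvalues
    sorted in descending order."""
    d = len(eigvals)
    sigma2 = eigvals[k:].mean()  # ML noise variance: mean of discarded eigenvalues
    return -0.5 * n * (d * np.log(2 * np.pi)
                       + np.sum(np.log(eigvals[:k]))
                       + (d - k) * np.log(sigma2)
                       + d)

def select_k_bic(X):
    """Stage 1: fit candidate subspaces for every k; stage 2: pick the k
    minimizing BIC = -2*log-likelihood + m*log(n)."""
    n, d = X.shape
    eigvals = np.sort(np.linalg.eigvalsh(np.cov(X, rowvar=False)))[::-1]
    best_k, best_bic = None, np.inf
    for k in range(1, d):
        # free parameters: loading matrix (d*k - k(k-1)/2), noise variance, mean
        m = d * k - k * (k - 1) // 2 + 1 + d
        bic = -2.0 * ppca_log_likelihood(eigvals, k, n) + m * np.log(n)
        if bic < best_bic:
            best_k, best_bic = k, bic
    return best_k

# Synthetic check: a 3-dimensional subspace embedded in 10-dimensional noise.
rng = np.random.default_rng(0)
n, d, k_true = 500, 10, 3
W = rng.normal(size=(d, k_true))
X = rng.normal(size=(n, k_true)) @ W.T + 0.1 * rng.normal(size=(n, d))
print(select_k_bic(X))
```

On this high signal-to-noise example BIC recovers the planted dimension; the abstract's point is that with smaller samples or larger noise variances the criteria diverge, and BYY harmony learning avoids enumerating the candidate subspaces altogether.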