Principal component neural networks: theory and applications
Subspace analysis is one of the most popular multivariate data analysis methods and has been widely used in pattern recognition. Typically the data space is very high-dimensional, but only a few principal components need to be extracted. In this paper, we present a fast sequential algorithm, which behaves like expectation maximization (EM), for subspace analysis or tracking. In addition, we present a slight modification of the subspace algorithm that employs a rectifier, which is quite useful in handling nonnegative data (for example, images) and leads to rectified subspace analysis. The useful behavior of our proposed algorithms is confirmed through numerical experiments with toy data and dynamic PET images.
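As a point of reference for the EM-like behavior described above, the batch EM algorithm for PCA (Roweis, NIPS '97, cited here) alternates a projection (E) step and a basis-update (M) step. The sketch below is a minimal NumPy implementation of that batch EM-PCA, not the authors' sequential or rectified variants; all function and variable names are our own.

```python
import numpy as np

def em_pca(X, k, n_iter=100, seed=0):
    """Batch EM algorithm for PCA.

    X : (d, n) data matrix (d-dimensional, n samples).
    k : number of principal components to extract.
    Returns an orthonormal (d, k) basis spanning the principal subspace.
    """
    d, n = X.shape
    X = X - X.mean(axis=1, keepdims=True)      # center the data
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((d, k))            # random initial basis
    for _ in range(n_iter):
        # E-step: latent coordinates of the data in the current subspace
        Y = np.linalg.solve(W.T @ W, W.T @ X)  # (k, n)
        # M-step: basis that best reconstructs X from those coordinates
        W = (X @ Y.T) @ np.linalg.inv(Y @ Y.T)  # (d, k)
    # orthonormalize the converged basis
    Q, _ = np.linalg.qr(W)
    return Q
```

Each iteration costs only O(dnk), so no d-by-d covariance matrix is ever formed, which is the appeal of EM-style subspace methods when d is large and k is small.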