Introduction to statistical pattern recognition (2nd ed.)
The SVD and reduced rank signal processing
Signal Processing - Theme issue on singular value decomposition
Functional principal components analysis by choice of norm
Journal of Multivariate Analysis
Optimization by Vector Space Methods
Estimation of Dependences Based on Empirical Data (Springer Series in Statistics)
Relative Karhunen-Loeve transform
IEEE Transactions on Signal Processing
An optimal filter of the second order
IEEE Transactions on Signal Processing
Best approximation of the identity mapping: The case of variable finite memory
Journal of Approximation Theory
Optimal multilinear estimation of a random vector under constraints of causality and limited memory
Computational Statistics & Data Analysis
Towards theory of generic Principal Component Analysis
Journal of Multivariate Analysis
We propose a new approach that generalizes and improves principal component analysis (PCA) and its recent advances. The approach rests on the following underlying ideas. PCA can be reformulated as a technique that provides the best fixed-rank linear estimator of a random vector. In the proposed method, the vector estimate is represented in a special quadratic form designed to reduce the estimation error relative to customary linear estimates. The vector is first pre-estimated by a special iterative procedure in which each loop solves an unconstrained nonlinear best-approximation problem. The final vector estimate is then obtained by solving the constrained best-approximation problem with the quadratic approximant. We show that the combination of these techniques yields a new nonlinear estimator whose performance is significantly better than that of PCA and its known modifications.
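The reformulation of PCA as the best fixed-rank linear estimator can be illustrated numerically. The sketch below (an illustration of that standard fact, not of the paper's quadratic estimator; the data model and variable names are assumptions) projects samples of a correlated random vector onto the top-r eigenvectors of the sample covariance and checks that this rank-r projection attains a smaller mean squared reconstruction error than a competing rank-r projection onto a random subspace.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: n samples of a zero-mean correlated random vector in R^d
n, d, r = 2000, 8, 3
A = rng.normal(size=(d, d))
X = rng.normal(size=(n, d)) @ A.T
X -= X.mean(axis=0)

# PCA as a fixed-rank linear estimator: project onto the top-r
# eigenvectors of the sample covariance (eigh returns ascending order).
cov = X.T @ X / n
eigvals, eigvecs = np.linalg.eigh(cov)
U = eigvecs[:, ::-1][:, :r]      # top-r principal directions
F_pca = U @ U.T                  # rank-r linear map x -> F x

# A competing rank-r linear estimator built from a random subspace
Q, _ = np.linalg.qr(rng.normal(size=(d, r)))
F_rand = Q @ Q.T

def mse(F):
    # Mean squared reconstruction error E ||x - F x||^2 over the sample
    return np.mean(np.sum((X - X @ F.T) ** 2, axis=1))

assert mse(F_pca) <= mse(F_rand)  # PCA attains the smaller error
```

Among all linear maps of rank at most r, the PCA projection minimizes this error (an Eckart–Young-type result), which is the baseline that the paper's quadratic, iteratively pre-estimated approximant is designed to improve upon.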