Many important kernel methods in machine learning, such as kernel principal component analysis, feature approximation, denoising, compression, and prediction, require the computation of the dominant set of eigenvectors of the symmetric kernel Gram matrix. Recently, an efficient incremental approach was presented for the fast calculation of the dominant kernel eigenbasis. In this paper we propose faster algorithms for incrementally updating and downsizing the dominant kernel eigenbasis. These methods are well suited to large-scale problems, since they are efficient in terms of both complexity and data management.
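To illustrate the kind of incremental eigenbasis tracking discussed above, the following is a minimal sketch of a generic Brand-style incremental SVD update in NumPy: a rank-k basis (U, s) is folded together with each newly arriving column by diagonalizing a small (k+1)-by-(k+1) core matrix and truncating back to rank k. The function name `update_basis` and all parameter choices are illustrative assumptions; this is not the paper's actual updating or downsizing algorithm, only a standard analogue of the underlying idea.

```python
import numpy as np

def update_basis(U, s, a, k):
    """Fold a new column `a` into the rank-k dominant basis (U, s).

    A generic incremental SVD step (a sketch, not the paper's method):
    project, form the small core matrix, re-diagonalize, truncate.
    """
    p = U.T @ a                     # coefficients of a in the current basis
    r = a - U @ p                   # residual orthogonal to the basis
    rho = np.linalg.norm(r)
    if rho > 1e-12:                 # residual direction enlarges the basis
        Q = np.column_stack([U, r / rho])
        B = np.zeros((len(s) + 1, len(s) + 1))
        B[:len(s), :len(s)] = np.diag(s)
        B[:len(s), -1] = p
        B[-1, -1] = rho
    else:                           # a already lies in the tracked subspace
        Q = U
        B = np.column_stack([np.diag(s), p])
    # diagonalize the small core matrix, then rotate and truncate to rank k
    Ub, sb, _ = np.linalg.svd(B, full_matrices=False)
    return Q @ Ub[:, :k], sb[:k]
```

The cost per update is dominated by the thin products with Q and the SVD of the small core matrix, i.e. O(nk) plus O(k^3) per new column, which is what makes such schemes attractive for large-scale streaming settings.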