Feature Extraction Using Linear and Non-linear Subspace Techniques
ICANN '09 Proceedings of the 19th International Conference on Artificial Neural Networks: Part II
In this work we use kernel subspace techniques to perform feature extraction. The projections of the data onto the coordinates of the high-dimensional feature space induced by the kernel function are called features. The basis vectors onto which the data are projected depend on the eigendecomposition of the kernel matrix, which may become very large when the training set is large. However, only the largest eigenvalues and their corresponding eigenvectors are needed to extract the relevant features. We therefore present low-rank approximations to the kernel matrix based on the Nyström method. Numerical simulations demonstrate the Nyström extension applied to feature extraction and classification, and the performance of the presented methods is evaluated on the USPS data set.
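The approach described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes an RBF kernel and uniform random landmark sampling, and the function names (`rbf_kernel`, `nystrom_features`) and parameter choices (`gamma`, number of landmarks) are hypothetical. A subset of m landmark points yields a small m×m kernel matrix W and an n×m cross-kernel C; the eigenvectors of W are extended to all n points via the Nyström formula, giving approximate kernel-PCA features without eigendecomposing the full n×n kernel matrix.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Gaussian (RBF) kernel between the rows of X and the rows of Y
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def nystrom_features(X, n_landmarks=50, n_components=10, gamma=0.5, seed=0):
    """Approximate the leading kernel subspace features via the Nystrom method."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    idx = rng.choice(n, size=n_landmarks, replace=False)
    L = X[idx]                       # randomly sampled landmark points
    W = rbf_kernel(L, L, gamma)      # m x m kernel among landmarks
    C = rbf_kernel(X, L, gamma)      # n x m cross-kernel to all points
    evals, evecs = np.linalg.eigh(W)
    # keep only the largest n_components eigenpairs of W
    order = np.argsort(evals)[::-1][:n_components]
    evals, evecs = evals[order], evecs[:, order]
    # Nystrom extension: project every point onto the approximate eigenvectors
    return C @ evecs / np.sqrt(evals)   # n x n_components feature matrix

X = np.random.default_rng(1).normal(size=(200, 16))
F = nystrom_features(X)
print(F.shape)  # (200, 10)
```

The cost drops from O(n³) for a full eigendecomposition of the kernel matrix to O(m³ + nm²) with m landmarks, which is the practical motivation for the low-rank approximation on large training sets such as USPS.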