Kernel-based principal components analysis on large telecommunication data
AusDM '09 Proceedings of the Eighth Australasian Data Mining Conference - Volume 101
Kernel PCA, like other kernel-based techniques, suffers from high memory requirements and computational cost, as well as from a tedious training procedure. This work shows that the objective function of Kernel PCA, i.e., the reconstruction error, can be upper bounded by the distortion of the K-means algorithm in the feature space. Based on this relation, we propose to simplify Kernel PCA's training procedure by means of the Kernel K-means algorithm. Applying a preimage reconstruction algorithm allows further simplification and leads to a more computationally economical solution.
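As a rough illustration of the procedure the abstract outlines, the sketch below is a minimal reading of the idea, not the authors' code: it assumes an RBF kernel, a hard-assignment kernel K-means run directly on the kernel matrix, and the standard fixed-point preimage iteration for Gaussian kernels, with the k feature-space centroids (or their input-space preimages) standing in for the leading kernel principal components. The names rbf_kernel, kernel_kmeans, rbf_preimage and all parameter values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Y.
    d2 = (X ** 2).sum(1)[:, None] + (Y ** 2).sum(1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def kernel_kmeans(K, k, n_iter=50, seed=0):
    # Hard-assignment kernel K-means driven only by the kernel matrix K (n x n);
    # returns the cluster label of each point.
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    labels = rng.integers(k, size=n)
    for _ in range(n_iter):
        dist = np.empty((n, k))
        for c in range(k):
            idx = np.flatnonzero(labels == c)
            if idx.size == 0:                      # re-seed an empty cluster
                idx = rng.integers(n, size=1)
            # ||phi(x_i) - m_c||^2 expressed with kernel evaluations only
            dist[:, c] = (np.diag(K)
                          - 2.0 * K[:, idx].mean(axis=1)
                          + K[np.ix_(idx, idx)].mean())
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels

def rbf_preimage(Xc, weights, gamma=1.0, n_iter=30):
    # Fixed-point preimage iteration for an RBF expansion sum_i w_i * phi(Xc[i]),
    # in the spirit of input-space reconstruction for Gaussian kernels.
    z = Xc[int(np.argmax(weights))].copy()
    for _ in range(n_iter):
        w = weights * np.exp(-gamma * ((Xc - z) ** 2).sum(1))
        if w.sum() <= 1e-12:
            break
        z = (w[:, None] * Xc).sum(0) / w.sum()
    return z

# Toy usage: cluster in feature space, then recover an input-space preimage
# of each centroid as a compact surrogate for the kernel principal components.
if __name__ == "__main__":
    X = np.random.default_rng(1).normal(size=(200, 2))
    K = rbf_kernel(X, X, gamma=0.5)
    labels = kernel_kmeans(K, k=5)
    centres = [rbf_preimage(X[labels == c],
                            np.full((labels == c).sum(), 1.0 / (labels == c).sum()),
                            gamma=0.5)
               for c in range(5) if (labels == c).any()]
    print(np.vstack(centres))
```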