Kernel Principal Component Analysis (KPCA) is a widely used technique for visualisation and feature extraction. Despite its success and flexibility, the lack of a probabilistic interpretation means that some problems, such as handling missing or corrupted data, are very hard to deal with. In this paper we exploit the probabilistic interpretation of linear PCA together with recent results on Gaussian process latent variable models in order to introduce an objective function for KPCA. This in turn allows a principled approach to the missing data problem. Furthermore, this new approach can be extended to reconstruct corrupted test data using fixed kernel feature extractors. The experimental results show strong improvements over widely used heuristics.
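For context on the baseline the abstract starts from, here is a minimal sketch of conventional (non-probabilistic) KPCA in numpy, assuming an RBF kernel. This is the standard eigendecomposition-of-a-centred-kernel-matrix formulation, not the probabilistic objective the paper proposes; the function names and the choice of kernel width are illustrative.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gram matrix of the RBF kernel: K[i, j] = exp(-gamma * ||x_i - y_j||^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    """Project X onto its leading kernel principal components."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    # Centre the kernel matrix in feature space
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    # Eigendecompose; eigh returns eigenvalues in ascending order, so flip
    vals, vecs = np.linalg.eigh(Kc)
    vals, vecs = vals[::-1], vecs[:, ::-1]
    # Scale eigenvectors so each feature-space principal axis has unit norm
    alphas = vecs[:, :n_components] / np.sqrt(np.maximum(vals[:n_components], 1e-12))
    return Kc @ alphas

X = np.random.RandomState(0).randn(20, 3)
Z = kernel_pca(X, n_components=2)
print(Z.shape)  # (20, 2)
```

Because this formulation has no likelihood attached, there is no principled way to fill in missing entries of `X` before forming the kernel matrix; that gap is exactly what a probabilistic objective for KPCA, as in the abstract, addresses.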