Principal component neural networks: theory and applications
Principal Component Analysis (PCA) is a well-known statistical tool. Kernel PCA is a nonlinear extension of PCA based on the kernel paradigm. In this paper we characterize the projections found by Kernel PCA from an information-theoretic perspective. We prove that Kernel PCA provides optimum-entropy projections in the input space when the Gaussian kernel is used for the mapping and a sample estimate of Rényi's entropy based on the Parzen window method is employed. This information-theoretic interpretation motivates and specifies the choice of kernel used for the transformation to feature space.
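As a sketch of the two ingredients the abstract connects, the following minimal NumPy code implements Kernel PCA with a Gaussian kernel alongside a Parzen-window estimate of Rényi's quadratic entropy. The function names, the shared bandwidth parameter `sigma`, and the numerical details are illustrative assumptions, not taken from the paper; the entropy estimator shown is the standard one, where the convolution of two Gaussian windows of width `sigma` yields a Gaussian of width `sqrt(2)*sigma`.

```python
import numpy as np

def gaussian_gram(X, sigma):
    # Pairwise Gaussian kernel matrix: K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma**2))

def kernel_pca(X, sigma, n_components):
    # Project the training points onto the leading kernel principal components.
    n = X.shape[0]
    K = gaussian_gram(X, sigma)
    # Center the Gram matrix in feature space
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Normalize expansion coefficients so feature-space eigenvectors have unit norm
    alphas = vecs / np.sqrt(np.maximum(vals, 1e-12))
    return Kc @ alphas  # shape (n, n_components)

def renyi_quadratic_entropy(X, sigma):
    # Parzen-window estimate of Renyi's quadratic entropy:
    #   H2 = -log( (1/n^2) sum_ij G(x_i - x_j; sqrt(2)*sigma) )
    # The double sum is exactly the sum of the entries of a Gaussian Gram matrix.
    n, d = X.shape
    K = gaussian_gram(X, np.sqrt(2.0) * sigma)
    norm = (2.0 * np.pi * 2.0 * sigma**2) ** (d / 2.0)
    info_potential = K.sum() / (n**2 * norm)
    return -np.log(info_potential)
```

Note that the same Gaussian Gram matrix drives both quantities, which is the structural link the abstract exploits: the entropy estimate depends on the data only through the kernel sums that Kernel PCA diagonalizes.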