With the emergence of data fusion techniques (kernel combinations, ensemble methods and boosting algorithms), the task of comparing distance/similarity/kernel matrices is becoming increasingly relevant. However, choosing an appropriate metric for the matrices that arise in pattern recognition problems is far from trivial. In this work we propose a general spectral framework for building metrics on matrix spaces. Within the framework of matrix pencils, we derive a new metric for symmetric positive semidefinite matrices, called the Pencil Distance (PD). The generality of our approach is demonstrated by showing that the Kernel Alignment (KA) measure is a particular case of this spectral framework. We illustrate the performance of the proposed measures on several classification problems.
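The abstract does not give the Pencil Distance itself, but the Kernel Alignment measure it subsumes has a standard form: the normalized Frobenius inner product of two kernel matrices, A(K1, K2) = ⟨K1, K2⟩_F / (‖K1‖_F ‖K2‖_F). The sketch below computes this standard KA (not the paper's PD); the function name and the example kernels are illustrative choices, not taken from the paper.

```python
import numpy as np

def kernel_alignment(K1, K2):
    """Empirical Kernel Alignment between two kernel matrices.

    A(K1, K2) = <K1, K2>_F / (||K1||_F * ||K2||_F).
    For positive semidefinite K1, K2 the value lies in [0, 1],
    and A(K, K) = 1 for any nonzero K.
    """
    num = np.sum(K1 * K2)  # Frobenius inner product tr(K1 @ K2) for symmetric inputs
    den = np.linalg.norm(K1) * np.linalg.norm(K2)
    return num / den

# Illustrative comparison: a linear and an RBF kernel on the same sample.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))

K_lin = X @ X.T                               # linear kernel
sq = np.sum(X**2, axis=1)
D2 = sq[:, None] + sq[None, :] - 2.0 * K_lin  # squared Euclidean distances
K_rbf = np.exp(-D2 / 2.0)                     # RBF kernel, bandwidth 1

print(kernel_alignment(K_lin, K_rbf))
```

A higher alignment indicates that the two kernels induce more similar similarity structures on the sample, which is the sense in which measures of this kind support comparing candidate kernels in data fusion settings.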