IEEE Transactions on Pattern Analysis and Machine Intelligence
Learning the Kernel Matrix with Semidefinite Programming. Journal of Machine Learning Research.
Learning Eigenfunctions Links Spectral Embedding and Kernel PCA. Neural Computation.
Accurate Error Bounds for the Eigenvalues of the Kernel Matrix. Journal of Machine Learning Research.
Joint diagonalization of kernels for information fusion. CIARP '07: Proceedings of the 12th Iberoamerican Congress on Pattern Recognition (Progress in Pattern Recognition, Image Analysis and Applications).
From indefinite to positive semi-definite matrices. SSPR/SPR 2006: Proceedings of the Joint IAPR International Conference on Structural, Syntactic, and Statistical Pattern Recognition.
Combining Functional Data Projections for Time Series Classification. CIARP '09: Proceedings of the 14th Iberoamerican Congress on Pattern Recognition (Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications).
When several sources of information are available in a pattern recognition problem, the task of combining them becomes especially interesting. In the context of kernel methods, this means designing a single kernel function that collects the information from each individual kernel that is relevant to the classification task at hand. The problem is then solved by training a Support Vector Machine (SVM) on the resulting kernel. Here we propose a consistent method to produce kernel functions from kernel matrices created by any given kernel combination technique. Once this fusion kernel function is available, the kernel can be evaluated at any data point. The performance of the proposed fusion kernel is illustrated on several classification and visualization tasks.
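As a rough illustration of the setting the abstract describes (not the paper's own method), the sketch below fuses two base kernel matrices by a simple convex combination, trains an SVM on the fused kernel via scikit-learn's precomputed-kernel interface, and evaluates the fused kernel at new data points. The choice of base kernels and the mixing weight `mu` are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))             # toy training data
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # toy labels

# Two base kernels standing in for different "sources of information"
K1 = rbf_kernel(X, X, gamma=0.5)
K2 = polynomial_kernel(X, X, degree=2)

# Simple fusion: a fixed convex combination (mu is an assumed weight,
# not a value from the paper)
mu = 0.7
K = mu * K1 + (1.0 - mu) * K2

# Train an SVM directly on the fused kernel matrix
clf = SVC(kernel="precomputed").fit(K, y)

# Evaluating the fused kernel at new points: compute each base kernel
# between the new points and the training set, then combine the same way
X_new = rng.normal(size=(10, 5))
K_new = mu * rbf_kernel(X_new, X) + (1.0 - mu) * polynomial_kernel(X_new, X, degree=2)
pred = clf.predict(K_new)
```

Note that extending a precomputed fusion to unseen points is straightforward here only because each base kernel has a known functional form; producing a valid kernel function from an arbitrary combined kernel matrix is exactly the harder problem the abstract addresses.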