Multiple Kernel Discriminant Analysis (MKDA) adopts an ensemble of multiple kernel matrices K_i and is expected to be more flexible and effective than the original Kernel Discriminant Analysis (KDA). However, with n training samples and p kernel matrices K_i, MKDA requires pn^2 space units to store all the K_i during optimization and relies on external solving techniques to handle the optimization problem, which incurs large space and computational complexity and limits its efficiency and applicability. To mitigate this problem, this manuscript adopts the Nyström method to approximate each K_i and thereby develops a novel Multiple Nyström-Approximating Kernel Discriminant Analysis (MNKDA). In practice, the proposed MNKDA first selects m (m ≪ n) samples to generate an approximating kernel matrix K̂_i for each K_i and forms the ensemble matrix G = Σ_{i=1}^p μ_i K̂_i. MNKDA then applies eigenvalue decomposition directly to the Nyström-based ensemble matrix G, reformulating the proposed discriminant analysis as an eigenvalue problem. The experimental results show that the proposed method achieves more effective and efficient performance than the classical MKDA. The advantages of the proposed MNKDA are (1) expressing the formulation as an eigenvalue problem instead of relying on commercial optimization software; (2) decreasing the space complexity from O(pn^2) to O(n^2) and reducing the computational complexity from O(n^3) to O(pmn^2); and (3) providing an alternative multiple kernel learning technique that inherits the advantages of multiple kernel learning.
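The core construction described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes RBF base kernels, random landmark selection, and fixed ensemble weights μ_i; the function names and parameter values are invented for the example. Each K̂_i is the standard Nyström approximation C W⁺ Cᵀ built from m landmark samples, the weighted sum forms G, and an eigen-decomposition of G is computed.

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    """RBF kernel matrix between the rows of A and B (illustrative helper)."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def nystrom_approx(X, landmarks, gamma):
    """Nyström approximation K̂ = C W⁺ Cᵀ of the full n×n kernel matrix,
    using only the n×m cross-kernel C and the m×m landmark kernel W."""
    C = rbf_kernel(X, landmarks, gamma)           # n × m
    W = rbf_kernel(landmarks, landmarks, gamma)   # m × m
    return C @ np.linalg.pinv(W) @ C.T            # n × n, rank ≤ m

rng = np.random.default_rng(0)
n, m = 100, 10
X = rng.normal(size=(n, 5))
idx = rng.choice(n, size=m, replace=False)        # m ≪ n landmark samples

# Ensemble matrix G = Σ_i μ_i K̂_i over p = 2 approximated kernels
# (weights and kernel parameters here are arbitrary assumptions).
mus, gammas = [0.6, 0.4], [0.5, 2.0]
G = sum(mu * nystrom_approx(X, X[idx], g) for mu, g in zip(mus, gammas))

# Eigenvalue decomposition of the symmetric ensemble matrix G,
# replacing the external-solver step of classical MKDA.
eigvals, eigvecs = np.linalg.eigh(G)
```

Since each K̂_i has rank at most m, only the leading eigenpairs of G carry information, which is what makes the O(pmn^2) construction cheaper than working with the p full n×n kernels.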