A novel multiple Nyström-approximating kernel discriminant analysis

  • Authors:
  • Zhe Wang;Wenbo Jie;Daqi Gao


  • Venue:
  • Neurocomputing
  • Year:
  • 2013

Quantified Score

Hi-index 0.01

Abstract

Multiple Kernel Discriminant Analysis (MKDA) adopts an ensemble of multiple kernel matrices K_i and is supposed to be more flexible and effective than the original Kernel Discriminant Analysis (KDA). However, with n training samples and p kernel matrices K_i, MKDA requires pn^2 space units to store all the K_i during optimization and simultaneously depends on external solving techniques to handle the optimization problem, which causes large space and computational complexity and limits its efficiency and applicability. To mitigate this problem, this manuscript adopts the Nyström method to approximate each K_i and thereby develops a novel Multiple Nyström-Approximating Kernel Discriminant Analysis (MNKDA). In practice, the proposed MNKDA first adopts m (m ≪ n) samples to generate an approximating kernel matrix K̂_i for each K_i and forms the ensemble matrix G = Σ_{i=1}^p μ_i K̂_i. Then, MNKDA directly applies eigenvalue decomposition to the Nyström-based ensemble matrix G and reformulates the proposed discriminant analysis as an eigenvalue problem. The experimental results show that the proposed method achieves more effective and efficient performance than the classical MKDA. The advantages of the proposed MNKDA are (1) expressing the formulation as an eigenvalue problem instead of relying on commercial optimization software; (2) decreasing the space complexity from O(pn^2) to O(n^2) and reducing the computational complexity from O(n^3) to O(pmn^2); and (3) providing an alternative multiple kernel learning technique that inherits the advantages of multiple kernel learning.
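The pipeline the abstract describes — approximate each kernel matrix K_i with the Nyström method from m ≪ n landmark samples, combine the approximations into a weighted ensemble G, then eigendecompose G — can be sketched as below. This is a minimal illustrative sketch, not the authors' exact algorithm: the RBF kernels, the bandwidth values `gammas`, uniform weights `mu`, and uniform landmark sampling are all assumptions for the example.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    # Gaussian (RBF) kernel between rows of X and rows of Y
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def nystrom_approx(X, m, gamma, rng):
    # Standard Nystrom approximation: sample m landmarks (m << n), then
    # K_hat = C W^+ C^T, with C = K(X, landmarks), W = K(landmarks, landmarks).
    idx = rng.choice(len(X), size=m, replace=False)
    L = X[idx]
    C = rbf_kernel(X, L, gamma)          # (n, m)
    W = rbf_kernel(L, L, gamma)          # (m, m)
    return C @ np.linalg.pinv(W) @ C.T   # (n, n) approximation of K

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))        # n = 200 toy samples

# Ensemble of p = 3 approximated kernels with (hypothetical) uniform weights mu_i
gammas = [0.1, 0.5, 1.0]
mu = np.full(len(gammas), 1.0 / len(gammas))
G = sum(m_i * nystrom_approx(X, 20, g, rng) for m_i, g in zip(mu, gammas))

# MNKDA then reformulates the discriminant analysis as an eigenvalue
# problem on G; here we only show the eigendecomposition step itself.
eigvals, eigvecs = np.linalg.eigh(G)
print(G.shape, eigvals[-1] > 0)
```

Each Nyström approximation costs O(mn^2) for the C · W^+ · C^T product, which is where the O(pmn^2) total in the abstract comes from: p such approximations replace one O(n^3) eigendecomposition of full kernel ensembles.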