Supervised subspace learning with multi-class Lagrangian SVM on the Grassmann manifold

  • Authors:
  • Duc-Son Pham; Svetha Venkatesh

  • Affiliations:
  • Institute for Multi-sensor Processing and Content Analysis, Curtin University, Perth, Western Australia, Australia (both authors)

  • Venue:
  • AI'11: Proceedings of the 24th International Conference on Advances in Artificial Intelligence
  • Year:
  • 2011

Abstract

Learning robust subspaces that maximize class discrimination is challenging, and most existing work treats dimensionality reduction and classifier design as only loosely connected steps. We propose an alternative framework in which the two steps are combined in a joint formulation that exploits their direct connection. Specifically, we learn an optimal subspace on the Grassmann manifold while jointly minimizing the classification error of an SVM classifier. We minimize the regularized empirical risk over both the hypothesis space of functions underlying a new generalized multi-class Lagrangian SVM and the Grassmann manifold, so that a linear projection is found together with the classifier. We propose an iterative algorithm that alternately optimizes the classifier and the projection. Extensive numerical studies on challenging datasets show that the proposed scheme is robust compared with alternatives when only limited training data is available, verifying the advantage of the joint formulation.
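
The abstract describes an alternating scheme but not the exact update rules. The sketch below is one plausible reading, not the authors' algorithm: it assumes a one-vs-rest hinge loss as a surrogate for the paper's multi-class Lagrangian SVM (scikit-learn's LinearSVC stands in for the classifier step) and Riemannian gradient descent with a QR retraction for the projection step on the Grassmann manifold. The function name grassmann_svm_alternating and all parameter choices are illustrative.

```python
import numpy as np
from sklearn.svm import LinearSVC


def grassmann_svm_alternating(X, y, d, n_iters=20, step=1e-2, C=1.0, seed=0):
    """Alternating sketch (assumed, not from the paper): (1) fit a linear
    multi-class SVM in the projected space, (2) take one Riemannian gradient
    step for the projection W on the Grassmann manifold Gr(D, d).
    Assumes more than two classes.
    """
    rng = np.random.default_rng(seed)
    n, D = X.shape
    # Random orthonormal basis W (D x d) represents a point on Gr(D, d).
    W, _ = np.linalg.qr(rng.standard_normal((D, d)))
    clf = LinearSVC(C=C, max_iter=5000)

    for _ in range(n_iters):
        Z = X @ W                           # project data onto the subspace
        clf.fit(Z, y)                       # classifier step (stand-in SVM)
        V = clf.coef_                       # (K, d) one-vs-rest weights
        scores = Z @ V.T + clf.intercept_   # (n, K) decision values

        # Euclidean gradient of the one-vs-rest hinge loss w.r.t. W.
        G = np.zeros_like(W)
        for k, cls in enumerate(clf.classes_):
            sign = np.where(y == cls, 1.0, -1.0)
            viol = sign * scores[:, k] < 1.0  # margin-violating samples
            G -= np.einsum("i,ij,k->jk", sign[viol], X[viol], V[k])
        G *= C / n

        # Projection step: tangent-space projection, then QR retraction
        # keeps W orthonormal (i.e., stays on the Grassmann manifold).
        G_tan = G - W @ (W.T @ G)
        W, _ = np.linalg.qr(W - step * G_tan)

    return W, clf
```

With this structure, each outer iteration re-fits the classifier in the current d-dimensional subspace and then moves the orthonormal basis W along the negative Riemannian gradient of the surrogate loss, mirroring the joint classifier-and-projection optimization the abstract attributes to the iterative algorithm.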