A Novel Scalable Algorithm for Supervised Subspace Learning

  • Authors:
  • Jun Yan (Microsoft Research Asia, China); Ning Liu (Microsoft Research Asia, China); Benyu Zhang (Microsoft Research Asia, China); Qiang Yang (Hong Kong University of Science and Technology, Hong Kong); Shuicheng Yan (University of Illinois at Urbana-Champaign, USA); Zheng Chen (Microsoft Research Asia, China)

  • Venue:
  • ICDM '06 Proceedings of the Sixth International Conference on Data Mining
  • Year:
  • 2006

Abstract

Subspace learning approaches aim to discover the important statistical distributions of high-dimensional data in lower-dimensional subspaces. Methods such as Principal Component Analysis (PCA) make no use of class information, while Linear Discriminant Analysis (LDA) cannot be performed efficiently in a scalable way. In this paper, we propose a novel, highly scalable supervised subspace learning algorithm called Supervised Kampong Measure (SKM). It transforms data points to be as close as possible to their corresponding class means while simultaneously keeping them as far as possible from the means of the other classes in the lower-dimensional subspace. Theoretical derivation shows that our algorithm is limited neither by the number of classes nor by the singularity problem faced by LDA. Furthermore, the algorithm can be executed incrementally, so that learning proceeds online as data streams arrive. Experimental results on several datasets, including the very large text dataset RCV1, show the outstanding performance of the proposed algorithm on classification problems compared with PCA, LDA, and a popular feature selection approach, Information Gain (IG).
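
The abstract does not state the SKM objective explicitly, but the idea it describes (pull each point toward its own class mean and push it away from the other class means in the projected subspace) can be sketched as below. This is a minimal NumPy illustration, not the paper's actual method: the function name, the difference-of-scatters criterion used to avoid matrix inversion (and hence LDA's singularity issue), and all parameters are assumptions for illustration only.

    import numpy as np

    def skm_like_projection(X, y, n_components):
        """Toy supervised subspace learner in the spirit of the abstract:
        pull points toward their own class mean and push them away from
        the other class means in the projected space. Illustrative only,
        not the paper's exact SKM formulation."""
        classes = np.unique(y)
        means = {c: X[y == c].mean(axis=0) for c in classes}
        d = X.shape[1]
        pull = np.zeros((d, d))   # accumulates (x - own class mean) outer products
        push = np.zeros((d, d))   # accumulates (x - other class means) outer products
        for xi, yi in zip(X, y):
            diff = xi - means[yi]
            pull += np.outer(diff, diff)
            for c in classes:
                if c != yi:
                    diff_o = xi - means[c]
                    push += np.outer(diff_o, diff_o)
        # A difference criterion avoids inverting 'pull', sidestepping the
        # small-sample singularity problem that plain LDA runs into.
        eigvals, eigvecs = np.linalg.eigh(push - pull)
        order = np.argsort(eigvals)[::-1]          # largest eigenvalues first
        return eigvecs[:, order[:n_components]]    # columns are projection directions

    # Usage on synthetic data (hypothetical example):
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(loc=m, size=(50, 10)) for m in (0.0, 2.0, 4.0)])
    y = np.repeat([0, 1, 2], 50)
    W = skm_like_projection(X, y, n_components=2)
    Z = X @ W   # data in the learned 2-D subspace

Because the sketch only needs class means and outer-product accumulators, both could in principle be updated one sample at a time, which is consistent with the incremental, online behavior the abstract claims for SKM, though the paper's actual update rule is not shown here.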