Effective multiplicative updates for non-negative discriminative learning in multimodal dimensionality reduction

  • Authors:
  • Zhao Zhang, Man Jiang, Ning Ye

  • Affiliations:
  • Department of Computer Science and Technology, Nanjing Forestry University, Nanjing 210037, China; Department of Language Studies, Nanjing Forestry University, Nanjing 210037, China; Department of Computer Science and Technology, Nanjing Forestry University and Shandong University, Jinan 250100, China

  • Venue:
  • Artificial Intelligence Review
  • Year:
  • 2010

Abstract

Fisher discriminant analysis yields unsatisfactory results when points in the same class exhibit within-class multimodality, and it cannot guarantee non-negativity of the projection vectors. In this paper, we focus on the recently formulated supervised locality-preserving dimensionality reduction problem based on within- and between-class scatters, and propose an effective dimensionality reduction algorithm, Multiplicative Updates based non-negative Discriminative Learning (MUNDL). MUNDL seeks two non-negative embedding transformations with high preservation and discrimination power for data from two different classes, so that nearby sample pairs in the original space remain close in the learned embedding space, while the projections of data from different classes are well separated from each other. We also show that MUNDL extends readily to nonlinear dimensionality reduction via the standard kernel trick. We verify the feasibility and effectiveness of MUNDL through extensive data visualization and classification experiments. Numerical results on benchmark UCI and real-world datasets show that MUNDL captures the intrinsic local and multimodal structure of the given data and outperforms several established dimensionality reduction methods, while being much more efficient.
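The abstract's central mechanism is that multiplicative update rules preserve non-negativity automatically: each iterate is multiplied elementwise by a ratio of non-negative terms, so no projection or clipping step is needed. The sketch below illustrates this mechanism on the classic Lee–Seung least-squares NMF objective, not on MUNDL's own scatter-based objective (whose exact update rules are given in the paper); the function name and parameters are illustrative.

```python
import numpy as np

def nmf_multiplicative(V, rank, n_iter=200, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates for V ~ W @ H with W, H >= 0.

    Illustrative only: MUNDL applies the same *kind* of multiplicative
    update to keep its discriminative projection vectors non-negative;
    here the simpler NMF reconstruction objective stands in for it.
    """
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    for _ in range(n_iter):
        # Each factor is scaled elementwise by a ratio of non-negative
        # matrices, so non-negativity is preserved at every iteration
        # without any explicit projection step.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    V = rng.random((20, 15))            # non-negative data matrix
    W, H = nmf_multiplicative(V, rank=4)
    print(f"reconstruction error: {np.linalg.norm(V - W @ H):.4f}")
```

Because the updates only rescale entries multiplicatively, any entry initialized positive stays non-negative for all iterations; this is the property MUNDL exploits to produce non-negative embedding transformations.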