Dimensionality Reduction by Minimal Distance Maximization

  • Authors:
  • Bo Xu; Kaizhu Huang; Cheng-Lin Liu


  • Venue:
  • ICPR '10: Proceedings of the 2010 20th International Conference on Pattern Recognition
  • Year:
  • 2010

Abstract

In this paper, we propose a novel discriminant analysis method, called Minimal Distance Maximization (MDM). In contrast to traditional LDA, which in effect maximizes the average divergence among classes, MDM seeks a low-dimensional subspace that maximizes the minimal (worst-case) divergence among classes. This "minimal" criterion avoids the problem of LDA's "average" criterion, which tends to merge similar classes with small divergence when applied to multi-class data. Furthermore, we formulate the worst-case problem as a convex problem, making the algorithm tractable for larger data sets. Experimental results demonstrate the advantages of the proposed method over five competitive approaches on one synthetic and six real-life data sets.
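
The sketch below is a minimal illustration of the contrast the abstract draws: scoring a projection by the worst-case (minimal) pairwise divergence between classes rather than by the average divergence, as in LDA. It is not the authors' formulation; the objective names, the use of squared Euclidean distance between projected class means, and the toy data are assumptions made for illustration only.

```python
# Illustrative sketch (assumptions noted above), not the MDM algorithm itself.
import numpy as np
from itertools import combinations

def pairwise_class_divergences(X, y, W):
    """Squared Euclidean distances between all pairs of projected class means."""
    Z = X @ W                                    # project data into the subspace W
    means = {c: Z[y == c].mean(axis=0) for c in np.unique(y)}
    return np.array([np.sum((means[a] - means[b]) ** 2)
                     for a, b in combinations(sorted(means), 2)])

def average_objective(X, y, W):
    """LDA-style criterion: average pairwise divergence (can merge similar classes)."""
    return pairwise_class_divergences(X, y, W).mean()

def minmax_objective(X, y, W):
    """Worst-case criterion: the minimal pairwise divergence, which MDM maximizes."""
    return pairwise_class_divergences(X, y, W).min()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Three classes in 3-D: two close together, one far away.
    X = np.vstack([rng.normal([0.0, 0.0, 0.0], 0.3, (50, 3)),
                   rng.normal([0.5, 0.0, 0.0], 0.3, (50, 3)),
                   rng.normal([5.0, 5.0, 0.0], 0.3, (50, 3))])
    y = np.repeat([0, 1, 2], 50)
    W = rng.normal(size=(3, 2))                  # an arbitrary 2-D projection
    print("average divergence:   ", average_objective(X, y, W))
    print("worst-case divergence:", minmax_objective(X, y, W))
```

On data like this, the average objective is dominated by the large distance to the outlying class, while the worst-case objective is governed by the two nearby classes; maximizing the latter over W is what keeps similar classes separated in the learned subspace.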