Generalized locality preserving Maxi-Min Margin Machine

  • Authors:
  • Zhancheng Zhang; Kup-Sze Choi; Xiaoqing Luo; Shitong Wang

  • Affiliations:
  • School of Digital Media, Jiangnan University, Wuxi 214122, PR China and Suzhou Institute of Nanotech and Nanobionics, Chinese Academy of Sciences, Suzhou 215123, PR China; Centre for Integrative Digital Health, School of Nursing, The Hong Kong Polytechnic University, Hong Kong; School of Internet of Things, Jiangnan University, Wuxi 214122, PR China; School of Digital Media, Jiangnan University, Wuxi 214122, PR China

  • Venue:
  • Neural Networks
  • Year:
  • 2012


Abstract

Research on large margin classifiers from the "local" and "global" points of view has become an active topic in machine learning and pattern recognition. Inspired by the Maxi-Min Margin Machine (M^4), a typical local-and-global learning machine, and by the idea of Locality Preserving Projections (LPP), we propose a novel large margin classifier, the Generalized Locality Preserving Maxi-Min Margin Machine (GLPM), in which within-class matrices are constructed from the labeled training points in a supervised way and then used to build the classifier. The within-class matrices of GLPM preserve the intra-class manifold structure of the training sets, while also playing the role of the covariance matrices that indicate the global projection direction in the M^4 model. Moreover, the connections among GLPM, M^4 and Local Fisher Discriminant Analysis (LFDA) are analyzed theoretically, and we show that GLPM can be regarded as a generalized M^4 machine. GLPM is also more robust, since it makes no assumption about the data distribution, whereas the M^4 machine assumes Gaussian-distributed data. Experiments on data sets from the machine learning repository demonstrate its advantage over M^4 in both local and global learning performance.
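To make the abstract's key construction concrete, the sketch below shows one common LPP-style way to build a supervised within-class matrix from labeled training points: a heat-kernel affinity is computed only between same-class pairs, and the resulting graph Laplacian yields a locality-preserving within-class scatter matrix. This is a minimal illustration under assumed conventions (the function name, the kernel width `sigma`, and the exact weighting scheme are the author's illustrative choices, not necessarily the paper's formulation).

```python
import numpy as np

def within_class_matrix(X, y, sigma=1.0):
    """Locality-preserving within-class matrix (LPP-style sketch).

    X : (n, d) array of training points, y : (n,) class labels.
    Returns a (d, d) symmetric positive semidefinite matrix built
    from the graph Laplacian of a same-class heat-kernel affinity.
    """
    n, _ = X.shape
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # Connect only points that share a class label (supervised step).
            if i != j and y[i] == y[j]:
                W[i, j] = np.exp(-np.sum((X[i] - X[j]) ** 2) / (2.0 * sigma ** 2))
    D = np.diag(W.sum(axis=1))   # degree matrix
    L = D - W                    # graph Laplacian of the within-class graph
    return X.T @ L @ X           # (d, d) within-class scatter-like matrix

# Small usage example with two well-separated classes.
X = np.array([[0.0, 0.0], [1.0, 0.0], [5.0, 5.0], [6.0, 5.0]])
y = np.array([0, 0, 1, 1])
S = within_class_matrix(X, y)
```

In GLPM-style models, a matrix of this kind replaces the per-class covariance matrices of M^4, so that the classifier respects the intra-class manifold rather than assuming Gaussian class shapes.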