Learning large margin classifiers locally and globally

  • Authors:
  • Kaizhu Huang, Haiqin Yang, Irwin King, Michael R. Lyu

  • Affiliations:
  • The Chinese University of Hong Kong, Shatin, N.T., Hong Kong (all authors)

  • Venue:
  • ICML '04: Proceedings of the Twenty-First International Conference on Machine Learning
  • Year:
  • 2004

Abstract

A new large margin classifier, named the Maxi-Min Margin Machine (M4), is proposed in this paper. This classifier is constructed from both a "local" and a "global" view of the data, whereas the most popular large margin classifier, the Support Vector Machine (SVM), and the recently proposed Minimax Probability Machine (MPM) consider data only locally or only globally. The new model is theoretically important in the sense that both SVM and MPM can be regarded as its special cases. Furthermore, the optimization of M4 can be cast as a sequential conic programming problem, which can be solved efficiently. We describe the M4 model definition, provide a clear geometrical interpretation, present theoretical justifications, propose efficient solving methods, and perform a series of evaluations on both synthetic data sets and real-world benchmark data sets. The comparison with SVM and MPM also demonstrates the advantages of our new model.
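The abstract notes that the M4 optimization can be cast as a sequential conic programming problem. The sketch below is an illustrative toy implementation of that general idea, not the paper's own algorithm: for a fixed candidate worst-case margin rho, requiring each training point to clear the separating hyperplane by rho standard deviations (measured against its own class covariance, the "global" information) is a second-order cone feasibility problem, and a bisection over rho recovers the largest feasible margin. The data, the cvxpy formulation, and all variable names here are assumptions made for illustration.

```python
# Hypothetical sketch of a maxi-min margin problem solved by bisection over
# the worst-case margin rho, with an SOCP feasibility check at each step.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
X = rng.normal(loc=[2, 2], scale=[1.0, 0.3], size=(60, 2))    # class +1 samples
Y = rng.normal(loc=[-2, -2], scale=[0.3, 1.0], size=(60, 2))  # class -1 samples

# Cholesky factors of the class covariances: ||L.T @ w|| = sqrt(w' Sigma w).
Lx = np.linalg.cholesky(np.cov(X, rowvar=False) + 1e-6 * np.eye(2))
Ly = np.linalg.cholesky(np.cov(Y, rowvar=False) + 1e-6 * np.eye(2))

def feasible(rho):
    """Return (w, b) if a hyperplane with worst-case margin >= rho exists."""
    w, b = cp.Variable(2), cp.Variable()
    cons = [X @ w + b >= rho * cp.norm(Lx.T @ w),     # every x_i at margin >= rho
            -(Y @ w + b) >= rho * cp.norm(Ly.T @ w),  # every y_j at margin >= rho
            cp.norm(w) <= 1]                          # rule out the trivial w = 0
    prob = cp.Problem(cp.Minimize(0), cons)
    prob.solve()
    return (w.value, b.value) if prob.status == cp.OPTIMAL else None

# The "sequential" part: bisect for the largest rho whose SOCP is feasible.
lo, hi, best = 0.0, 10.0, None
for _ in range(30):
    mid = 0.5 * (lo + hi)
    sol = feasible(mid)
    if sol is not None:
        lo, best = mid, sol
    else:
        hi = mid

print("worst-case margin ~", lo, "w =", best[0], "b =", best[1])
```

On this toy data the bisection converges in a few dozen feasibility checks; each check is a small second-order cone program, which is what makes a sequential conic programming approach of this kind efficient in practice.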