Select Objective Functions for Multiple Criteria Programming Classification

  • Authors:
  • Peng Zhang;Yingjie Tian;Zhiwang Zhang;Aihua Li;Xingquan Zhu

  • Venue:
  • WI-IAT '08 Proceedings of the 2008 IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology - Volume 03
  • Year:
  • 2008

Abstract

Existing supervised learning models are generally built on a single objective function, such as minimizing the squared loss (neural networks) or minimizing the information entropy (decision trees). Due to the inherent complexity of real-life data, learning models based on only one objective function are often inadequate. Consequently, many well-known classification models adopt multiple-objective optimization to guide the learning process. For example, Fisher's linear discriminant analysis (LDA) is built by maximizing the "between-class variance" (the first objective function) and minimizing the "within-class variance" (the second objective function); SVM is built by maximizing the "marginal distance" (the first objective function) and minimizing the "error distance" (the second objective function). In this paper, we combine Fisher's LDA measure (maximizing the "between-class variance") with SVM's measure (minimizing the "error distance") to formulate a new multiple-objective classification model, namely the Minimal Error and Maximal Between-class Variance (MEMBV) model. Experimental results demonstrate the performance of the proposed model on synthetic and real-life datasets.
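The abstract does not give the paper's exact formulation, but the combination it describes can be sketched as a scalarized objective: minimize an SVM-style error-distance (hinge) term while maximizing the between-class variance of the projected data. The following minimal sketch is an illustration of that idea, not the authors' method; the trade-off weight `lam`, the normalization step, and the crude numerical gradient are all assumptions made for the example.

```python
import numpy as np

def membv_objective(w, b, X, y, lam=1.0):
    """Illustrative combined objective (assumed form, not the paper's):
    hinge-style error distance minus lam * between-class variance of the
    one-dimensional projections X @ w + b."""
    proj = X @ w + b
    # SVM-style error distance: total hinge loss over margin violations
    errors = np.maximum(0.0, 1.0 - y * proj).sum()
    # LDA-style between-class variance of the projected class means
    m_pos = proj[y == 1].mean()
    m_neg = proj[y == -1].mean()
    between_var = (m_pos - m_neg) ** 2
    return errors - lam * between_var

def fit_membv(X, y, lam=1.0, lr=0.01, steps=500):
    """Crude numerical gradient descent on the combined objective,
    with w renormalized each step so the variance term stays bounded."""
    rng = np.random.default_rng(0)
    w = rng.normal(size=X.shape[1]) * 0.1
    b = 0.0
    eps = 1e-5
    for _ in range(steps):
        base = membv_objective(w, b, X, y, lam)
        grad_w = np.zeros_like(w)
        for i in range(len(w)):
            w2 = w.copy()
            w2[i] += eps
            grad_w[i] = (membv_objective(w2, b, X, y, lam) - base) / eps
        grad_b = (membv_objective(w, b + eps, X, y, lam) - base) / eps
        w -= lr * grad_w
        b -= lr * grad_b
        w /= max(np.linalg.norm(w), 1e-12)
    return w, b

# Toy two-class data: two well-separated Gaussian clusters
X = np.vstack([np.random.default_rng(1).normal(0, 1, (20, 2)) + [2, 2],
               np.random.default_rng(2).normal(0, 1, (20, 2)) - [2, 2]])
y = np.array([1] * 20 + [-1] * 20)
w, b = fit_membv(X, y)
acc = (np.sign(X @ w + b) == y).mean()
```

On separable data like this, the hinge term drives the boundary between the clusters while the variance term pushes the projected class means apart; the paper's actual optimization is presumably derived analytically rather than by numerical gradients.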