The nature of statistical learning theory
Solving a Class of Linearly Constrained Indefinite Quadratic Problems by D.C. Algorithms
Journal of Global Optimization
Multiple-Criteria Linear Programming for VIP E-Mail Behavior Analysis
ICDMW '07 Proceedings of the Seventh IEEE International Conference on Data Mining Workshops
A Multi-criteria Convex Quadratic Programming model for credit data analysis
Decision Support Systems
Existing supervised learning models are generally built on a single objective function, e.g., minimizing the square loss (neural networks) or minimizing the information entropy (decision trees). Due to the inherent complexity of real-life data, learning models based on only one objective function are often inadequate. Consequently, many well-known classification models adopt multiple-objective optimization to guide the learning process. For example, Fisher’s linear discriminant analysis (LDA) maximizes the “between-class variance” (the first objective function) while minimizing the “within-class variance” (the second objective function); SVM maximizes the “marginal distance” (the first objective function) while minimizing the “error distance” (the second objective function). In this paper, we combine Fisher’s LDA measure of maximizing the “between-class variance” with SVM’s measure of minimizing the “error distance” to formulate a new multiple-objective classification model, namely the Minimal Error and Maximal Between-class Variance (MEMBV) model. Experimental results demonstrate the performance of the proposed model on synthetic and real-life datasets.
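To make the LDA side of the combined objective concrete, the following is a minimal illustrative sketch (not the paper's MEMBV formulation) of Fisher's criterion for two classes: the projection direction that maximizes between-class variance relative to within-class variance is obtained from the within-class scatter matrix and the class means as w ∝ S_W⁻¹(μ₁ − μ₀). All function and variable names here are hypothetical.

```python
import numpy as np

def fisher_lda_direction(X0, X1):
    """Fisher's LDA direction for two classes (illustrative sketch).

    Maximizes between-class variance relative to within-class variance;
    the closed-form solution is w = S_W^{-1} (mu1 - mu0).
    """
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    # Within-class scatter: sum of the two per-class scatter matrices.
    S_w = (X0 - mu0).T @ (X0 - mu0) + (X1 - mu1).T @ (X1 - mu1)
    # Solve S_w w = (mu1 - mu0) instead of inverting S_w explicitly.
    w = np.linalg.solve(S_w, mu1 - mu0)
    return w / np.linalg.norm(w)

# Toy two-class data: projections onto w should be well separated.
rng = np.random.default_rng(0)
X0 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(100, 2))
X1 = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(100, 2))
w = fisher_lda_direction(X0, X1)
print((X0 @ w).mean() < (X1 @ w).mean())
```

The MEMBV model described in the abstract would replace LDA's within-class term with SVM's error-distance term, but the between-class variance objective it borrows is the one computed here.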