In this paper, we propose a novel large margin classifier, called the maxi-min margin machine (M4). This model learns the decision boundary both locally and globally. In comparison, other large margin classifiers construct separating hyperplanes either only locally or only globally. For example, a state-of-the-art large margin classifier, the support vector machine (SVM), considers data only locally, while another significant model, the minimax probability machine (MPM), builds the decision hyperplane exclusively from global information. As a major contribution, we show that SVM yields the same solution as M4 when the data satisfy certain conditions, and that MPM can be regarded as a relaxation of M4. Moreover, based on our proposed local and global view of data, another popular model, linear discriminant analysis, can easily be interpreted and extended as well. We describe the M4 model definition, provide a geometrical interpretation, present theoretical justifications, and propose a practical sequential conic programming method to solve the optimization problem. We also show how to exploit Mercer kernels to extend M4 to nonlinear classification. Furthermore, we perform a series of evaluations on both synthetic and real-world benchmark data sets. Comparison with SVM and MPM demonstrates the advantages of our new model.
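The optimization underlying M4 can be sketched as follows. This is a hedged reconstruction, not the paper's exact statement: it assumes the two classes are represented by points x_i and y_j with class covariance matrices Σ_x and Σ_y, and that each class's margin is normalized by its own covariance, in line with the local-and-global view described above.

```latex
% Sketch of the M4 problem (assumed formulation): maximize the worst-case
% margin \rho, with each class's margin scaled by its own covariance.
\begin{aligned}
\max_{\rho,\ \mathbf{w}\neq\mathbf{0},\ b}\quad & \rho \\
\text{s.t.}\quad & \mathbf{w}^{\top}\mathbf{x}_i + b \;\ge\; \rho\,\sqrt{\mathbf{w}^{\top}\Sigma_x\,\mathbf{w}}, \qquad i = 1,\dots,N_x, \\
& -\bigl(\mathbf{w}^{\top}\mathbf{y}_j + b\bigr) \;\ge\; \rho\,\sqrt{\mathbf{w}^{\top}\Sigma_y\,\mathbf{w}}, \qquad j = 1,\dots,N_y.
\end{aligned}
```

Under this formulation, for any fixed ρ > 0 each constraint is a second-order cone constraint, so the problem can be attacked by bisection on ρ with a conic feasibility check at each step; this is consistent with the sequential conic programming method the abstract mentions, though the paper's actual algorithm may differ in detail.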