Support vector machine (SVM), one of the most popular classifiers, aims to find a hyperplane that separates two classes of data with maximal margin. SVM focuses on achieving separation between classes rather than exploiting the structure of the training data within each class. However, such structural information, as an implicit form of prior knowledge, has recently been found to be vital for designing a good classifier in many real-world problems. Accordingly, exploiting as much prior structural information in the data as possible to improve the generalization ability of a classifier has yielded a class of effective structural large margin classifiers, such as the structured large margin machine (SLMM) and the Laplacian support vector machine (LapSVM). In this paper, we unify these classifiers into a common framework in terms of “structural granularity” and the formulation of their optimization problems. Within this framework, which spans both quadratic programming (QP) and second-order cone programming (SOCP) methods, we derive a novel large margin classifier that we call the structural regularized support vector machine (SRSVM). Unlike SLMM, which lies at the cross of cluster granularity and SOCP, and LapSVM, which lies at the cross of point granularity and QP, SRSVM is located at the cross of cluster granularity and QP. It therefore follows the same optimization formulation as LapSVM, avoiding the high computational complexity and non-sparse solution of SLMM, while simultaneously integrating within-class compactness with between-class separability. Furthermore, generalization bounds for these algorithms can be derived through eigenvalue analysis of the kernel matrices. Experimental results demonstrate that SRSVM is often superior in classification and generalization performance to the state-of-the-art algorithms in the framework, at both the same and different structural granularities.
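The core idea described above — augmenting a large-margin objective with a term that encodes within-class structure — can be sketched in a few lines of NumPy. This is only an illustrative simplification, not the paper's SRSVM formulation: each class is treated as a single "cluster," its scatter matrix stands in for the structural information, and the regularized hinge-loss objective 0.5·wᵀ(I + λΣ)w + C·Σᵢ max(0, 1 − yᵢ(wᵀxᵢ + b)) is minimized by plain subgradient descent rather than by solving the QP. The data, step size, and iteration count are all hypothetical.

```python
import numpy as np

# Hypothetical toy two-class data; each class is treated as one "cluster"
# (a simplification of SRSVM's cluster granularity).
rng = np.random.default_rng(0)
X_pos = rng.normal([2.0, 2.0], 0.5, size=(50, 2))
X_neg = rng.normal([-2.0, -2.0], 0.5, size=(50, 2))
X = np.vstack([X_pos, X_neg])
y = np.hstack([np.ones(50), -np.ones(50)])

# Within-class scatter (covariance) matrices carry the structural information.
Sigma = np.cov(X_pos, rowvar=False) + np.cov(X_neg, rowvar=False)

# Subgradient descent on: 0.5 * w^T (I + lam * Sigma) w + C * hinge loss.
lam, C, lr = 1.0, 1.0, 0.01
R = np.eye(2) + lam * Sigma          # structure-regularized quadratic term
w, b = np.zeros(2), 0.0
for _ in range(500):
    margins = y * (X @ w + b)
    viol = margins < 1               # points inside or violating the margin
    grad_w = R @ w - C * (y[viol, None] * X[viol]).sum(axis=0)
    grad_b = -C * y[viol].sum()
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = np.mean(np.sign(X @ w + b) == y)
```

With λ = 0 the objective reduces to a standard soft-margin linear SVM; increasing λ penalizes the weight vector along high-variance within-class directions, which is the intuition behind combining within-class compactness with between-class separability.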