Theoretical aspects of mapping to multidimensional optimal regions as a multi-classifier
Intelligent Data Analysis
Each pattern category is represented by a distinct positive real number p_i. After the input patterns are mapped into a feature space by a non-linear mapping, a linear relation between the mapped patterns and the numbers p_i is assumed, with bias and coefficients left undetermined; the hyperplane on which this linear relation outputs zero is taken as the base hyperplane. To determine the unknown parameters, an objective function is formulated that minimizes the difference between the outputs of patterns of the same class and the corresponding p_i, while maximizing the distance between the hyperplanes corresponding to any two different classes. Since this objective function has the same form as that of support vector regression, the coefficients and bias of the linear relation can be computed by known methods such as the SVMlight approach. Three methods are also given for determining the p_i; the best one determines them during training and achieves relatively high accuracy. Experiments on the IRIS data set show that the accuracy of this method is better than that of many SVM-based multi-class classifiers and close to that of DAGSVM (decision-directed acyclic graph SVM), while its recognition speed is the highest.
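The scheme described above can be sketched with off-the-shelf tools: a kernel regressor is fit from the patterns to fixed class targets p_i, and a test pattern is assigned to the class whose target is nearest to the regression output. This is a minimal illustration, not the paper's implementation; the target values p = [1, 2, 3], the RBF kernel, and the SVR hyperparameters are assumptions (the paper's preferred method learns the p_i during training).

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

X, y = load_iris(return_X_y=True)
p = np.array([1.0, 2.0, 3.0])  # one real target per class (illustrative values)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Fit a kernel regressor mapping each training pattern to its class target p_i,
# playing the role of the assumed linear relation in feature space.
reg = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X_tr, p[y_tr])

# Classify each test pattern by the nearest class target.
out = reg.predict(X_te)
pred = np.argmin(np.abs(out[:, None] - p[None, :]), axis=1)
acc = float((pred == y_te).mean())
print(f"accuracy: {acc:.2f}")
```

A single regression pass replaces the multiple binary machines of one-vs-one or one-vs-rest schemes, which is the source of the recognition-speed advantage the abstract reports.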