Winner-take-all multiclass classifiers are built on top of a set of prototypes, each representing one of the available classes. A pattern is then assigned the label of the most 'similar' prototype. Recent proposals of multiclass SVM extensions can be considered instances of the same strategy, with one prototype per class.

The multi-prototype SVM proposed in this paper extends multiclass SVM to multiple prototypes per class. It combines several vectors in a principled way to obtain large-margin decision functions. For this problem, we give a compact constrained quadratic formulation and propose a greedy optimization algorithm able to find locally optimal solutions of the non-convex objective function.

This algorithm proceeds by reducing the overall problem to a series of simpler convex problems, for each of which an efficient optimization algorithm is proposed. A number of pattern selection strategies are then discussed to speed up the optimization process. In addition, given the combinatorial nature of the overall problem, stochastic search strategies are suggested to escape local minima that are not globally optimal.

Finally, we report experiments on a number of datasets. The performance obtained using a few simple linear prototypes is comparable to that of state-of-the-art kernel-based methods, but with a significant reduction (of one to two orders of magnitude) in response time.
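As a minimal sketch (not the paper's actual formulation or code), the winner-take-all decision rule described above can be written as follows; here similarity is assumed to be a linear dot product, and the prototype matrix and label list are illustrative placeholders:

```python
import numpy as np

def wta_predict(x, prototypes, labels):
    """Winner-take-all: return the label of the most 'similar' prototype.

    prototypes: (m, d) array, one row per prototype (possibly several per class).
    labels: length-m list mapping each prototype to its class label.
    Similarity here is the dot product, as in a linear multi-prototype model.
    """
    scores = prototypes @ x          # similarity of x to every prototype
    return labels[int(np.argmax(scores))]

# Toy example: two classes, two linear prototypes per class
prototypes = np.array([
    [1.0, 0.0],   # class 0, prototype A
    [0.8, 0.2],   # class 0, prototype B
    [0.0, 1.0],   # class 1, prototype A
    [0.2, 0.9],   # class 1, prototype B
])
labels = [0, 0, 1, 1]

print(wta_predict(np.array([0.9, 0.1]), prototypes, labels))  # -> 0
print(wta_predict(np.array([0.1, 1.0]), prototypes, labels))  # -> 1
```

With one prototype per class this reduces to the standard multiclass SVM decision rule; the paper's contribution is learning several such prototypes per class with a large-margin objective.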