Support Vector Machines for pattern recognition address binary classification problems. Multi-class classification is typically solved by combining 2-class decision functions through voting schemes or decision trees. We present a new multi-class classification SVM for the separable case, called K-SVCR. Learning machines operating in a kernel-induced feature space are constructed by assigning output +1 or -1 when a training pattern belongs to one of the two classes to be separated, and output 0 when it carries any other label. This formulation of the multi-class classification problem always assigns a meaningful answer to every input, and its architecture is more fault-tolerant than that of standard methods.
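The combination stage described above can be sketched as follows. This is a hypothetical illustration of the voting scheme only, not of K-SVCR training: each pairwise machine `machines[(i, j)]` is a stand-in that is assumed to return +1 for class `i`, -1 for class `j`, and 0 otherwise, so that the 0 output contributes a vote to every class outside the pair (all names and the toy one-dimensional data are illustrative assumptions, not from the paper):

```python
from itertools import combinations
from collections import Counter

def ksvcr_vote(x, classes, machines):
    """Combine ternary pairwise outputs into a class prediction.

    machines[(i, j)](x) is assumed to return +1 (class i), -1 (class j)
    or 0 (neither) -- stand-ins for trained K-SVCR machines.
    """
    votes = Counter({c: 0 for c in classes})
    for (i, j) in combinations(classes, 2):
        out = machines[(i, j)](x)
        if out == +1:
            votes[i] += 1
        elif out == -1:
            votes[j] += 1
        else:  # 0: the pattern is judged to belong to some other class
            for c in classes:
                if c not in (i, j):
                    votes[c] += 1
    return votes.most_common(1)[0][0]

# Toy stand-in machines for 3 classes separated along one axis:
# class "a" near 0, "b" near 5, "c" near 10 (hypothetical data).
centers = {"a": 0.0, "b": 5.0, "c": 10.0}

def make_machine(i, j):
    def f(x):
        dists = {c: abs(x - m) for c, m in centers.items()}
        nearest = min(dists, key=dists.get)
        if nearest == i:
            return +1
        if nearest == j:
            return -1
        return 0
    return f

machines = {(i, j): make_machine(i, j)
            for (i, j) in combinations(sorted(centers), 2)}

print(ksvcr_vote(4.8, sorted(centers), machines))  # prints "b"
```

Note how the 0 output still carries information: because every machine is trained on all classes, a pattern far from both classes of a pair reinforces the remaining classes instead of casting a spurious vote, which is the source of the fault tolerance claimed above.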