A training algorithm for optimal margin classifiers
COLT '92 Proceedings of the fifth annual workshop on Computational learning theory
Machine Learning
Making large-scale support vector machine learning practical
Advances in kernel methods
Fast training of support vector machines using sequential minimal optimization
Advances in kernel methods
Pairwise classification and support vector machines
Advances in kernel methods
Support Vector Machines: Training and Applications
Pattern Classification (2nd Edition)
Estimation of Dependences Based on Empirical Data (Springer Series in Statistics)
Estimating the Support of a High-Dimensional Distribution
Neural Computation
Sphere-structured support vector machines for multi-class pattern recognition
RSFDGrC'03 Proceedings of the 9th international conference on Rough sets, fuzzy sets, data mining, and granular computing
A Fuzzy support vector classifier based on Bayesian optimization
Fuzzy Optimization and Decision Making
Multimedia Tools and Applications
Fault classifier of rotating machinery based on weighted support vector data description
Expert Systems with Applications: An International Journal
Fuzzy multi-class classifier based on support vector data description and improved PCM
Expert Systems with Applications: An International Journal
High-dimensional indexing with oriented cluster representation for multimedia databases
ICME'09 Proceedings of the 2009 IEEE international conference on Multimedia and Expo
Support vector classifier based on fuzzy c-means and Mahalanobis distance
Journal of Intelligent Information Systems
LSMS/ICSEE'10 Proceedings of the 2010 international conference on Life system modeling and intelligent computing, and 2010 international conference on Intelligent computing for sustainable energy and environment: Part I
The minimum bounding sphere of a set of data, defined as the smallest sphere enclosing all of the data points, was first used by Scholkopf et al. to estimate the VC-dimension of support vector classifiers and later applied by Tax and Duin to data domain description. The minimum bounding sphere of each class can be computed by solving a quadratic programming problem. Because the spheres are constructed for each class separately, they extend naturally to multi-class classification, as proposed by Zhu et al. In this paper, we show that the decision rule proposed by Zhu et al. is generally insufficient to achieve state-of-the-art classification performance, and we therefore propose a new decision rule based on Bayes decision theory. The new rule significantly improves the performance of the resulting sphere-based classifier. In addition to its low computational cost and easy extension to multi-class problems, the new classifier performs comparably to standard support vector machines on most of the real-world data sets tested.
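The per-class quadratic program mentioned above can be sketched with the standard support vector data description (SVDD) dual for a linear kernel. This is an illustrative sketch only, not the authors' implementation: the function name `min_enclosing_sphere`, the use of a general-purpose SLSQP solver, and the box-constraint parameter `C` are assumptions made for this example.

```python
import numpy as np
from scipy.optimize import minimize

def min_enclosing_sphere(X, C=1.0):
    """Sketch of the SVDD dual QP (linear kernel) for one class:
       maximize   sum_i a_i <x_i, x_i> - sum_ij a_i a_j <x_i, x_j>
       subject to sum_i a_i = 1,  0 <= a_i <= C.
    The sphere center is sum_i a_i x_i; the radius is the distance
    from the center to any boundary support vector (0 < a_i < C).
    """
    n = X.shape[0]
    K = X @ X.T                  # Gram matrix of inner products
    diag = np.diag(K)

    def neg_dual(a):             # negate: scipy minimizes
        return -(a @ diag - a @ K @ a)

    cons = {"type": "eq", "fun": lambda a: a.sum() - 1.0}
    bounds = [(0.0, C)] * n
    a0 = np.full(n, 1.0 / n)     # feasible starting point
    res = minimize(neg_dual, a0, bounds=bounds,
                   constraints=cons, method="SLSQP")
    a = res.x
    center = a @ X
    # pick a support vector strictly inside the box constraints
    sv = np.where((a > 1e-6) & (a < C - 1e-6))[0]
    idx = sv[0] if len(sv) else int(np.argmax(a))
    radius = float(np.linalg.norm(X[idx] - center))
    return center, radius
```

Running this once per class yields one (center, radius) pair per class; a simple sphere-based rule in the style of Zhu et al. then compares a test point's distances to the class spheres, which is the baseline the paper's Bayes-based rule is proposed to improve upon.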