This paper proposes a new linear classifier based on the minimum circumscribed circle (CMCC). The method first computes the minimum circumscribed circle of the samples of each class, so that each class distribution is summarized by its circle. Since the linear separating hyperplane must intersect the line connecting the circle centers, the hyperplane perpendicular to that connecting line is taken as the classifier for each pair of classes. Improved variants are also proposed for cases where the separating hyperplane should not be perpendicular to the connecting line, or where the samples contain outliers, and a combined classifier built from subclasses is discussed as well. In the experiments, CMCC and its improved variants are compared with other classifiers such as the support vector machine and linear discriminant analysis. The results show that CMCC achieves relatively good performance in both classification accuracy and time cost.
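The basic CMCC idea described above can be sketched in a few lines of NumPy. This is only an illustrative sketch, not the paper's exact algorithm: the minimum circumscribed circle is approximated here with the iterative Badoiu-Clarkson scheme (repeatedly stepping the center toward the farthest sample), and placing the separating hyperplane at the radius-weighted point on the line connecting the two centers is an assumption on my part, since the abstract does not specify where on that line the hyperplane sits.

```python
import numpy as np

def min_enclosing_circle(points, iters=1000):
    """Approximate minimum enclosing (circumscribed) circle via the
    Badoiu-Clarkson iteration: step the center toward the farthest point
    with a shrinking step size. Returns (center, radius)."""
    c = points.mean(axis=0)
    for i in range(1, iters + 1):
        far = points[np.argmax(np.linalg.norm(points - c, axis=1))]
        c = c + (far - c) / (i + 1)
    r = np.linalg.norm(points - c, axis=1).max()
    return c, r

def cmcc_fit(X0, X1):
    """Fit a CMCC-style two-class linear classifier (sketch).
    The boundary is the hyperplane perpendicular to the line joining the
    two circle centers; anchoring it at the radius-weighted point on that
    line is a hypothetical choice, not necessarily the paper's rule."""
    c0, r0 = min_enclosing_circle(X0)
    c1, r1 = min_enclosing_circle(X1)
    w = c1 - c0                                   # normal to the hyperplane
    t = r0 / (r0 + r1) if (r0 + r1) > 0 else 0.5  # split point between circles
    m = c0 + t * w                                # anchor point on the line
    b = -w @ m
    return w, b

def cmcc_predict(w, b, X):
    """Label 1 for samples on the X1 side of the hyperplane, else 0."""
    return (X @ w + b > 0).astype(int)
```

For two well-separated clusters, fitting reduces to two enclosing-circle computations and one dot product per prediction, which is consistent with the low time cost the abstract reports.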