A novel classification method based on hypersurface
Mathematical and Computer Modelling: An International Journal
Hyper Surface Classification (HSC), based on the Jordan Curve Theorem in topology, is an accurate and efficient classification algorithm. The hyper surface obtained by training exhibits excellent generalization performance on datasets that are not only large but also high-dimensional. The classification knowledge hidden in the classifier, however, is hard for humans to interpret, so obtaining explicit classification rules is an important problem. In this paper, we first extract rules directly from the samples. To avoid rule redundancy, two optimization policies, selecting a Minimal Consistent Subset (MCS) of the training set and merging neighboring cubes, are applied to reduce the rule set. Experimental results show that the two policies accurately capture the knowledge implied by the hyper surface and preserve the good generalization performance of HSC. Moreover, the time needed to classify unlabeled samples with the rule set is correspondingly shortened.
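The "merging neighboring cubes" reduction mentioned above can be sketched as follows. This is a hypothetical illustration, not the paper's exact algorithm: rules are assumed to be labelled axis-aligned cubes, and two same-label cubes are greedily merged whenever they share a face, i.e. they differ in exactly one dimension and are adjacent along it. The function names and the greedy strategy are our own assumptions.

```python
# Sketch of rule reduction by merging neighboring cubes (hypothetical).
# A rule is a labelled axis-aligned cube: (label, intervals), where
# intervals is a tuple of (low, high) pairs, one per dimension.

def try_merge(a, b):
    """Return the merged rule if cubes a and b carry the same label,
    differ in exactly one dimension, and are adjacent along it;
    otherwise return None."""
    (label_a, box_a), (label_b, box_b) = a, b
    if label_a != label_b:
        return None
    diff = [d for d in range(len(box_a)) if box_a[d] != box_b[d]]
    if len(diff) != 1:
        return None
    d = diff[0]
    (a_lo, a_hi), (b_lo, b_hi) = box_a[d], box_b[d]
    if a_hi == b_lo or b_hi == a_lo:        # cubes share a face along d
        merged = list(box_a)
        merged[d] = (min(a_lo, b_lo), max(a_hi, b_hi))
        return (label_a, tuple(merged))
    return None

def reduce_rules(rules):
    """Greedily merge adjacent same-label cubes until no merge applies."""
    rules = list(rules)
    changed = True
    while changed:
        changed = False
        for i in range(len(rules)):
            for j in range(i + 1, len(rules)):
                m = try_merge(rules[i], rules[j])
                if m is not None:
                    rules[j] = m            # replace one cube by the union
                    del rules[i]            # and drop the other
                    changed = True
                    break
            if changed:
                break
    return rules
```

For example, two unit squares with the same label that abut along the first axis collapse into a single rule, while a differently labelled neighbor is left untouched; the final rule set is smaller, so classifying an unlabeled sample requires fewer rule lookups.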