C4.5: programs for machine learning
A hierarchical method for multi-class support vector machines. ICML '04: Proceedings of the Twenty-First International Conference on Machine Learning.
Computational Biology and Chemistry.
An Information Theoretic Perspective on Multiple Classifier Systems. MCS '09: Proceedings of the 8th International Workshop on Multiple Classifier Systems.
Constraints in Weighted Averaging. MCS '09: Proceedings of the 8th International Workshop on Multiple Classifier Systems.
ECML PKDD '09: Proceedings of the European Conference on Machine Learning and Knowledge Discovery in Databases: Part I.
Learning Nondeterministic Classifiers. The Journal of Machine Learning Research.
Orientation distance-based discriminative feature extraction for multi-class classification. CIKM '10: Proceedings of the 19th ACM International Conference on Information and Knowledge Management.
Tree Decomposition for Large-Scale SVM Problems. The Journal of Machine Learning Research.
CHIRP: a new classifier based on composite hypercubes on iterated random projections. Proceedings of the 17th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.
Efficient classification of images with taxonomies. ACCV '09: Proceedings of the 9th Asian Conference on Computer Vision, Volume Part III.
Learning data structure from classes: A case study applied to population genetics. Information Sciences: an International Journal.
On Taxonomies for Multi-class Image Categorization. International Journal of Computer Vision.
Enhancing directed binary trees for multi-class classification. Information Sciences: an International Journal.
Large margin principle in hyperrectangle learning. Neurocomputing.
We propose a method for classifying more than two classes from high-dimensional features. Our approach builds a binary decision tree in a top-down manner, using the optimal margin classifier at each split. We implement an exact greedy algorithm for this task and compare its performance to less greedy procedures based on clustering of the matrix of pairwise margins. We compare the performance of the "margin tree" to the closely related "all-pairs" (one-versus-one) support vector machine and to nearest centroids on a number of cancer microarray data sets. We also develop a simple method for feature selection. We find that the margin tree is competitive in accuracy with the other methods and offers additional interpretability through its putative grouping of the classes.
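The tree-building idea described above can be sketched in code. The following is a minimal illustration, assuming scikit-learn, of a margin tree built top-down: at each node the classes are partitioned into two groups using pairwise SVM margins (here via a simple two-seed heuristic, a stand-in for the greedy and clustering procedures of the paper), a linear SVM is trained to route samples between the groups, and the process recurses. All names (`MarginTreeNode`, `pairwise_margin`, the seeding rule) are illustrative, not taken from the paper.

```python
import numpy as np
from sklearn.svm import LinearSVC

def pairwise_margin(X, y, a, b):
    """Geometric margin 2/||w|| of a linear SVM separating classes a and b."""
    m = np.isin(y, [a, b])
    clf = LinearSVC(C=100.0, max_iter=50000).fit(X[m], (y[m] == b).astype(int))
    return 2.0 / np.linalg.norm(clf.coef_)

class MarginTreeNode:
    """One node of a margin tree: an SVM that routes samples left or right."""
    def __init__(self):
        self.clf = self.left = self.right = self.label = None

    def fit(self, X, y):
        classes = np.unique(y)
        if classes.size == 1:          # pure leaf: store the class label
            self.label = classes[0]
            return self
        # Seed the split with the best-separated (largest-margin) class pair,
        # then attach each remaining class to its nearer (smaller-margin) seed.
        pairs = [(a, b) for i, a in enumerate(classes) for b in classes[i + 1:]]
        M = {p: pairwise_margin(X, y, *p) for p in pairs}
        sa, sb = max(M, key=M.get)
        left, right = {sa}, {sb}
        for c in classes:
            if c in (sa, sb):
                continue
            da = M[tuple(sorted((c, sa)))]
            db = M[tuple(sorted((c, sb)))]
            (left if da < db else right).add(c)
        # Train the routing SVM on the full node data, then recurse.
        self.clf = LinearSVC(C=100.0, max_iter=50000).fit(
            X, np.isin(y, list(right)).astype(int))
        lm = np.isin(y, list(left))
        self.left = MarginTreeNode().fit(X[lm], y[lm])
        self.right = MarginTreeNode().fit(X[~lm], y[~lm])
        return self

    def predict(self, X):
        out = []
        for x in X:
            node = self
            while node.label is None:
                node = node.right if node.clf.predict(x[None])[0] else node.left
            out.append(node.label)
        return np.array(out)

# Toy demonstration on a small multi-class data set.
from sklearn.datasets import load_iris
X, y = load_iris(return_X_y=True)
tree = MarginTreeNode().fit(X, y)
acc = (tree.predict(X) == y).mean()
```

The grouping chosen at each node is exactly the "putative grouping of the classes" the abstract points to as the source of interpretability: one can read off which classes the tree considers most similar.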