The One-vs-One strategy divides the original multi-class problem into as many binary classification problems as there are pairs of classes. An independent base classifier is then learned for each binary problem, and their outputs are combined to predict a single class label. This usually improves on the accuracy of the baseline classifier trained without decomposition, besides enabling the use of inherently binary classifiers, e.g., Support Vector Machines, on multi-class problems. This paper analyzes the fact that existing aggregations favor easily recognizable classes; hence, the accuracy enhancement comes mainly from higher correct classification rates over those classes. Under other evaluation criteria, the significant improvements of One-vs-One diminish, revealing a weakness in the presence of difficult classes, defined as those obtaining a lower correct classification rate than the other classes in the problem. After studying the problem of difficult classes in this framework, and aiming to empower these classes, a novel similarity-based aggregation is presented that generalizes the well-known weighted voting. The experimental analysis shows that the new methodology increases the recognition of difficult classes, yielding a more balanced performance over all classes, which is a desirable behavior. The methodology is tested within several Machine Learning paradigms and compared with the state-of-the-art aggregations for the One-vs-One strategy. The results are contrasted with the proper statistical tests, as suggested in the literature.
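The One-vs-One scheme with weighted voting described above can be sketched as follows. This is a minimal illustrative implementation, not the paper's similarity-based aggregation: the nearest-centroid base classifier and the distance-ratio confidence score are assumptions chosen only to keep the example self-contained.

```python
# One-vs-One decomposition with weighted voting (illustrative sketch).
# Base classifier: nearest centroid per class pair (an assumption, not
# the paper's method). Each pairwise classifier emits a confidence in
# [0, 1] for its first class; confidences are summed per class.
from itertools import combinations

def fit_ovo(X, y):
    """Train one pairwise (centroid, centroid) model per pair of classes."""
    classes = sorted(set(y))
    models = {}
    for a, b in combinations(classes, 2):
        pts_a = [x for x, lbl in zip(X, y) if lbl == a]
        pts_b = [x for x, lbl in zip(X, y) if lbl == b]
        centroid = lambda pts: [sum(c) / len(pts) for c in zip(*pts)]
        models[(a, b)] = (centroid(pts_a), centroid(pts_b))
    return classes, models

def predict_ovo(x, classes, models):
    """Weighted voting: sum each pairwise confidence into a per-class score."""
    score = {c: 0.0 for c in classes}
    for (a, b), (cen_a, cen_b) in models.items():
        d_a = sum((xi - ci) ** 2 for xi, ci in zip(x, cen_a)) ** 0.5
        d_b = sum((xi - ci) ** 2 for xi, ci in zip(x, cen_b)) ** 0.5
        # Closer centroid -> higher confidence for that class.
        conf_a = d_b / (d_a + d_b) if (d_a + d_b) > 0 else 0.5
        score[a] += conf_a
        score[b] += 1.0 - conf_a
    return max(score, key=score.get)
```

The paper's point can be seen in the `score` dictionary: a class that wins its pairwise comparisons by large confidence margins (an "easy" class) dominates the aggregate, while a difficult class accumulates weak votes, which is what the proposed similarity-based aggregation is designed to counterbalance.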