In the dissimilarity representation paradigm, several prototype selection methods have been used to address the problem of selecting a small representation set that generates a low-dimensional dissimilarity space. These methods have also been used to reduce the size of the dissimilarity matrix. However, they assume a relatively balanced class distribution, an assumption that is grossly violated in many real-life problems, where the ratios of prior probabilities between classes are often extremely skewed. In this paper, we study well-known prototype selection methods adapted to learning from an imbalanced dissimilarity matrix. More specifically, we propose using these methods to under-sample the majority class in the dissimilarity space. The experimental results demonstrate that the one-sided selection strategy performs better than the classical prototype selection methods applied over all classes.
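The pipeline described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: Euclidean distance stands in for the dissimilarity measure, and random selection is only a placeholder for the prototype selection methods (such as one-sided selection) actually studied.

```python
import numpy as np

def dissimilarity_space(X, prototypes):
    # Represent each object by its dissimilarities to the prototype set;
    # Euclidean distance is assumed here as the dissimilarity measure.
    return np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)

def undersample_majority(X, y, majority_label, n_keep, rng):
    # Under-sample the majority class down to n_keep objects.
    # Random selection is a placeholder: the paper applies classical
    # prototype selection methods (e.g. one-sided selection) in this role.
    maj = np.flatnonzero(y == majority_label)
    mino = np.flatnonzero(y != majority_label)
    kept = rng.choice(maj, size=n_keep, replace=False)
    idx = np.sort(np.concatenate([kept, mino]))
    return X[idx], y[idx]

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = np.array([0] * 90 + [1] * 10)   # 9:1 class imbalance
Xb, yb = undersample_majority(X, y, majority_label=0, n_keep=10, rng=rng)
R = Xb[yb == 1]                     # minority objects as the representation set
D = dissimilarity_space(Xb, R)      # balanced, low-dimensional dissimilarity space
```

A classifier trained on the rows of `D` then operates in a dissimilarity space whose class distribution has been rebalanced by the under-sampling step.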