In real-world applications, it has been observed that class imbalance (significant differences in class prior probabilities) can produce an important deterioration of classifier performance, particularly on patterns belonging to the less represented classes. One way to tackle this problem is to resample the original training set, either by over-sampling the minority class, under-sampling the majority class, or both. In this paper, we propose two ensemble models (based on a modular neural network and the nearest neighbor rule) trained on datasets under-sampled with genetic algorithms. Experiments on real datasets demonstrate the effectiveness of the proposed methodology.
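To illustrate the kind of genetic-algorithm under-sampling the abstract refers to, the sketch below evolves bit masks over the majority class, where each bit decides whether a majority sample is kept in the reduced training set. Everything here is an illustrative assumption, not the paper's actual method: the toy dataset, the GA parameters, the one-point crossover and bit-flip mutation operators, and the fitness function (balanced accuracy of a 1-NN classifier trained on the reduced set) are all chosen only to make the idea concrete and runnable.

```python
import random

random.seed(0)

# Toy imbalanced 2-D dataset (illustrative assumption):
# 40 majority samples (label 0) and 8 minority samples (label 1).
majority = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(40)]
minority = [(random.gauss(3, 1), random.gauss(3, 1)) for _ in range(8)]

def nn_predict(train, x):
    """1-NN rule: return the label of the closest training point."""
    nearest = min(train,
                  key=lambda p: (p[0][0] - x[0]) ** 2 + (p[0][1] - x[1]) ** 2)
    return nearest[1]

def fitness(mask):
    """Balanced accuracy of 1-NN trained on minority + selected majority.

    Evaluating on the full original set is a simplification for this
    sketch; a real setup would use a held-out validation set.
    """
    if not any(mask):
        return 0.0
    train = ([(m, 0) for m, keep in zip(majority, mask) if keep]
             + [(m, 1) for m in minority])
    acc0 = sum(nn_predict(train, x) == 0 for x in majority) / len(majority)
    acc1 = sum(nn_predict(train, x) == 1 for x in minority) / len(minority)
    return (acc0 + acc1) / 2

def ga_undersample(pop_size=10, generations=15, p_mut=0.05):
    """Evolve bit masks over the majority class; each bit keeps/drops a sample."""
    pop = [[random.random() < 0.3 for _ in majority] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]          # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(majority))   # one-point crossover
            child = a[:cut] + b[cut:]
            # bit-flip mutation with probability p_mut per bit
            child = [bit ^ (random.random() < p_mut) for bit in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = ga_undersample()
print(sum(best), "of", len(majority), "majority samples kept")
```

In the ensemble setting described in the abstract, several such GA runs could each produce a different under-sampled training set, with one base classifier (a perceptron module or a nearest-neighbor classifier) trained per set and their outputs combined.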