1. Learning When Negative Examples Abound. ECML '97: Proceedings of the 9th European Conference on Machine Learning.
2. Improving Identification of Difficult Small Classes by Balancing Class Distribution. AIME '01: Proceedings of the 8th Conference on AI in Medicine in Europe: Artificial Intelligence in Medicine.
3. Mining with rarity: a unifying framework. ACM SIGKDD Explorations Newsletter, special issue on learning from imbalanced datasets.
4. SMOTE: Synthetic Minority Over-sampling Technique. Journal of Artificial Intelligence Research.
Most machine learning algorithms assume that their training sets are well balanced, but real-world data are usually imbalanced. The class imbalance problem is pervasive, troubling a large segment of the data mining community, and conventional machine learning algorithms perform poorly when learning from imbalanced data sets, so solutions for learning from such data are needed. This paper presents a novel Isomap-based hybrid re-sampling approach that improves the conventional SMOTE algorithm by incorporating the isometric feature mapping algorithm (Isomap). Experimental results demonstrate that this hybrid re-sampling algorithm outperforms conventional re-sampling alone, and that Isomap is an effective means of reducing the dimensionality of the data before re-sampling. This offers a new possible solution for imbalanced data set (IDS) classification.
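The core SMOTE step of such a hybrid scheme, interpolating between a minority-class sample and one of its k nearest neighbours, can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation; in the hybrid approach described above, this interpolation would be applied to the low-dimensional coordinates produced by an Isomap embedding (e.g. scikit-learn's `sklearn.manifold.Isomap`) rather than to the raw features.

```python
import numpy as np

def smote(minority, n_synthetic, k=5, rng=None):
    """Generate synthetic minority samples by linear interpolation
    between each chosen sample and one of its k nearest minority-class
    neighbours (the core idea of SMOTE).

    minority : (n, d) array of minority-class samples; in the hybrid
               scheme these would be Isomap-embedded coordinates.
    """
    rng = np.random.default_rng(rng)
    # Pairwise Euclidean distances within the minority class.
    d = np.linalg.norm(minority[:, None] - minority[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)               # a point is not its own neighbour
    nn = np.argsort(d, axis=1)[:, :k]         # indices of the k nearest neighbours

    # Pick a random base sample and a random neighbour for each synthetic point.
    base = rng.integers(0, len(minority), n_synthetic)
    neigh = nn[base, rng.integers(0, k, n_synthetic)]

    # Interpolate: base + gap * (neighbour - base), gap in [0, 1).
    gap = rng.random((n_synthetic, 1))
    return minority[base] + gap * (minority[neigh] - minority[base])
```

Because each synthetic point is a convex combination of two existing minority samples, the over-sampled class stays inside the region the minority class already occupies instead of being duplicated exactly, which is what distinguishes SMOTE from simple random over-sampling.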