In this paper, a hybrid learning model, the imbalanced evolving self-organizing map (IESOM), is proposed to address imbalanced learning problems. In our approach, the classic SOM learning rule is modified so that, in the competitive learning phase, the winning neuron is selected via an energy function that minimally increases the local error. The advantage of IESOM is that it improves classification performance by extracting useful knowledge from the limited and underrepresented minority class. A positive SOM and a negative SOM are trained on the minority and majority classes, respectively. Starting from the original minority class, the positive SOM evolves into a new state in which novel knowledge may be discovered. The purpose of this convergent evolution is to repeatedly search for fitter individuals by minimizing the mean quantization error in the feature space, which drives offspring individuals toward the center of the positive SOM and thereby forms a more explicit class boundary. An iterative learning procedure adaptively updates the incremental feature maps and creates additional minority instances to facilitate learning from imbalanced data. The effectiveness of the proposed algorithm is compared with that of several existing methods under various assessment metrics.
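The abstract only sketches the algorithm, so the following Python snippet is a minimal, assumed reading of two of the ideas it names: winner selection driven by a local-error energy term, and a positive SOM whose prototypes attract synthetic minority samples. The class EnergySOM, the function oversample_toward_prototypes, and all constants are illustrative names, not the authors' implementation.

```python
import numpy as np

class EnergySOM:
    """Toy 1-D SOM whose winner is chosen by an energy term (distance plus
    accumulated local quantization error) rather than by distance alone.
    This is a sketch of the idea described in the abstract, not IESOM itself."""

    def __init__(self, n_units, dim, lr=0.5, sigma=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(size=(n_units, dim))   # prototype vectors
        self.err = np.zeros(n_units)               # accumulated local error per unit
        self.lr, self.sigma = lr, sigma

    def winner(self, x):
        d = np.linalg.norm(self.w - x, axis=1)
        # Energy = distance + accumulated local error: prefer the unit whose
        # selection keeps the growth of the overall quantization error small.
        return np.argmin(d + self.err)

    def fit(self, X, epochs=20):
        for t in range(epochs):
            lr = self.lr * (1 - t / epochs)                        # decaying learning rate
            for x in X:
                b = self.winner(x)
                grid_dist = np.abs(np.arange(len(self.w)) - b)     # 1-D grid distance to winner
                h = np.exp(-(grid_dist ** 2) / (2 * self.sigma ** 2))
                self.w += lr * h[:, None] * (x - self.w)           # neighborhood update
                self.err[b] += np.linalg.norm(x - self.w[b])       # track local error
        return self

    def mean_quantization_error(self, X):
        return np.mean([np.min(np.linalg.norm(self.w - x, axis=1)) for x in X])


def oversample_toward_prototypes(X_min, som_pos, n_new, step=0.5, seed=0):
    """Create synthetic minority samples by pulling existing minority points
    toward their nearest positive-SOM prototype: a stand-in for the abstract's
    'move offspring toward the center of the positive SOM' idea."""
    rng = np.random.default_rng(seed)
    idx = rng.integers(0, len(X_min), size=n_new)
    new = []
    for i in idx:
        x = X_min[i]
        proto = som_pos.w[som_pos.winner(x)]
        new.append(x + step * rng.random() * (proto - x))
    return np.array(new)


# Usage: train separate SOMs on the minority and majority classes, then
# rebalance the minority class before fitting any downstream classifier.
rng = np.random.default_rng(1)
X_maj = rng.normal(0.0, 1.0, size=(500, 2))   # majority class (toy data)
X_min = rng.normal(2.5, 0.5, size=(25, 2))    # minority class (toy data)
som_pos = EnergySOM(n_units=5, dim=2).fit(X_min)    # positive SOM (minority)
som_neg = EnergySOM(n_units=20, dim=2).fit(X_maj)   # negative SOM (majority)
X_min_new = oversample_toward_prototypes(X_min, som_pos, n_new=475)
print("MQE (positive SOM):", som_pos.mean_quantization_error(X_min))
```

The evolutionary search over fitness values and the incremental growth of the feature maps described in the abstract are not reproduced here; the sketch only illustrates how an energy-based winner rule and prototype-guided oversampling could fit together.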