Extreme learning machine (ELM) for single-hidden-layer feedforward neural networks (SLFNs) is a powerful machine learning technique that has attracted attention for its fast learning speed and good generalization performance. Recently, weighted ELM was proposed to handle data with imbalanced class distributions; its essence is that each training sample is assigned an extra weight. Although several empirical weighting schemes have been provided, how to determine better sample weights remains an open problem. In this paper, we propose Boosting weighted ELM, which embeds weighted ELM seamlessly into a modified AdaBoost framework to solve this problem. Intuitively, the distribution weights in the AdaBoost framework, which reflect the importance of the training samples, are fed into weighted ELM as training sample weights. Furthermore, AdaBoost is modified in two respects to make it more effective for imbalanced learning: (i) the initial distribution weights are set asymmetrically so that AdaBoost converges faster; (ii) the distribution weights are updated separately for each class so that the asymmetry of the distribution weights is preserved. Experimental results on 16 binary datasets and 5 multiclass datasets from the KEEL repository show that the proposed method achieves more balanced results than weighted ELM.
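The scheme described in the abstract can be sketched in code. The following is a minimal, illustrative implementation, not the authors' actual algorithm: all function names and hyperparameters (number of hidden nodes, regularization constant `C`, number of boosting rounds) are assumptions. It uses a regularized weighted ELM (random tanh hidden layer, weighted least-squares output weights) as the base learner inside an AdaBoost-style loop with (i) class-balanced asymmetric initial distribution weights and (ii) per-class renormalization of the updated weights.

```python
import numpy as np

def train_weighted_elm(X, y, sample_weights, n_hidden=50, C=1.0, rng=None):
    """Weighted ELM sketch: a random hidden layer plus a weighted,
    regularized least-squares output layer,
    beta = (H^T W H + I/C)^{-1} H^T W T, with W = diag(sample_weights)."""
    rng = np.random.default_rng() if rng is None else rng
    n_classes = int(y.max()) + 1
    A = rng.standard_normal((X.shape[1], n_hidden))   # random input weights
    b = rng.standard_normal(n_hidden)                 # random hidden biases
    H = np.tanh(X @ A + b)                            # hidden-layer outputs
    T = np.where(y[:, None] == np.arange(n_classes), 1.0, -1.0)  # +/-1 targets
    Hw = H * sample_weights[:, None]                  # H^T W without forming W
    beta = np.linalg.solve(Hw.T @ H + np.eye(n_hidden) / C, Hw.T @ T)
    return A, b, beta

def elm_predict(model, X):
    A, b, beta = model
    return np.argmax(np.tanh(X @ A + b) @ beta, axis=1)

def boosting_weighted_elm(X, y, n_rounds=10, seed=0):
    rng = np.random.default_rng(seed)
    n, n_classes = len(y), int(y.max()) + 1
    # (i) Asymmetric initial distribution: each class gets the same total
    # weight, so minority samples start heavier than majority samples.
    counts = np.bincount(y, minlength=n_classes)
    D = 1.0 / (n_classes * counts[y])
    models, alphas = [], []
    for _ in range(n_rounds):
        model = train_weighted_elm(X, y, D * n, rng=rng)
        pred = elm_predict(model, X)
        err = D[pred != y].sum()
        if err >= 0.5:                                # weak learner too weak
            break
        alpha = 0.5 * np.log((1.0 - err + 1e-10) / (err + 1e-10))
        models.append(model)
        alphas.append(alpha)
        if err < 1e-10:                               # perfect fit, stop early
            break
        # (ii) Update, then renormalize each class separately so the
        # asymmetry of the distribution weights is preserved.
        D *= np.exp(np.where(pred == y, -alpha, alpha))
        for c in range(n_classes):
            mask = y == c
            D[mask] *= (1.0 / n_classes) / D[mask].sum()
    return models, np.array(alphas)

def ensemble_predict(models, alphas, X, n_classes):
    """Weighted vote of the per-round ELM predictions."""
    votes = np.zeros((len(X), n_classes))
    for model, a in zip(models, alphas):
        votes[np.arange(len(X)), elm_predict(model, X)] += a
    return np.argmax(votes, axis=1)
```

On a toy imbalanced problem (e.g. 200 majority vs. 20 minority samples drawn from two separated Gaussian blobs), `boosting_weighted_elm(X, y)` followed by `ensemble_predict` recovers both classes, whereas an unweighted learner tends to favor the majority class. The per-class renormalization in step (ii) is what keeps the minority class's share of the total weight from being eroded across rounds.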