GSVM: An SVM for handling imbalanced accuracy between classes in bi-classification problems
Applied Soft Computing
The paper surveys previous solutions to the imbalanced-dataset learning problem in support vector machines and proposes a new one based on cost-sensitive learning. The general cost-sensitive approach assigns each class a penalty inversely proportional to its size, yielding a penalty-regularized model. This paper further adds a margin compensation term to obtain a more accurate solution. Since the margin plays a central role in placing the decision boundary, the study deliberately produces an imbalanced margin between the classes, which shifts the decision boundary; the imbalanced margin thus compensates the overwhelmed minority class. Combined with the penalty regularization, the margin compensation calibrates the decision boundary moderately and can be used to refine a biased boundary. This reduces the need for a high penalty on the minority class and lowers the risk of overfitting. Experimental results show promising potential for future applications.
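The inverse-proportional penalization scheme the abstract describes can be illustrated with a standard class-weighted SVM. The sketch below uses scikit-learn's `SVC` with `class_weight="balanced"` (which sets per-class penalties inversely proportional to class frequency); it demonstrates only the general cost-sensitive idea, not the paper's margin-compensated model, and the dataset parameters are arbitrary choices for illustration.

```python
# Cost-sensitive SVM sketch: penalize errors on the minority class more
# heavily via class-dependent C weights (inverse-proportional scheme).
# NOTE: this is an illustrative baseline, not the paper's GSVM model.
from sklearn.datasets import make_classification
from sklearn.metrics import recall_score
from sklearn.svm import SVC

# Imbalanced two-class problem: roughly 10% minority (label 1).
X, y = make_classification(n_samples=600, weights=[0.9, 0.1],
                           class_sep=0.8, random_state=0)

# Plain SVM: a single penalty C shared by both classes.
plain = SVC(kernel="rbf").fit(X, y)

# Cost-sensitive SVM: class_weight="balanced" scales C for class k by
# n_samples / (n_classes * n_k), i.e. inversely proportional to class size.
weighted = SVC(kernel="rbf", class_weight="balanced").fit(X, y)

print("minority recall, plain:   ", recall_score(y, plain.predict(X)))
print("minority recall, weighted:", recall_score(y, weighted.predict(X)))
```

Raising the minority-class penalty shifts the decision boundary toward the majority class, so the weighted model labels at least as many points as minority, trading some majority-class accuracy for minority recall; the paper's margin compensation aims to achieve a similar boundary shift without resorting to such a high minority penalty.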