Real-world classification problems usually involve imbalanced data sets. In such cases, a classifier with high classification accuracy does not necessarily imply good classification performance for all classes. The Area Under the ROC Curve (AUC) has been recognized as a more appropriate performance indicator in such cases. Quite a few methods have been developed to design classifiers that maximize AUC. In the context of Neural Networks (NNs), however, it is usually an approximation of AUC, rather than the exact AUC itself, that is maximized, because AUC is non-differentiable and cannot be directly maximized by gradient-based methods. In this paper, we propose to use evolutionary algorithms to train NNs with the maximum AUC. The proposed method employs AUC as the objective function. An evolutionary algorithm, namely the Self-adaptive Differential Evolution with Neighborhood Search (SaNSDE) algorithm, is used to optimize the weights of NNs with respect to AUC. Empirical studies on 19 binary and multi-class imbalanced data sets show that the proposed evolutionary AUC maximization (EAM) method can train NNs with larger AUC than existing methods.
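To illustrate the core idea, here is a minimal sketch of the binary case in Python. It assumes a single-hidden-layer network and uses SciPy's classic differential evolution as a stand-in for SaNSDE, which is not available in standard libraries; the architecture, weight bounds, and hyperparameters are illustrative choices, not the paper's settings.

```python
# Sketch of evolutionary AUC maximization: an evolutionary algorithm only
# needs objective values, so the non-differentiable AUC can be optimized
# directly, without a smooth surrogate.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.metrics import roc_auc_score

def nn_scores(weights, X, n_hidden):
    # Unpack a flat weight vector into one hidden layer plus an output unit.
    d = X.shape[1]
    W1 = weights[: d * n_hidden].reshape(d, n_hidden)
    b1 = weights[d * n_hidden : d * n_hidden + n_hidden]
    w2 = weights[d * n_hidden + n_hidden : d * n_hidden + 2 * n_hidden]
    b2 = weights[-1]
    h = np.tanh(X @ W1 + b1)
    return h @ w2 + b2  # raw scores suffice: AUC is rank-based

def neg_auc(weights, X, y, n_hidden):
    # Negated because differential_evolution minimizes.
    return -roc_auc_score(y, nn_scores(weights, X, n_hidden))

def train_eam(X, y, n_hidden=5, seed=0):
    n_weights = X.shape[1] * n_hidden + 2 * n_hidden + 1
    bounds = [(-5.0, 5.0)] * n_weights  # hypothetical search range
    result = differential_evolution(
        neg_auc, bounds, args=(X, y, n_hidden),
        maxiter=200, seed=seed, polish=False,
    )
    return result.x, -result.fun  # trained weights and training-set AUC
```

Note that `polish=False` disables SciPy's gradient-based local refinement step, which would be ill-suited here for the same reason that motivates the evolutionary approach: the AUC objective is non-smooth in the weights.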