Evolving neural networks with maximum AUC for imbalanced data classification

  • Authors:
  • Xiaofen Lu, Ke Tang, Xin Yao

  • Affiliations:
  • Nature Inspired Computation and Applications Laboratory (NICAL), School of Computer Science and Technology, University of Science and Technology of China, Hefei, China (all authors)

  • Venue:
  • HAIS'10: Proceedings of the 5th International Conference on Hybrid Artificial Intelligence Systems, Volume Part I
  • Year:
  • 2010

Abstract

Real-world classification problems usually involve imbalanced data sets. In such cases, a classifier with high classification accuracy does not necessarily imply good classification performance for all classes. The Area Under the ROC Curve (AUC) has been recognized as a more appropriate performance indicator in such cases. Quite a few methods have been developed to design classifiers with the maximum AUC. In the context of Neural Networks (NNs), however, it is usually an approximation of AUC rather than the exact AUC itself that is maximized, because AUC is non-differentiable and cannot be directly maximized by gradient-based methods. In this paper, we propose to use evolutionary algorithms to train NNs with the maximum AUC. The proposed method employs AUC as the objective function. An evolutionary algorithm, namely the Self-adaptive Differential Evolution with Neighborhood Search (SaNSDE) algorithm, is used to optimize the weights of NNs with respect to AUC. Empirical studies on 19 binary and multi-class imbalanced data sets show that the proposed evolutionary AUC maximization (EAM) method can train NNs with larger AUC than existing methods.
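
The sketch below is a rough illustration of the idea described in the abstract: evolve the weights of a small neural network so that AUC is maximized directly, rather than a differentiable surrogate. It is not the authors' implementation; it uses SciPy's standard differential evolution as a stand-in for SaNSDE, and the synthetic dataset, one-hidden-layer architecture, weight bounds, and iteration budget are illustrative assumptions.

```python
# Minimal sketch of evolutionary AUC maximization (EAM) on an imbalanced data set.
# Assumptions: standard differential evolution stands in for SaNSDE; the network,
# data, bounds, and budget are illustrative, not values from the paper.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score

# Synthetic imbalanced binary problem (roughly 90% / 10% class split).
X, y = make_classification(n_samples=300, n_features=5,
                           weights=[0.9, 0.1], random_state=0)

n_in, n_hidden = X.shape[1], 5
n_weights = n_in * n_hidden + n_hidden + n_hidden + 1  # W1, b1, w2, b2

def forward(w, X):
    """Score samples with a one-hidden-layer network given a flat weight vector w."""
    i = 0
    W1 = w[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = w[i:i + n_hidden]; i += n_hidden
    w2 = w[i:i + n_hidden]; i += n_hidden
    b2 = w[i]
    h = np.tanh(X @ W1 + b1)
    return h @ w2 + b2  # raw scores suffice, since AUC only depends on the ranking

def neg_auc(w):
    """Objective for the evolutionary algorithm: negative AUC (to be minimized)."""
    return -roc_auc_score(y, forward(w, X))

result = differential_evolution(neg_auc, bounds=[(-5.0, 5.0)] * n_weights,
                                maxiter=100, seed=0, polish=False)
print(f"Training AUC of the evolved network: {-result.fun:.3f}")
```

Because AUC depends only on how positive and negative examples are ranked relative to each other, no gradient of the objective is needed, which is exactly why a population-based optimizer can target AUC directly where backpropagation cannot.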