Multi-objective evolutionary optimization for generating ensembles of classifiers in the ROC space
Proceedings of the 14th annual conference on Genetic and evolutionary computation
Learning algorithms can suffer a performance bias when data sets are unbalanced. This paper proposes a Multi-Objective Genetic Programming (MOGP) approach that uses the accuracies on the minority and majority classes as separate learning objectives. We analyse the classification ability of the evolved Pareto-front solutions using the Area Under the ROC Curve (AUC) and investigate which regions of the objective trade-off surface favour high-scoring AUC solutions. We show that the MOGP approach simultaneously evolves a diverse set of well-performing classifiers along the Pareto front, whereas canonical GP finds only a single solution on the objective trade-off surface, and that on some problems the MOGP solutions achieved better AUC than solutions evolved with canonical GP using hand-crafted fitness functions.
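The bi-objective evaluation described above can be sketched as follows: each classifier is scored separately on the minority and majority class, and only non-dominated score pairs survive as the Pareto front. This is a minimal illustrative sketch, not the authors' implementation; the function names and the dominance check are assumptions.

```python
def class_accuracies(preds, labels, minority=1):
    """Return (minority-class accuracy, majority-class accuracy) for one classifier."""
    min_idx = [i for i, y in enumerate(labels) if y == minority]
    maj_idx = [i for i, y in enumerate(labels) if y != minority]
    acc = lambda idx: sum(preds[i] == labels[i] for i in idx) / len(idx)
    return acc(min_idx), acc(maj_idx)

def pareto_front(points):
    """Keep score pairs that no other pair weakly dominates (maximising both)."""
    return [p for p in points
            if not any(q != p and q[0] >= p[0] and q[1] >= p[1] for q in points)]
```

For example, among the score pairs `(0.9, 0.5)`, `(0.5, 0.9)`, `(0.7, 0.7)` and `(0.6, 0.6)`, only the last is dominated, so the front retains the first three. A single-objective (canonical GP) fitness would instead collapse these trade-offs into one scalar and return one solution.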