The machine learning community has traditionally used the correct classification rate or accuracy (C) to measure classifier performance and has generally avoided reporting the classification level of each class, especially for problems with more than two classes. C values alone are insufficient because they cannot capture the many contributing factors that differentiate the performance of two classifiers. Receiver Operating Characteristic (ROC) analysis is an alternative that addresses these difficulties, but it can only be used for two-class problems. For this reason, this paper proposes a new approach for analysing classifiers based on two measures: C and sensitivity (S), i.e., the minimum of the accuracies obtained for each class. These measures are optimised through a two-stage evolutionary process that applies two sequential fitness functions: entropy (E) in the first stage and a new fitness function, area (A), in the second. With these fitness functions, the C level is optimised in the first stage, and the S value of the classifier is generally improved in the second stage without significantly reducing C. This two-stage approach improved S values on the generalisation set (whereas an evolutionary algorithm (EA) based only on the S measure obtains worse S levels) and achieved both high C values and good classification levels for each class. The methodology was applied to 16 benchmark classification problems and two complex real-world problems in analytical chemistry and predictive microbiology, obtaining promising results compared to other competitive multi-class classification algorithms and a multi-objective alternative based on E and S.
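To make the two measures concrete, the following is a minimal sketch of how C (overall accuracy) and S (the minimum of the per-class accuracies) can be computed from a set of predictions. The function name and label encoding are illustrative assumptions, not taken from the paper, which defines only the measures themselves.

```python
def accuracy_and_sensitivity(y_true, y_pred, classes):
    """Return (C, S) as defined in the paper:
    C = overall correct classification rate,
    S = minimum of the per-class accuracies.
    Assumes every class in `classes` appears at least once in `y_true`.
    """
    # Overall accuracy C: fraction of all patterns classified correctly.
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    C = correct / len(y_true)

    # Per-class accuracy: fraction of patterns of each class classified correctly.
    per_class = []
    for c in classes:
        idx = [i for i, t in enumerate(y_true) if t == c]
        hits = sum(y_pred[i] == c for i in idx)
        per_class.append(hits / len(idx))

    # Sensitivity S: the worst-classified class drives this measure,
    # which is why optimising C alone can hide poor minority-class performance.
    S = min(per_class)
    return C, S
```

A classifier with high C can still have S close to zero when one class is almost always misclassified, which is the imbalance the paper's second evolutionary stage targets.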