This paper proposes a new method for constructing binary classification trees. The aim is to build simple trees, i.e. trees that are as uncomplicated as possible, thereby facilitating interpretation and favouring the balance between fit to the training data and generalization to the test data. The proposed method is based on the metaheuristic strategy known as GRASP, used in conjunction with optimization tasks. Essentially, the method modifies the criterion for selecting the attribute that determines the split at each node: a controlled amount of randomisation is incorporated into the choice. We compare our method with the traditional greedy approach by means of a set of computational experiments, and conclude that the GRASP method, for small levels of randomness, significantly reduces tree complexity without decreasing classification accuracy.
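To illustrate the idea of GRASP-style split selection, the sketch below shows how the attribute choice at a node could be randomised in a controlled way. This is a minimal illustration, not the paper's actual algorithm: it assumes information gain as the split-quality score and categorical attributes, and the function names (`information_gain`, `grasp_select_attribute`) and the `alpha` parameter are hypothetical. The key element is the restricted candidate list (RCL): instead of always taking the best-scoring attribute, the split is drawn uniformly from the attributes whose score is within a fraction `alpha` of the best, so `alpha = 0` recovers the usual greedy choice and larger values inject more randomness.

```python
import math
import random

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    total = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(rows, labels, attr):
    """Gain of splitting on a categorical attribute `attr`."""
    base = entropy(labels)
    parts = {}
    for row, y in zip(rows, labels):
        parts.setdefault(row[attr], []).append(y)
    n = len(labels)
    return base - sum(len(ys) / n * entropy(ys) for ys in parts.values())

def grasp_select_attribute(rows, labels, attrs, alpha=0.1, rng=random):
    """GRASP-style split selection (illustrative sketch).

    Build a restricted candidate list (RCL) of attributes whose gain
    is within `alpha` of the range between the best and worst gains,
    then pick one uniformly at random.  alpha = 0 reduces to the
    greedy best-gain choice used by traditional tree induction.
    """
    gains = {a: information_gain(rows, labels, a) for a in attrs}
    best = max(gains.values())
    worst = min(gains.values())
    threshold = best - alpha * (best - worst)
    rcl = [a for a, g in gains.items() if g >= threshold]
    return rng.choice(rcl)
```

For example, with `alpha = 0` this always returns the highest-gain attribute, while `alpha = 1` makes every attribute a candidate; intermediate values trade off greediness against diversification, which is how small levels of randomness can yield simpler trees over repeated constructions.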