Instance-Based Learning Algorithms
Machine Learning
A Nearest Hyperrectangle Learning Method
Machine Learning
Unifying instance-based and rule-based induction
Machine Learning
Separate-and-Conquer Rule Learning
Artificial Intelligence Review
Reduction Techniques for Instance-Based Learning Algorithms
Machine Learning
Data Mining and Knowledge Discovery with Evolutionary Algorithms
SIA: A Supervised Inductive Algorithm with Genetic Search for Learning Attributes based Concepts
ECML '93 Proceedings of the European Conference on Machine Learning
Introduction to Evolutionary Computing
Data Mining: Practical Machine Learning Tools and Techniques, Second Edition
Feature selection based on rough sets and particle swarm optimization
Pattern Recognition Letters
Statistical Comparisons of Classifiers over Multiple Data Sets
The Journal of Machine Learning Research
Transferring neural network based knowledge into an exemplar-based learner
Neural Computing and Applications
A hybrid case adaptation approach for case-based reasoning
Applied Intelligence
A memetic algorithm for evolutionary prototype selection: A scaling up approach
Pattern Recognition
A version of the NGE model suitable for fuzzy domains
Journal of Intelligent & Fuzzy Systems: Applications in Engineering and Technology
KEEL: a software tool to assess evolutionary algorithms for data mining problems
Soft Computing - A Fusion of Foundations, Methodologies and Applications
Handbook of Parametric and Nonparametric Statistical Procedures
Machine Learning and Data Mining: Introduction to Principles and Algorithms
Improved heterogeneous distance functions
Journal of Artificial Intelligence Research
Evolutionary undersampling for classification with imbalanced datasets: Proposals and taxonomy
Evolutionary Computation
A survey of evolutionary algorithms for clustering
IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews
A First Approach to Nearest Hyperrectangle Selection by Evolutionary Algorithms
ISDA '09 Proceedings of the 2009 Ninth International Conference on Intelligent Systems Design and Applications
Information Sciences: an International Journal
Honey Bees Mating Optimization algorithm for financial classification problems
Applied Soft Computing
Multiple Instance Learning with Multiple Objective Genetic Programming for Web Mining
Applied Soft Computing
Using evolutionary algorithms as instance selection for data reduction in KDD: an experimental study
IEEE Transactions on Evolutionary Computation
Nearest neighbor pattern classification
IEEE Transactions on Information Theory
The nested generalized exemplar (NGE) theory accomplishes learning by storing objects in Euclidean n-space as hyperrectangles. New data are classified by computing their distance to the nearest "generalized exemplar", or hyperrectangle. This learning method combines distance-based classification with the axis-parallel rectangle representation employed in most rule-learning systems. In this paper, we propose the use of evolutionary algorithms to select the most influential hyperrectangles, obtaining accurate and simple models for classification tasks. The proposal has been compared with the most representative models based on hyperrectangle learning, such as BNGE, RISE, INNER, and the SIA genetics-based learning approach. Our approach is also very competitive with classical rule induction algorithms such as C4.5Rules and RIPPER. The results have been contrasted through non-parametric statistical tests over multiple data sets and indicate that our approach outperforms the others in terms of accuracy while requiring fewer hyperrectangles to be stored, thus yielding simpler models than previous NGE approaches. Larger data sets have also been tackled with promising outcomes.
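As a rough illustration of the distance-based classification the abstract describes (a minimal sketch, not the paper's implementation), the standard point-to-hyperrectangle distance is zero for points inside the rectangle and the Euclidean distance to the nearest face otherwise; the function names and data layout below are assumptions:

```python
def rect_distance(point, lower, upper):
    """Distance from a point to an axis-parallel hyperrectangle.

    Zero if the point lies inside the rectangle; otherwise the
    Euclidean distance to the closest face, accumulated per axis.
    """
    d2 = 0.0
    for x, lo, hi in zip(point, lower, upper):
        if x < lo:
            d2 += (lo - x) ** 2
        elif x > hi:
            d2 += (x - hi) ** 2
    return d2 ** 0.5


def classify(point, rectangles):
    """Assign the label of the nearest generalized exemplar.

    rectangles: iterable of (lower, upper, label) triples; an
    evolutionary algorithm, as proposed in the paper, would search
    over which subset of such rectangles to retain.
    """
    lower, upper, label = min(
        rectangles, key=lambda r: rect_distance(point, r[0], r[1])
    )
    return label
```

A query point falling inside a hyperrectangle gets that rectangle's label immediately (distance zero); ties between overlapping or equidistant rectangles would need an extra resolution rule in a full NGE system.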