Evolutionary selection of hyperrectangles in nested generalized exemplar learning
Applied Soft Computing
Large margin principle in hyperrectangle learning
Neurocomputing
The nested generalized exemplar theory accomplishes learning by storing objects in Euclidean n-space as hyperrectangles. New data are classified by computing their distance to the nearest “generalized exemplar”, i.e., hyperrectangle. This learning method makes it possible to combine distance-based classification with the axis-parallel rectangle representation employed in most rule-learning systems. This contribution proposes the use of evolutionary algorithms to select the most influential hyperrectangles, yielding accurate and simple models in classification tasks. The proposal is compared with the most representative nearest-hyperrectangle learning approaches; the results show that the evolutionary proposal outperforms them in accuracy while requiring fewer stored hyperrectangles.
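The classification rule described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names are hypothetical, and actual nearest-hyperrectangle learners typically also normalize features and may weight exemplars. The distance to an axis-parallel hyperrectangle is zero for points inside it and otherwise the Euclidean distance to its closest face.

```python
import math

def hyperrect_distance(point, lower, upper):
    # Per dimension: 0 if the coordinate lies within [lo, hi],
    # otherwise the gap to the nearer bound; combine Euclidean-style.
    return math.sqrt(sum(
        max(lo - x, 0.0, x - hi) ** 2
        for x, lo, hi in zip(point, lower, upper)
    ))

def classify(point, exemplars):
    # exemplars: list of (lower, upper, label) generalized exemplars,
    # e.g. the subset kept after evolutionary selection.
    lower, upper, label = min(
        exemplars,
        key=lambda e: hyperrect_distance(point, e[0], e[1]),
    )
    return label

# Usage: two exemplars in 2-D space.
exemplars = [((0, 0), (1, 1), "a"), ((2, 2), (3, 3), "b")]
print(classify((0.5, 0.5), exemplars))  # inside the first rectangle -> "a"
print(classify((2.9, 2.1), exemplars))  # inside the second rectangle -> "b"
```

The evolutionary component would then search over binary masks selecting which exemplars enter `exemplars`, trading off accuracy against the number of stored hyperrectangles.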