Large margin principle in hyperrectangle learning

  • Authors:
  • Matthias Kirmse; Uwe Petersohn

  • Venue:
  • Neurocomputing
  • Year:
  • 2014

Abstract

In this paper, we propose a new meta-learning approach that incorporates the large margin principle into hyperrectangle-based learning. The goal of Large Margin Rectangle Learning (LMRL) is to combine the natural interpretability of hyperrectangle models, such as decision trees and rule learners, with the risk-minimization property associated with the large margin principle. Our approach consists of two basic steps: supervised clustering and decision boundary creation. In the first step, we apply a supervised clustering algorithm to generate an initial rectangle-based generalization of the training data. Subsequently, these labeled clusters are used to produce a large margin hyperrectangle model. Beyond the overall approach, we also developed Large Margin Supervised Clustering (LMSC), an attempt to introduce the large margin principle directly into the supervised clustering process. The corresponding experiments not only provided empirical evidence for the supposed margin-accuracy relation, but also showed that LMRL performs as well as or better than the decision tree and rule learners it was compared with. Altogether, this new learning approach is a promising way to create more accurate interpretable models.
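The two-step pipeline described in the abstract can be sketched as follows. This is a minimal illustrative stand-in, not the authors' algorithm: `supervised_clusters` replaces the supervised clustering step with a single axis-aligned bounding rectangle per class, and `enlarge_with_margin` replaces the decision boundary creation step by pushing each rectangle face to the midpoint of the gap to the nearest opposing rectangle, so the resulting boundary lies at equal distance from both classes (the large margin idea).

```python
import numpy as np

def supervised_clusters(X, y):
    """Step 1 (toy stand-in for supervised clustering):
    one axis-aligned bounding rectangle (min, max) per class."""
    rects = {}
    for label in np.unique(y):
        pts = X[y == label]
        rects[label] = (pts.min(axis=0), pts.max(axis=0))
    return rects

def enlarge_with_margin(rects):
    """Step 2 (toy stand-in for decision boundary creation):
    move each rectangle face outward to the midpoint of the gap
    to any opposing rectangle in that dimension."""
    labels = list(rects)
    out = {}
    for a in labels:
        lo, hi = rects[a][0].copy(), rects[a][1].copy()
        for b in labels:
            if b == a:
                continue
            blo, bhi = rects[b]
            for d in range(len(lo)):
                if bhi[d] < lo[d]:        # b lies below a in dimension d
                    lo[d] = (lo[d] + bhi[d]) / 2
                elif blo[d] > hi[d]:      # b lies above a in dimension d
                    hi[d] = (hi[d] + blo[d]) / 2
        out[a] = (lo, hi)
    return out

# 1-D example: class 0 occupies [0, 1], class 1 occupies [3, 4];
# the enlarged rectangles meet at the midpoint 2.0 of the gap.
X = np.array([[0.0], [1.0], [3.0], [4.0]])
y = np.array([0, 0, 1, 1])
big = enlarge_with_margin(supervised_clusters(X, y))
print(big[0][1][0], big[1][0][0])  # → 2.0 2.0
```

In this sketch the margin is split evenly between neighboring rectangles in each dimension; the paper's LMRL and LMSC procedures derive the large margin boundaries from the labeled clusters rather than from per-class bounding boxes.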