Hinging hyperplane models for multiple predicted variables

  • Authors:
  • Anca Maria Ivanescu; Philipp Kranen; Thomas Seidl

  • Affiliations:
  • RWTH Aachen University, Germany (all authors)

  • Venue:
  • SSDBM '12: Proceedings of the 24th International Conference on Scientific and Statistical Database Management
  • Year:
  • 2012

Abstract

Model-based learning for predicting continuous values involves building an explicit generalization of the training data. Simple linear regression and piecewise linear regression techniques are well suited for this task because, unlike neural networks, they yield an interpretable model. The hinging hyperplane approach is a nonlinear learning technique that computes a continuous model consisting of linear submodels over individual partitions of the regressor space. However, it is designed for only one predicted variable. With r predicted variables, the number of partitions grows quickly with r and the result is no longer compact or interpretable. We propose a generalization of the hinging hyperplane approach to several predicted variables. The algorithm considers all predicted variables simultaneously, enforcing common hinges while at the same time restoring the continuity of the resulting functions. The model complexity no longer depends on the number of predicted variables, so the model remains compact and interpretable.
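To illustrate the basic building block the abstract refers to, the following is a minimal sketch of a single hinge function in the style of Breiman's hinging hyperplanes: the pointwise maximum of two affine submodels, which is piecewise linear yet continuous across the hinge where the two hyperplanes meet. The function and parameter names are illustrative, not taken from the paper; the paper's contribution (common hinges across several predicted variables) builds on this primitive.

```python
import numpy as np

def hinge(X, theta_plus, theta_minus):
    """Evaluate one hinge function max(x'theta_plus, x'theta_minus).

    X           : (n, d) array of regressor vectors.
    theta_plus  : (d + 1,) coefficients of the first affine submodel
                  (last entry is the bias term).
    theta_minus : (d + 1,) coefficients of the second affine submodel.

    The hyperplane where the two submodels are equal (the "hinge")
    partitions the regressor space into two regions; taking the maximum
    keeps the overall model continuous across that boundary.
    """
    Xa = np.column_stack([X, np.ones(len(X))])  # append bias column
    return np.maximum(Xa @ theta_plus, Xa @ theta_minus)

# 1-D example: theta_plus = x, theta_minus = -x gives the hinge model |x|,
# a continuous piecewise-linear function with its hinge at x = 0.
x = np.array([[-3.0], [0.0], [2.0]])
y = hinge(x, np.array([1.0, 0.0]), np.array([-1.0, 0.0]))
```

For r predicted variables, fitting r such models independently would generally place r different hinges per region; the approach described in the abstract instead forces all r output functions to share a common hinge, so the partitioning of the regressor space stays the same regardless of r.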