Non-strict heterogeneous Stacking

  • Authors:
  • Niall Rooney; David Patterson; Chris Nugent

  • Affiliations:
  • School of Computing and Mathematics, Faculty of Engineering, University of Ulster at Jordanstown, Newtownabbey BT37 0QB, United Kingdom (all authors)

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2007


Abstract

In this paper, we evaluate a new ensemble scheme for regression in which the ensemble is composed of a number of models, each built on feature-sampled data using a learning algorithm drawn from a set of simple and stable learning algorithms, with Stacking as the ensemble integration method. We compare this scheme, referred to as non-strict heterogeneous Stacking, against a number of baseline methods and against strict heterogeneous Stacking, which uses exactly as many models as there are base learning algorithms, each built on un-sampled data. We demonstrate that, for the set of base learning algorithms evaluated, non-strict Stacking strongly outperformed the baseline methods. In addition, the added flexibility of non-strict Stacking allowed it to outperform both strict Stacking and homogeneous Stacking for the same set of base learning algorithms. Finally, we discuss the general conditions under which non-strict heterogeneous Stacking is likely to be advantageous.
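The scheme described in the abstract can be sketched in plain Python. Everything below is a minimal illustration, not the paper's experimental setup: univariate least-squares fits stand in for the "simple, stable" base learners, each base model sees one randomly sampled feature (the non-strict part: more models than learning algorithms or features), and a linear meta-learner fit by stochastic gradient descent performs the Stacking integration over out-of-fold level-1 predictions. The data, fold scheme, model count, and learner choices are all assumptions for the sake of a runnable toy.

```python
import random

def fit_univariate(xs, ys):
    """Closed-form simple linear regression y ~ a*x + b (a simple, stable learner)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    var = sum((x - mx) ** 2 for x in xs) or 1e-12
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / var
    return a, my - a * mx

def stacking_fit(X, y, n_models=6, n_folds=3, seed=0):
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    # Feature sampling: each base model is trained on one sampled feature.
    # (Every feature is covered at least once so this toy stays solvable.)
    feats = list(range(d)) + [rng.randrange(d) for _ in range(n_models - d)]
    # Level-1 data for Stacking: out-of-fold base predictions plus a constant.
    meta = [[0.0] * n_models + [1.0] for _ in range(n)]
    folds = [list(range(i, n, n_folds)) for i in range(n_folds)]
    for hold in folds:
        train = [i for i in range(n) if i not in hold]
        for m, f in enumerate(feats):
            a, b = fit_univariate([X[i][f] for i in train], [y[i] for i in train])
            for i in hold:
                meta[i][m] = a * X[i][f] + b
    # Refit each base model on all the data for prediction time.
    base = [fit_univariate([row[f] for row in X], y) for f in feats]
    # Stacking integration: linear meta-learner fit by SGD on the level-1 data.
    w = [0.0] * (n_models + 1)
    for _ in range(2000):
        for mi, yi in zip(meta, y):
            err = sum(wj * mj for wj, mj in zip(w, mi)) - yi
            w = [wj - 0.01 * err * mj for wj, mj in zip(w, mi)]
    return feats, base, w

def stacking_predict(model, x):
    feats, base, w = model
    level1 = [a * x[f] + b for f, (a, b) in zip(feats, base)] + [1.0]
    return sum(wj * pj for wj, pj in zip(w, level1))

# Toy regression target y = 2*x0 + 3*x1 on 60 random points.
rng = random.Random(1)
X = [[rng.random(), rng.random()] for _ in range(60)]
y = [2 * xi[0] + 3 * xi[1] for xi in X]
model = stacking_fit(X, y)
```

Because each base model sees only one feature, no single model can fit the target, but the linear meta-learner can recombine the feature-sampled models into an accurate ensemble; this is the added flexibility the abstract attributes to non-strict Stacking.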