Exploring the Parameter State Space of Stacking

  • Authors:
  • Alexander K. Seewald


  • Venue:
  • ICDM '02 Proceedings of the 2002 IEEE International Conference on Data Mining
  • Year:
  • 2002

Abstract

Ensemble learning schemes are a new field in data mining. While current research concentrates mainly on improving the performance of single learning algorithms, an alternative is to combine learners with different biases. Stacking is the best-known such scheme, which tries to combine learners' predictions or confidences via another learning algorithm. However, the adoption of Stacking into the data mining community is hampered by its large parameter space, consisting mainly of other learning algorithms: (1) the set of learning algorithms to combine, (2) the meta-learner responsible for the combining, and (3) the type of meta-data to use: confidences or predictions. None of these parameters are obvious choices. Furthermore, little is known about the relation between parameter settings and performance of Stacking. By exploring all of Stacking's parameter settings and their interdependencies, we intend to make Stacking a suitable choice for mainstream data mining applications.
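The three parameter choices the abstract enumerates can be illustrated with a minimal sketch (not the paper's own code) using scikit-learn's `StackingClassifier`: the base learners, the meta-learner, and the meta-data type, where `stack_method="predict_proba"` corresponds to combining confidences and `stack_method="predict"` to combining class predictions. The particular learners chosen here are illustrative assumptions, not the paper's experimental setup.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# (1) the set of base learners with different biases
base_learners = [
    ("tree", DecisionTreeClassifier(random_state=0)),
    ("nb", GaussianNB()),
]

# (2) the meta-learner that combines the base learners' outputs
# (3) the meta-data type: "predict_proba" = confidences, "predict" = predictions
stack = StackingClassifier(
    estimators=base_learners,
    final_estimator=LogisticRegression(max_iter=1000),
    stack_method="predict_proba",
    cv=5,
)
stack.fit(X, y)
```

Exploring Stacking's parameter space then amounts to varying these three arguments and measuring the resulting cross-validated performance.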