Lazy meta-learning: creating customized model ensembles on demand

  • Authors:
  • Piero P. Bonissone

  • Affiliations:
  • GE Global Research Center, Niskayuna, NY

  • Venue:
  • WCCI'12: Proceedings of the 2012 IEEE World Congress on Computational Intelligence (Advances in Computational Intelligence)
  • Year:
  • 2012


Abstract

In the not-so-distant future, we expect analytic models to become a commodity. We envision having access to a large number of data-driven models, obtained by a combination of crowdsourcing, crowdservicing, cloud-based evolutionary algorithms, outsourcing, in-house development, and legacy models. In this new context, the critical question will be model ensemble selection and fusion, rather than model generation. We address this issue by proposing customized model ensembles on demand, inspired by Lazy Learning. In our approach, referred to as Lazy Meta-Learning, for a given query we find the most relevant models in a database of models, using their meta-information. After retrieving the relevant models, we select a subset of models with highly uncorrelated errors. With these models we create an ensemble and use their meta-information for dynamic bias compensation and relevance weighting. The output is a weighted interpolation or extrapolation of the outputs of the models in the ensemble. Furthermore, the confidence interval around the output shrinks as we increase the number of uncorrelated models in the ensemble. We have successfully tested this approach in a power plant management application.
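
The abstract outlines a query-time workflow: retrieve relevant models from a model database using their meta-information, prune to a subset with weakly correlated errors, then fuse their bias-compensated outputs with relevance weights. The sketch below illustrates that flow under stated assumptions; all names (ModelRecord, relevance, select_uncorrelated, the inverse-distance relevance measure, and the greedy decorrelation rule) are hypothetical illustrations, not the paper's implementation.

```python
# Hypothetical sketch of the Lazy Meta-Learning fusion step described in the abstract.
from dataclasses import dataclass
from typing import Callable, List, Sequence
import numpy as np


@dataclass
class ModelRecord:
    predict: Callable[[np.ndarray], float]  # the stored model itself
    center: np.ndarray                      # meta-info: center of its training region
    bias: float                             # meta-info: local bias estimate
    residuals: np.ndarray                   # meta-info: validation residuals


def relevance(query: np.ndarray, rec: ModelRecord) -> float:
    """Relevance weight from meta-information (assumed: inverse distance to the
    model's training-region center)."""
    return 1.0 / (1.0 + np.linalg.norm(query - rec.center))


def select_uncorrelated(candidates: List[ModelRecord], max_corr: float = 0.7,
                        max_size: int = 10) -> List[ModelRecord]:
    """Greedily keep models whose validation residuals are weakly correlated with
    those already selected (a stand-in for the error-decorrelation step)."""
    selected: List[ModelRecord] = []
    for rec in candidates:
        if all(abs(np.corrcoef(rec.residuals, s.residuals)[0, 1]) < max_corr
               for s in selected):
            selected.append(rec)
        if len(selected) >= max_size:
            break
    return selected


def lazy_meta_predict(query: np.ndarray, model_db: Sequence[ModelRecord],
                      n_retrieve: int = 30) -> float:
    # 1. Retrieve the most relevant models for this query from the model database.
    ranked = sorted(model_db, key=lambda r: relevance(query, r), reverse=True)
    candidates = ranked[:n_retrieve]
    # 2. Keep a subset whose errors are mutually uncorrelated.
    ensemble = select_uncorrelated(candidates)
    # 3. Fuse: bias-compensated, relevance-weighted combination of the outputs.
    weights = np.array([relevance(query, r) for r in ensemble])
    outputs = np.array([r.predict(query) - r.bias for r in ensemble])
    return float(np.dot(weights, outputs) / weights.sum())
```

This is only a schematic reading of the abstract: the paper's actual meta-information, decorrelation criterion, and bias-compensation scheme may differ.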