Mixture of random prototype-based local experts

  • Authors:
  • Giuliano Armano; Nima Hatami

  • Affiliations:
  • DIEE - Department of Electrical and Electronic Engineering, University of Cagliari, Cagliari, Italy

  • Venue:
  • HAIS'10 Proceedings of the 5th international conference on Hybrid Artificial Intelligence Systems - Volume Part I
  • Year:
  • 2010

Abstract

The Mixture of Experts (ME) is one of the most popular ensemble methods used in pattern recognition and machine learning. This algorithm stochastically partitions the input space of the problem into a number of subspaces, with each expert becoming specialized in one subspace. The ME uses an expert called the gating network to manage this process, which is trained together with the other experts. In this paper, we propose a modified version of the ME algorithm that first partitions the original problem into centralized regions and then uses a simple distance-based gating function to specialize the expert networks. Each expert contributes to classifying an input sample according to the distance between the input and a prototype embedded by the expert. As a result, an accurate classifier with shorter training time and fewer parameters is achieved. Experimental results on a binary toy problem and selected datasets from the UCI machine learning repository show the robustness of the proposed method compared to the standard ME model.
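A minimal sketch of the idea the abstract describes, under stated assumptions: k-means centroids stand in for the prototypes that define the centralized regions, a softmax over negative distances serves as the distance-based gating function, and weighted logistic regressions play the role of the expert networks. All function names and parameter choices here are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def find_prototypes(X, k, iters=20):
    """Plain k-means; the centroids serve as the experts' prototypes."""
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def gate(X, centers, beta=1.0):
    """Distance-based gating: weight each expert by proximity to its prototype."""
    dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    w = np.exp(-beta * dists)
    return w / w.sum(axis=1, keepdims=True)

def train_expert(X, y, sample_weight, lr=0.5, epochs=300):
    """Weighted logistic regression: the expert mostly fits its own region."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append a bias column
    theta = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-Xb @ theta))
        theta += lr * Xb.T @ (sample_weight * (y - p)) / len(X)
    return theta

def predict(X, centers, experts):
    """Combine expert outputs, weighted by the gating function."""
    G = gate(X, centers)
    Xb = np.hstack([X, np.ones((len(X), 1))])
    P = np.column_stack([1.0 / (1.0 + np.exp(-Xb @ th)) for th in experts])
    return ((G * P).sum(axis=1) > 0.5).astype(int)

# A binary toy problem in the spirit of the experiments: two Gaussian blobs.
X = np.vstack([rng.normal(-2.0, 1.0, size=(50, 2)),
               rng.normal(+2.0, 1.0, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

k = 2
centers = find_prototypes(X, k)
G = gate(X, centers)
experts = [train_expert(X, y, G[:, j]) for j in range(k)]
acc = (predict(X, centers, experts) == y).mean()
```

Because the gate is a fixed function of distance to the prototypes, only the experts need training, which is where the shorter training time and smaller parameter count relative to a standard ME (with its trained gating network) come from.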