A Single Layer Perceptron Approach to Selective Multi-task Learning

  • Authors:
  • Jaisiel Madrid-Sánchez; Miguel Lázaro-Gredilla; Aníbal R. Figueiras-Vidal

  • Affiliations:
  • Department of Signal Theory and Communications, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911, Leganés (Madrid), Spain

  • Venue:
  • IWINAC '07 Proceedings of the 2nd international work-conference on The Interplay Between Natural and Artificial Computation, Part I: Bio-inspired Modeling of Cognitive Tasks
  • Year:
  • 2007

Abstract

A formal definition of task relatedness that theoretically justifies multi-task learning (MTL) improvements has remained elusive. Implementing MTL with multi-layer perceptron (MLP) neural networks gave rise to the notion of related tasks sharing an underlying representation; this assumption can hurt training when tasks are not actually related in that way. In this paper we present a novel single-layer perceptron (SLP) approach that selectively achieves knowledge transfer in a multi-tasking scenario by using a different notion of task relatedness. Experimental results show that the proposed scheme substantially outperforms single-task learning (STL) with single-layer perceptrons, remaining robust even when tasks that are not closely related are present.
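The abstract does not spell out the paper's selective-transfer mechanism, but the general idea of SLP-based multi-task learning can be illustrated with a common coupling scheme: train one perceptron per task while regularizing each task's weight vector toward the mean of all task weights, so that related tasks share information without a common hidden representation. The sketch below is a hypothetical minimal NumPy implementation of that generic idea, not the authors' method; the helper names (`make_task`, `train_mtl`) and the mean-coupling penalty `lam` are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)


def make_task(w_true, n=200):
    """Hypothetical helper: a linearly separable binary task defined by w_true."""
    X = rng.normal(size=(n, 2))
    y = (X @ w_true > 0).astype(float)
    return X, y


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def train_mtl(tasks, lam=0.5, lr=0.1, epochs=200):
    """Train one SLP per task with logistic loss.

    The penalty lam * ||w_t - w_mean||^2 couples the tasks: each task's
    gradient step is pulled toward the mean weight vector, a simple
    (assumed, illustrative) notion of inter-task knowledge transfer.
    """
    W = [np.zeros(X.shape[1]) for X, _ in tasks]
    for _ in range(epochs):
        w_mean = np.mean(W, axis=0)
        for t, (X, y) in enumerate(tasks):
            p = sigmoid(X @ W[t])
            grad = X.T @ (p - y) / len(y) + lam * (W[t] - w_mean)
            W[t] -= lr * grad
    return W


# Two similar (related) tasks: their true weight vectors point the same way.
tasks = [make_task(np.array([1.0, 0.5])), make_task(np.array([0.9, 0.6]))]
W = train_mtl(tasks)
acc = [np.mean((sigmoid(X @ w) > 0.5) == (y > 0.5)) for (X, y), w in zip(tasks, W)]
```

When the tasks are genuinely related, the coupling acts as extra regularization and both per-task accuracies stay high; with unrelated tasks the penalty would drag the weights toward an unhelpful average, which is exactly the failure mode a selective scheme like the paper's is meant to avoid.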