Consolidation using context-sensitive multiple task learning

  • Authors:
  • Ben Fowler; Daniel L. Silver

  • Affiliations:
  • Jodrey School of Computer Science, Acadia University, Wolfville, NS, Canada; Jodrey School of Computer Science, Acadia University, Wolfville, NS, Canada

  • Venue:
  • Canadian AI'11: Proceedings of the 24th Canadian Conference on Advances in Artificial Intelligence
  • Year:
  • 2011


Abstract

Machine lifelong learning (ML3) is concerned with machines that are capable of learning and retaining knowledge over time and of exploiting that knowledge to assist new learning. An ML3 system must accurately retain knowledge of prior tasks while consolidating knowledge of new tasks, thereby overcoming the stability-plasticity problem. A system is presented that uses a context-sensitive multiple task learning (csMTL) neural network. csMTL uses a single output node and additional context inputs that associate each training example with its task. A csMTL-based ML3 system is analyzed empirically on synthetic and real domains. The experiments focus on the effective retention and consolidation of task knowledge using both functional and representational transfer. The results indicate that combining the two methods of transfer best retains prior knowledge, but at the cost of less effective consolidation of the new task.
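
For readers unfamiliar with the architecture, the sketch below illustrates the csMTL idea described in the abstract: primary inputs are augmented with one-hot task-context inputs, all tasks share the hidden representation, and a single output node produces the prediction. This is not the authors' code; the PyTorch framing, layer sizes, and activation choices are illustrative assumptions.

```python
# Minimal csMTL-style network sketch (assumed details, not the paper's implementation).
import torch
import torch.nn as nn

class CsMTLNet(nn.Module):
    def __init__(self, n_primary: int, n_tasks: int, n_hidden: int = 20):
        super().__init__()
        self.n_tasks = n_tasks
        # Context inputs (one per task) are appended to the primary inputs;
        # the hidden layer is shared across tasks and there is a single output node.
        self.net = nn.Sequential(
            nn.Linear(n_primary + n_tasks, n_hidden),
            nn.Sigmoid(),
            nn.Linear(n_hidden, 1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor, task_id: torch.Tensor) -> torch.Tensor:
        # One-hot context inputs associate each example with its task.
        context = nn.functional.one_hot(task_id, self.n_tasks).float()
        return self.net(torch.cat([x, context], dim=1))

# Usage: 10 primary inputs, 3 tasks; a batch of 5 examples from task 2.
model = CsMTLNet(n_primary=10, n_tasks=3)
x = torch.randn(5, 10)
task = torch.full((5,), 2, dtype=torch.long)
y_hat = model(x, task)  # shape (5, 1): one prediction per example
```

Because all tasks share one output and one set of weights, knowledge of a prior task can be retained either by replaying its (virtual) examples through the network (functional transfer) or by starting consolidation from the existing shared weights (representational transfer), the two mechanisms compared in the paper's experiments.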