Learning the Dynamics of Embedded Clauses

  • Authors:
  • Mikael Bodén; Alan Blair

  • Affiliations:
  • School of Information Technology and Electrical Engineering, University of Queensland, 4072, Australia. mikael.itee@uq.edu.au; School of Computer Science and Engineering, University of New South Wales, 2052, Australia. blair@cse.unsw.edu.au

  • Venue:
  • Applied Intelligence
  • Year:
  • 2003

Abstract

Recent work by Siegelmann has shown that the computational power of recurrent neural networks matches that of Turing machines. One important implication is that complex language classes (infinite languages with embedded clauses) can be represented in neural networks. The proofs rely on a fractal encoding of states to simulate the memory and operations of stacks.

In the present work, it is shown that similar stack-like dynamics can be learned by recurrent neural networks from simple sequence prediction tasks. Two main types of network solutions are found and described qualitatively as dynamical systems: damped oscillation and entangled spiraling around fixed points. The potential and limitations of each solution type are established in terms of generalization on two different context-free languages. Both solution types constitute novel stack implementations, broadly in line with Siegelmann's theoretical work, which supply insights into how the embedded structures of languages can be handled in analog hardware.
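
For reference, the fractal encoding the abstract alludes to is usually the base-4 Cantor-set stack encoding used in Siegelmann-style constructions (the standard construction, not necessarily the exact encoding analysed in this paper): a binary stack with contents w1 w2 ... wn (top first) is stored as a single activation q = sum over i of (2*wi + 1)/4^i, so pushing a bit b maps q to q/4 + (2b + 1)/4, the top bit is read off from whether q lies in [1/4, 1/2) or [3/4, 1), and popping maps q back to 4q - (2*w1 + 1).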
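
As a purely illustrative sketch of the kind of simple sequence prediction task described above, the following Python/PyTorch snippet trains a small Elman-style recurrent network to predict the next symbol of strings from the context-free language a^n b^n. Every concrete choice here (the three-symbol alphabet with an end marker "$", the hidden size of 4, the maximum depth, the training schedule, and the names anbn_string, make_batch, ElmanPredictor) is an assumption made for this sketch, not taken from the paper.

```python
# Hypothetical sketch (not the authors' code): an Elman-style RNN trained to
# predict the next symbol of strings from the context-free language a^n b^n.
import torch
import torch.nn as nn

SYMBOLS = {"a": 0, "b": 1, "$": 2}   # "$" marks the end of a string (an assumption)

def anbn_string(n):
    """Return the symbol indices of a^n b^n followed by the end marker."""
    return [SYMBOLS["a"]] * n + [SYMBOLS["b"]] * n + [SYMBOLS["$"]]

def make_batch(max_n=8):
    """One-hot input sequences and next-symbol targets for depths 1..max_n."""
    xs, ys = [], []
    for n in range(1, max_n + 1):
        seq = anbn_string(n)
        xs.append(torch.eye(3)[seq[:-1]])   # inputs: all but the last symbol
        ys.append(torch.tensor(seq[1:]))    # targets: all but the first symbol
    return xs, ys

class ElmanPredictor(nn.Module):
    def __init__(self, hidden=4):
        super().__init__()
        self.rnn = nn.RNN(input_size=3, hidden_size=hidden)
        self.out = nn.Linear(hidden, 3)

    def forward(self, x):
        h, _ = self.rnn(x.unsqueeze(1))     # hidden states, shape (T, 1, hidden)
        return self.out(h.squeeze(1))       # next-symbol logits, shape (T, 3)

model = ElmanPredictor()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

xs, ys = make_batch()
for epoch in range(2000):
    total = 0.0
    for x, y in zip(xs, ys):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
        total += loss.item()
    if (epoch + 1) % 500 == 0:
        print(f"epoch {epoch + 1}: total loss {total:.3f}")
```

After training, plotting the hidden-state trajectory of the network on a long a^n b^n string is one way to look for the damped-oscillation or spiraling dynamics that the abstract describes.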