On The Proper Treatment of Semantic Systematicity

  • Authors:
  • Robert F. Hadley

  • Affiliations:
  • School of Computing Science, and Cognitive Science Program, Simon Fraser University, Burnaby, B.C., Canada V5A 1S6; E-mail: hadley@cs.sfu.ca

  • Venue:
  • Minds and Machines
  • Year:
  • 2004

Abstract

The past decade has witnessed the emergence of a novel stance on semantic representation and its relationship to context sensitivity. Connectionist-minded philosophers, including Clark and van Gelder, have espoused the merits of viewing hidden-layer, context-sensitive representations as possessing semantic content, where this content is partially revealed via the representations’ position in vector space. In recent work, Bodén and Niklasson have incorporated a variant of this view of semantics within their conception of semantic systematicity. Moreover, Bodén and Niklasson contend that they have produced experimental results which not only satisfy a kind of context-based semantic systematicity, but which, to the degree that reality permits, effectively deal with challenges posed by Fodor and Pylyshyn (1988) and Hadley (1994a). The latter challenge involved well-defined criteria for strong semantic systematicity. This paper examines the relevant claims and experiments of Bodén and Niklasson. It is argued that their case fatally involves two fallacies of equivocation: one concerning ‘semantic content’ and the other concerning ‘novel test sentences’. In addition, it is argued that their ultimate construal of context-sensitive semantics contains serious confusions. These confusions are also found in certain publications dealing with ‘latent semantic analysis’. Thus, the criticisms presented here have relevance beyond the work of Bodén and Niklasson.