Cross-Modal Interactions between Audition, Touch, and Vision in Endogenous Spatial Attention: ERP Evidence on Preparatory States and Sensory Modulations

  • Authors:
  • Martin Eimer; José Van Velzen; Jon Driver

  • Affiliations:
  • Birkbeck College London; Birkbeck College London; University College London

  • Venue:
  • Journal of Cognitive Neuroscience
  • Year:
  • 2002

Abstract

Recent behavioral and event-related brain potential (ERP) studies have revealed cross-modal interactions in endogenous spatial attention between vision and audition, plus vision and touch. The present ERP study investigated whether these interactions reflect supramodal attentional control mechanisms, and whether similar cross-modal interactions also exist between audition and touch. Participants directed attention to the side indicated by a cue to detect infrequent auditory or tactile targets at the cued side. The relevant modality (audition or touch) was blocked. Attentional control processes were reflected in systematic ERP modulations elicited during cued shifts of attention. An anterior negativity contralateral to the cued side was followed by a contralateral positivity at posterior sites. These effects were similar whether the cue signaled which side was relevant for audition or for touch. They also resembled previously observed ERP modulations for shifts of visual attention, thus implicating supramodal mechanisms in the control of spatial attention. Following each cue, single auditory, tactile, or visual stimuli were presented at the cued or uncued side. Although stimuli in task-irrelevant modalities could be completely ignored, visual and auditory ERPs were nevertheless affected by spatial attention when touch was relevant, revealing cross-modal interactions. When audition was relevant, visual ERPs, but not tactile ERPs, were affected by spatial attention, indicating that touch can be decoupled from cross-modal attention when task-irrelevant.