Journal of Cognitive Neuroscience
On-line Integration of Semantic Information from Speech and Gesture
During language comprehension, listeners use the global semantic representation built from the previous sentence or discourse context to immediately integrate the meaning of each upcoming word into the unfolding message-level representation. Here we investigate whether communicative gestures, which often spontaneously co-occur with speech, are processed in a similar fashion and integrated with the preceding sentence context in the same way as lexical meaning. Event-related potentials were measured while subjects listened to spoken sentences containing a critical verb (e.g., knock) accompanied by an iconic co-speech gesture (i.e., KNOCK). The verbal and/or gestural semantic content matched or mismatched the content of the preceding part of the sentence. Despite the differences in modality and in the specificity of meaning conveyed by spoken words and gestures, the latency, amplitude, and topographical distribution of the mismatch effects were similar for words and gestures, indicating that the brain integrates both types of information simultaneously. This provides evidence for the claim that neural processing during language comprehension involves the simultaneous incorporation of information from a broader domain of cognition than verbal semantics alone. The neural evidence for similar integration of information from speech and gesture underscores the tight interconnection between speech and co-speech gestures.