A gesture-based concept for speech movement control in articulatory speech synthesis

  • Authors:
  • Bernd J. Kröger; Peter Birkholz

  • Affiliations:
  • Department of Phoniatrics, Pedaudiology and Communication Disorders, University Hospital Aachen and Aachen University, Aachen, Germany; Institute for Computer Science, University of Rostock, Rostock, Germany

  • Venue:
  • COST 2102'07: Proceedings of the 2007 COST Action 2102 International Conference on Verbal and Nonverbal Communication Behaviours
  • Year:
  • 2007

Abstract

This paper introduces and discusses an articulatory speech synthesizer comprising a three-dimensional vocal tract model and a gesture-based concept for the control of articulatory movements. A modular, perception-based learning concept is outlined for the creation of gestural control rules. This learning concept draws on sensory feedback information for articulatory states produced by the model itself, as well as on auditory and visual information from speech items produced by external speakers. The complete model (control module and synthesizer) is capable of producing high-quality synthetic speech signals and provides a framework for modeling natural speech production and speech perception processes.