Variations in gesturing and speech by GESTYLE

  • Authors:
  • Han Noot; Zsófia Ruttkay

  • Affiliations:
  • Center for Mathematics and Computer Science, INS, Kruislaan 413, 1090 GB Amsterdam, The Netherlands; University of Twente, EWi-HMI, P.O. Box 217, 7500 AE Enschede, The Netherlands

  • Venue:
  • International Journal of Human-Computer Studies - Special issue: Subtle expressivity for characters and robots
  • Year:
  • 2005

Abstract

Humans tend to attribute human qualities to computers. It is expected that people can perform cognitive tasks with computers more enjoyably and effectively when they can use their natural communication skills. For these reasons, human-like embodied conversational agents (ECAs) as components of user interfaces have received a lot of attention. It has been shown that the style of an agent's look and behaviour strongly influences the user's attitude. In this paper we discuss our GESTYLE language, which makes it possible to endow ECAs with style. Style is defined in terms of when and how the ECA uses certain gestures and how it modulates its speech (e.g. to indicate emphasis or sadness). GESTYLE tags are also used to annotate the text to be uttered by an ECA, prescribing the hand, head and facial gestures that accompany the speech in order to augment the communication. The annotation ranges from direct, low-level instructions (e.g. perform a specific gesture) to indirect, high-level ones (e.g. take a turn in a conversation), which are interpreted with respect to the defined style. By using style dictionaries and defining different aspects of an ECA, such as age and culture, its behaviour can be tuned to best suit a given user or target group.
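
The abstract does not reproduce the markup syntax itself. As a rough illustration of the idea, a GESTYLE-annotated utterance might combine a style declaration (weighted aspects resolved via style dictionaries) with both high-level and low-level instructions in the text; all element and attribute names below are hypothetical assumptions for this sketch, not the published tag set:

    <!-- Hypothetical GESTYLE-style annotation; tag names are illustrative. -->
    <StyleDeclaration>
      <!-- Weighted aspects; conflicts between dictionaries are resolved by weight. -->
      <Aspect name="culture" value="Italian" weight="0.7"/>
      <Aspect name="age" value="elderly" weight="0.3"/>
    </StyleDeclaration>

    <TurnTake/>                            <!-- indirect, high-level instruction -->
    Well, <Emphasis>that</Emphasis> is     <!-- modulates speech for emphasis -->
    <Gesture name="beat" hand="right">     <!-- direct, low-level instruction -->
      exactly
    </Gesture>
    what I meant.

Under this scheme, the same annotated text would be rendered differently for different style declarations: the high-level tags are mapped to concrete gestures and speech modulations according to the dictionaries selected by the ECA's declared aspects.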