Humans tend to attribute human qualities to computers. It is expected that people can perform cognitive tasks with computers more enjoyably and effectively when they can rely on their natural communication skills. For these reasons, human-like embodied conversational agents (ECAs) have received much attention as components of user interfaces. It has been shown that the style of an agent's appearance and behaviour strongly influences the user's attitude. In this paper we discuss our GESTYLE language, which makes it possible to endow ECAs with style. Style is defined in terms of when and how the ECA uses certain gestures and how it modulates its speech (e.g. to indicate emphasis or sadness). GESTYLE also provides tags to annotate the text an ECA is to utter, prescribing the hand, head, and facial gestures that accompany the speech in order to augment communication. The annotation ranges from direct, low-level instructions (e.g. perform a specific gesture) to indirect, high-level ones (e.g. take a turn in a conversation), which are interpreted with respect to the defined style. By using style dictionaries and defining different aspects of an ECA, such as its age and culture, the behaviour of the ECA can be tuned to best suit a given user or target group.
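The resolution step described above, in which a high-level instruction is interpreted against style dictionaries for the ECA's defining aspects, can be sketched as follows. This is a minimal illustrative reconstruction, not the GESTYLE implementation (which is an XML-based markup language): the dictionary contents, the aspect names, and the precedence rule (later aspects override earlier ones) are all assumptions made for the example.

```python
# Hypothetical style dictionaries for two aspects of an ECA's identity.
# In GESTYLE, such aspects (e.g. culture, age) jointly determine which
# concrete gesture realises a high-level instruction; the exact merge
# semantics assumed here are illustrative only.
CULTURE_ITALIAN = {"emphasis": "beat_both_hands", "greeting": "open_arms"}
AGE_ELDERLY = {"emphasis": "head_nod"}  # smaller, less energetic movement

def resolve_gesture(instruction, aspects):
    """Map a high-level instruction (e.g. 'emphasis') to a concrete gesture.

    Aspects are merged in order, with later aspects taking precedence;
    this precedence order is an assumption of the sketch.
    """
    merged = {}
    for aspect in aspects:
        merged.update(aspect)
    return merged.get(instruction)

# An elderly Italian speaker: the age aspect overrides the cultural one
# for 'emphasis', while 'greeting' falls through to the cultural aspect.
print(resolve_gesture("emphasis", [CULTURE_ITALIAN, AGE_ELDERLY]))  # head_nod
print(resolve_gesture("greeting", [CULTURE_ITALIAN, AGE_ELDERLY]))  # open_arms
```

In this sketch a direct, low-level annotation would bypass `resolve_gesture` entirely and name a gesture explicitly, whereas indirect, high-level annotations go through the dictionary lookup and thus vary with the configured style.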