An Embodied Conversational Agent (ECA) is a user-interface metaphor that enables natural communication of information during human-computer interaction across synergistic modality dimensions, including voice, gesture, emotion, and text. Owing to their anthropomorphic representation and their ability to express humanlike behavior, ECAs are becoming popular interface front-ends for dialog and conversational applications. One important prerequisite for efficient authoring of such ECA-based applications is a suitable programming language that exploits the expressive possibilities of multimodally blended messages conveyed to the user. In this paper, we present ECAF, an architecture and interaction language that we have used to author several ECA-based applications. We also report feedback from usability testing that we carried out to gauge user acceptance of several multimodal blending strategies.
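To make the idea of a multimodally blended message concrete, the sketch below composes a single message that pairs speech with a gesture and an emotion cue and serializes it to an XML-like authoring markup. The channel names and the `BlendedMessage` model are hypothetical illustrations, not ECAF's actual syntax:

```python
# Hypothetical sketch of channel-based multimodal blending
# (illustrative only; not the actual ECAF language).
from dataclasses import dataclass, field
from typing import List

@dataclass
class Channel:
    name: str      # modality, e.g. "voice", "gesture", "emotion"
    content: str   # channel payload: text to speak, gesture id, ...

@dataclass
class BlendedMessage:
    channels: List[Channel] = field(default_factory=list)

    def add(self, name: str, content: str) -> "BlendedMessage":
        # Append one modality channel; returns self for chaining.
        self.channels.append(Channel(name, content))
        return self

    def to_markup(self) -> str:
        # Serialize all channels into one blended markup message.
        inner = "".join(f"<{c.name}>{c.content}</{c.name}>"
                        for c in self.channels)
        return f"<message>{inner}</message>"

msg = (BlendedMessage()
       .add("voice", "Welcome back!")
       .add("gesture", "wave")
       .add("emotion", "smile"))
print(msg.to_markup())
```

An authoring language in this spirit lets a single statement drive several output channels at once, which is what makes blending strategies (e.g. gesture timed with speech) expressible at the application level.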