Building natural language generation systems
The automated design of believable dialogues for animated presentation teams
Embodied conversational agents
BEAT: the Behavior Expression Animation Toolkit
Proceedings of the 28th annual conference on Computer graphics and interactive techniques
Natural Language Processing and User Modeling: Synergies and Limitations
User Modeling and User-Adapted Interaction
Negotiated Collusion: Modeling Social Language and its Relationship Effects in Intelligent Agents
User Modeling and User-Adapted Interaction
Formational Parameters and Adaptive Prototype Instantiation for MPEG-4 Compliant Gesture Synthesis
CA '02: Proceedings of the Computer Animation Conference
A fast and portable realizer for text generation systems
ANLC '97: Proceedings of the Fifth Conference on Applied Natural Language Processing
Synthesizing multimodal utterances for conversational agents
Computer Animation and Virtual Worlds
Trainable sentence planning for complex information presentation in spoken dialog systems
ACL '04: Proceedings of the 42nd Annual Meeting of the Association for Computational Linguistics
Investigating Human Tutor Responses to Student Uncertainty for Adaptive System Development
ACII '07: Proceedings of the 2nd International Conference on Affective Computing and Intelligent Interaction
Responding to Student Uncertainty During Computer Tutoring: An Experimental Evaluation
ITS '08: Proceedings of the 9th International Conference on Intelligent Tutoring Systems
Using linguistic cues for the automatic recognition of personality in conversation and text
Journal of Artificial Intelligence Research
Controlling user perceptions of linguistic style: Trainable generation of personality traits
Computational Linguistics
Increasing the expressivity of humanoid robots with variable gestural expressions
Proceedings of the 2014 ACM/IEEE international conference on Human-robot interaction
Robots are increasingly present in our daily life; they must move through human-centered environments, interact with humans, and obey social rules so as to produce behavior appropriate to a human's profile (i.e., personality, mood, and preferences). Recent research has examined the effect of personality traits on verbal and nonverbal production, which plays a major role in conveying and understanding messages during social interaction between a human and a robot. The characteristics of the gestures generated during nonverbal communication (e.g., amplitude, direction, rate, and speed) can differ according to personality traits, which likewise influence the verbal content of human speech in terms of verbosity, repetition, etc. Our research therefore maps a human's verbal behavior to a corresponding combined verbal-nonverbal robot behavior based on the personality dimensions of the interacting human. The system first estimates the interacting human's personality traits through a psycholinguistic analysis of the spoken language; it then uses the PERSONAGE natural language generator to produce verbal output matching the estimated traits. Gestures are generated with the BEAT toolkit, which performs a linguistic and contextual analysis of the generated language using rules derived from extensive research on human conversational behavior. We explored human-robot personality matching and compared an adapted combined robot behavior (gesture and speech) with an adapted speech-only robot behavior in an interaction. Our experiments, conducted with the Nao robot, validated that individuals preferred to interact with a robot that shared their personality, and that an adapted combined robot behavior (gesture and speech) was more engaging and effective than a speech-only behavior.
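The pipeline in the abstract (psycholinguistic trait estimation, then trait-conditioned generation parameters) can be sketched as follows. This is a minimal illustration, not the actual PERSONAGE or BEAT APIs: the function names, the use of verbosity as the sole cue, and the parameter mapping are all simplifying assumptions for exposition.

```python
def estimate_extraversion(utterances):
    """Crude psycholinguistic estimate of extraversion in [0, 1].

    Uses verbosity (mean words per utterance) as a stand-in for the
    richer cue set (repetitions, emotion words, etc.) used in
    personality-recognition work.
    """
    if not utterances:
        return 0.5  # neutral prior when there is no evidence
    mean_len = sum(len(u.split()) for u in utterances) / len(utterances)
    # Map mean length onto [0, 1]; ~20+ words/utterance reads as extravert.
    return max(0.0, min(1.0, mean_len / 20.0))


def generation_params(extraversion):
    """Map the trait score to PERSONAGE/BEAT-style control parameters.

    The keys and scalings here are hypothetical, chosen only to mirror
    the abstract's claim that extraversion raises verbosity, repetition,
    and gesture amplitude/rate.
    """
    return {
        "verbosity": extraversion,           # extraverts talk more
        "repetitions": extraversion * 0.5,   # and restate content more often
        "gesture_amplitude": extraversion,   # larger, wider gestures
        "gesture_rate": 0.5 + extraversion / 2,
    }


if __name__ == "__main__":
    talkative = ["I really think this museum tour is absolutely wonderful "
                 "and I would love to hear much more about every exhibit"]
    terse = ["Fine.", "Maybe."]
    print(generation_params(estimate_extraversion(talkative)))
    print(generation_params(estimate_extraversion(terse)))
```

Matching the robot's parameters to the estimated trait, rather than fixing one style, is what lets the system test the similarity-attraction effect reported in the experiments.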