Describing and generating multimodal contents featuring affective lifelike agents with MPML
New Generation Computing
This paper highlights some of our recent research efforts in designing and evaluating life-like characters capable of engaging in affective and social communication with human users. The key novelty of our approach is the use of human physiological information: first, as a method for evaluating the effect of life-like character behavior on a moment-to-moment basis, and second, as an input modality for a new generation of interface agents that we call 'physiologically perceptive' life-like characters. By exploiting streams of primarily involuntary human responses, such as autonomic nervous system activity or eye movements, these characters are expected to respond to users' affective and social needs in a truly sensitive, and hence more effective, friendly, and beneficial way.
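To make the idea of moment-to-moment physiological input concrete, the sketch below shows one plausible way such a character might derive arousal events from a raw electrodermal activity (EDA) trace: smooth the signal to estimate a tonic baseline, then flag samples where the phasic component exceeds a threshold. The function names, window size, and threshold are illustrative assumptions, not the authors' actual pipeline.

```python
def moving_average(signal, window):
    """Estimate the tonic (slow-moving) EDA baseline with a simple moving average."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        out.append(sum(signal[lo:i + 1]) / (i + 1 - lo))
    return out

def arousal_events(eda, window=5, threshold=0.05):
    """Return sample indices where the phasic component (raw minus baseline)
    exceeds the threshold -- a crude stand-in for skin-conductance responses."""
    tonic = moving_average(eda, window)
    return [i for i, (raw, base) in enumerate(zip(eda, tonic))
            if raw - base > threshold]

# Simulated skin-conductance trace (microsiemens): flat baseline with one brief response.
eda = [2.0] * 10 + [2.0, 2.2, 2.5, 2.3, 2.1] + [2.0] * 10
print(arousal_events(eda))  # -> [11, 12, 13]
```

In a real system, such flagged events would be fed to the character's behavior controller so it can adapt its responses while the interaction is still in progress, rather than relying only on post-hoc questionnaires.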