Real time labeling of affect in music using the AffectButton. Proceedings of the 3rd International Workshop on Affective Interaction in Natural Environments.
Interpreting non-linguistic utterances by robots: studying the influence of physical appearance. Proceedings of the 3rd International Workshop on Affective Interaction in Natural Environments.
EMOGIB: emotional gibberish speech database for affective human-robot interaction. Proceedings of the 4th International Conference on Affective Computing and Intelligent Interaction (ACII'11), Volume Part II.
People interpret robotic non-linguistic utterances categorically. Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction.
Using the AffectButton to measure affect in child and adult-robot interaction. Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction.
Situational context directs how people affectively interpret robotic non-linguistic utterances. Proceedings of the 2014 ACM/IEEE International Conference on Human-Robot Interaction.
Vocal affective displays are vital for achieving engaging and effective Human-Robot Interaction. The same can be said of linguistic interaction; however, relying on language carries inherent risks: users are bound to a single language, and breakdowns are frequent due to current technical limitations. This work explores the potential of non-linguistic utterances instead. A recent study is briefly outlined in which school children rated a variety of non-linguistic utterances on an affective level using a facial-gesture tool. The results suggest, for example, that utterance rhythm may be an influential independent factor, while the pitch contour of an utterance may have little importance. Evidence for categorical perception of emotion is also presented, an issue that may affect important areas of HRI beyond vocal displays of affect.
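To make the manipulated dimensions concrete, the following is a minimal sketch of how a beep-like non-linguistic utterance could be synthesized with independent control over rhythm (beep and gap durations) and pitch contour (a linear frequency slope). All function names, parameters, and values here are illustrative assumptions, not the stimuli or parameters used in the study described above.

```python
import math

def synth_utterance(n_beeps=3, beep_dur=0.15, gap_dur=0.10,
                    f0=440.0, contour_slope=0.0, sr=8000):
    """Return a beep-like utterance as a list of PCM samples in [-1, 1].

    Rhythm is set by n_beeps, beep_dur, and gap_dur; the pitch contour
    is set by contour_slope (Hz of drift per second of elapsed time).
    Illustrative sketch only -- not the stimuli from the study.
    """
    samples = []
    phase = 0.0  # accumulated phase, so frequency changes stay smooth
    t = 0.0      # elapsed time in seconds, drives the pitch contour
    for _ in range(n_beeps):
        for _ in range(int(beep_dur * sr)):
            freq = f0 + contour_slope * t        # linear pitch contour
            phase += 2.0 * math.pi * freq / sr   # advance oscillator
            samples.append(math.sin(phase))
            t += 1.0 / sr
        samples.extend([0.0] * int(gap_dur * sr))  # silent gap sets rhythm
        t += gap_dur
    return samples

# Same rhythm, different pitch contour: flat vs. rising.
flat = synth_utterance(contour_slope=0.0)
rising = synth_utterance(contour_slope=200.0)
```

Because rhythm and contour are separate parameters, stimuli varying only one dimension at a time can be generated, which is the kind of independent manipulation the rhythm-versus-contour finding presupposes.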